Universal Sentence Encoder error for sentence similarity in Python


I am trying to apply the Universal Sentence Encoder to a dataset consisting of two text columns. The idea is to encode both columns and then feed the embeddings into a neural network. I believe the `signature` argument broke when migrating from TF 1.x to TF 2.x. Can you help me resolve this error?

TypeError: function got an unexpected keyword argument 'signature'

The code:

import tensorflow as tf
import tensorflow_hub as hub
from tensorflow.keras import layers
from tensorflow.keras.models import Model

# Load the pretrained Universal Sentence Encoder from TensorFlow Hub
module_url = "https://tfhub.dev/google/universal-sentence-encoder-large/5"
embed = hub.load(module_url)

# Embedding method that will be applied to every string input layer
def UniversalEmbedding(x):
    return embed(tf.squeeze(tf.cast(x, tf.string)), signature="default", as_dict=True)["default"]

DROPOUT = 0.1

# Take question1 as input and create an embedding for it before feeding it to the network
q1 = layers.Input(shape=(1,), dtype=tf.string)
embedding_q1 = layers.Lambda(UniversalEmbedding, output_shape=(512,))(q1)
# Do the same for question2, using the same Lambda function
q2 = layers.Input(shape=(1,), dtype=tf.string)
embedding_q2 = layers.Lambda(UniversalEmbedding, output_shape=(512,))(q2)

# Concatenate both embedding layers
merged = layers.concatenate([embedding_q1, embedding_q2])
merged = layers.Dense(200, activation='relu')(merged)
merged = layers.Dropout(DROPOUT)(merged)

# Batch-normalize, then apply dense and dropout layers to build the fully connected model and avoid overfitting
merged = layers.BatchNormalization()(merged)
merged = layers.Dense(200, activation='relu')(merged)
merged = layers.Dropout(DROPOUT)(merged)

merged = layers.BatchNormalization()(merged)
merged = layers.Dense(200, activation='relu')(merged)
merged = layers.Dropout(DROPOUT)(merged)

merged = layers.BatchNormalization()(merged)
merged = layers.Dense(200, activation='relu')(merged)
merged = layers.Dropout(DROPOUT)(merged)

# Sigmoid activation with binary crossentropy for binary classification (0 or 1)
merged = layers.BatchNormalization()(merged)
pred = layers.Dense(2, activation='sigmoid')(merged)
model = Model(inputs=[q1,q2], outputs=pred)
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()
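For reference, the `signature=` and `as_dict=` keywords belong to the TF 1.x `hub.Module` API; a module loaded with `hub.load` in TF 2.x is a plain callable that returns the embedding tensor directly. Below is a minimal sketch of how the corrected embedding function would look. Note that `embed` here is a hypothetical stub standing in for the downloaded model (so the snippet runs without fetching anything); it only mimics the `(batch, 512)` output shape of the v5 encoder.

```python
import tensorflow as tf

# Hypothetical stand-in for `embed = hub.load(module_url)`: v5 SavedModels
# on TF Hub are plain callables returning the embedding tensor directly.
def embed(texts):
    # Mimic the real module's output shape: one 512-dim vector per input string
    return tf.zeros([tf.shape(texts)[0], 512])

# TF2 style: no `signature=` and no `as_dict=` -- just call the module
def UniversalEmbedding(x):
    return embed(tf.squeeze(tf.cast(x, tf.string)))

out = UniversalEmbedding(tf.constant([["How are you?"], ["What is TF?"]]))
print(out.shape)  # (2, 512)
```

With the real module loaded via `hub.load`, the same `UniversalEmbedding` body should work unchanged inside the `layers.Lambda` wrappers above.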