
Keras: how to fix "AttributeError: 'NoneType' object has no attribute '_inbound_nodes'" raised when building an LSTM model with Manhattan distance?


I am trying to create a neural network model that returns a similarity score for two sentences using a Manhattan LSTM (example). I used the Quora question-pairs dataset and generated embeddings for the questions with Google BERT. Now I want to create an LSTM model like the example above and use it, but I get the following error:

Using TensorFlow backend.
(100000, 1, 768)
(100000, 1, 768)
(100000,)
(100000, 100)
Traceback (most recent call last):
  File "train_model_manhattan.py", line 151, in <module>
    model = Model(inputs=[inp1,inp2], outputs=[malstm_distance])
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 93, in __init__
    self._init_graph_network(*args, **kwargs)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 231, in _init_graph_network
    self.inputs, self.outputs)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 1366, in _map_graph_network
    tensor_index=tensor_index)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 1353, in build_map
    node_index, tensor_index)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 1353, in build_map
    node_index, tensor_index)
  File "/home/manishp/anaconda3/envs/bert_env/lib/python3.6/site-packages/keras/engine/network.py", line 1325, in build_map
    node = layer._inbound_nodes[node_index]
AttributeError: 'NoneType' object has no attribute '_inbound_nodes'
Here is my full code:


import os
data_file='quora_duplicate_questions.tsv'
# 0 means don't load, 1 means load the encodings from file
LOAD_ENCODING_FROM_FILE=1 
encoding_data_file_quest1='encoding_quest1'
encoding_data_file_quest2='encoding_quest2'
encoding_data_file_label='quest_label'

#################################################
import numpy as np
import pandas as pd
import tensorflow as tf
import re
from bert_serving.client import BertClient
from keras.preprocessing.text import Tokenizer
from keras.preprocessing.sequence import pad_sequences
import numpy as np
import pickle
from keras import models
from keras import layers
from keras import optimizers
from keras.layers import Dropout
from keras import backend as K
from keras.layers import Lambda
#################################################
maxlen = 125  # We will cut questions after 125 words

# The next step is to transform all sentences to fixed-length encodings using BERT embeddings
# [0.1 0.4 0.4] [0.9 0.6 0.1] 2.4
# [0.4 0.1 0.3] [0.5 0.6 0.1] 1.0

# Load the saved encodings from file
if LOAD_ENCODING_FROM_FILE == 1:
    with open(encoding_data_file_quest1, "rb") as fp:
        vec1=pickle.load(fp)
    with open(encoding_data_file_quest2, "rb") as fp:   
        vec2=pickle.load(fp)
    with open(encoding_data_file_label, "rb") as fp: 
        label=pickle.load(fp)


train_vec1 = np.asarray(vec1, np.float32)
train_vec2 = np.asarray(vec2, np.float32)

train_vec1 = train_vec1.reshape((100000,1,768))
train_vec2 = train_vec2.reshape((100000,1,768))


train_vec1_tensor = K.cast(train_vec1,dtype='float32')
train_vec2_tensor = K.cast(train_vec2,dtype='float32')

train_label = np.asarray(label,np.float32)
print(np.shape(train_vec1))
print(np.shape(train_vec2))
print(np.shape(train_label))
#################################################
def exponent_neg_manhattan_distance(left, right):
    return np.exp(-np.sum(np.abs(left-right), axis=1, keepdims=True))

def manhattan_distance(left, right):
    ''' Helper function for the similarity estimate of the LSTMs outputs'''
    print(np.shape(left))
    return K.sum(K.abs(left - right), axis=1, keepdims=True)    
#################################################

import keras
from keras.layers import Input, LSTM, Dense
from keras.models import Model


inp1= Input(shape=(768,))
inp2= Input(shape=(768,))


x = keras.layers.concatenate([inp1, inp2],axis=-1)
x = Dense(1024, activation='relu')(x)
x = Dropout(0.5) (x)
x = Dense(256, activation='relu')(x)
x = Dropout(0.5) (x)
x = Dense(64, activation='relu')(x)
out=Dense(1)(x)


# Since this is a siamese network, both sides share the same LSTM
shared_lstm = LSTM(100)

left_output = shared_lstm(train_vec1_tensor)
right_output = shared_lstm(train_vec2_tensor)

# Calculates the distance as defined by the MaLSTM model
malstm_distance = Lambda(function=lambda x: manhattan_distance(x[0], x[1]),output_shape=lambda x: (x[0][0], 1))([left_output, right_output])

#######################
# Getting error when code flow reaches the following line
#######################
model = Model(inputs=[inp1,inp2], outputs=[malstm_distance])

model.summary()
optimizer = optimizers.Adadelta(clipnorm=gradient_clipping_norm)
model.compile(optimizer,
              loss='mean_squared_error',
              metrics=['accuracy'])

history=model.fit([train_vec1, train_vec2], train_label, 
    epochs=30,batch_size=200,
    validation_split=0.2)


I expect the model to take the two embeddings, compute the Manhattan distance between them, and return that distance.
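The similarity the question is after is just exp(-L1 distance) between the two encodings, as in the `exponent_neg_manhattan_distance` helper above. A quick plain-numpy sketch on made-up toy vectors (values are illustrative only):

```python
import numpy as np

# Two toy 3-dimensional "sentence encodings" (made-up values, illustration only).
left = np.array([[0.1, 0.4, 0.4]])
right = np.array([[0.9, 0.6, 0.1]])

# Manhattan (L1) distance, matching the manhattan_distance() helper above.
dist = np.sum(np.abs(left - right), axis=1, keepdims=True)

# exp(-distance) squashes [0, inf) into (0, 1]: identical vectors score 1.
sim = np.exp(-dist)

print(dist)  # ~1.3
print(sim)   # ~0.27
```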

`left_output` and `right_output` are obtained from the `LSTM` layer. The inputs are fed into the `Input` layers and then passed through a series of `Dense` layers. Notice, however, that there is no connection at all between the group of `Dense` layers and the `LSTM`. The `Model` expects its output to come from the `LSTM` layer, which is impossible here. The `keras.layers.concatenate` line should use the outputs of `shared_lstm` rather than the outputs of the input layers directly, like this:

keras.layers.concatenate([left_output, right_output], axis=-1)

Only then is this a siamese network.
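Putting the fix together, the sketch below shows one corrected wiring: both branches start from `Input` layers rather than from `K.cast(...)` tensors (calling a layer on a plain backend tensor is what leaves the graph untraceable and triggers the `'_inbound_nodes'` error), both go through the shared LSTM, and the `Lambda` distance becomes the model output. This is a minimal sketch using `tensorflow.keras`; the standalone-Keras layer names in the thread are the same.

```python
# Minimal sketch of a corrected siamese MaLSTM wiring (tf.keras here;
# the standalone-Keras imports used in the question work the same way).
from tensorflow.keras import backend as K
from tensorflow.keras.layers import Input, LSTM, Lambda
from tensorflow.keras.models import Model

def manhattan_distance(left, right):
    # Helper from the question: L1 distance between the two LSTM outputs.
    return K.sum(K.abs(left - right), axis=1, keepdims=True)

# Both branches must start from Input layers so Keras can trace the graph
# back from the output; feeding K.cast(numpy_array) tensors into a layer
# is what raises "'NoneType' object has no attribute '_inbound_nodes'".
left_in = Input(shape=(1, 768))   # one BERT sentence embedding per sample
right_in = Input(shape=(1, 768))

shared_lstm = LSTM(100)           # a single LSTM shared by both sides
left_output = shared_lstm(left_in)
right_output = shared_lstm(right_in)

malstm_distance = Lambda(lambda t: manhattan_distance(t[0], t[1]))(
    [left_output, right_output])

model = Model(inputs=[left_in, right_in], outputs=[malstm_distance])
model.compile(optimizer='adadelta', loss='mean_squared_error')
# The raw numpy arrays are then passed at fit time, e.g.:
# model.fit([train_vec1, train_vec2], train_label, epochs=30, batch_size=200)
```

Note that the numpy arrays (`train_vec1`, `train_vec2`) go to `model.fit`, not into the graph itself.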

Thank you @稻草人. You are right, there was no connection between the LSTM and the dense layers. This is what I did: `train_vec1_tensor = Input(shape=(768,1)) x = keras.layers.concatenate([left_output, right_output], axis=-1) model = Model(inputs=[train_vec1_tensor, train_vec2_tensor], outputs=[malstm_distance]) history = model.fit([train_vec1, train_vec2], train_label, epochs=30, batch_size=200, validation_split=0.2)`