
Python: How do I add an attention layer to a binary-classification RNN?


I want to add attention after my LSTM layer. Here is the code:

visible = Input(shape=(251,))
embed = Embedding(vocab_size, 50)(visible)
bilstm = Bidirectional(LSTM(units=25, return_sequences=True))(embed)
att = ??  # what goes here?
predictions = Dense(1, activation='sigmoid')(att)
Is there an attention layer in Keras, like the LSTM or GRU layers?

You can use this:

from tensorflow.keras import backend as K
from tensorflow.keras import layers as L

bilstm = Bidirectional(LSTM(units=25, return_sequences=True))(embed)
attention = TimeDistributed(Dense(1, activation='tanh'))(bilstm)  # score each timestep
attention = L.Softmax(axis=1)(attention)  # normalise the scores over the time axis
context = L.Multiply()([attention, bilstm])  # weight the sequence by the scores
cout = L.Lambda(lambda x: K.sum(x, axis=1))(context)  # sum over time -> (batch, features)
It also lets you retrieve the attention weight of each timestep. By default it returns weights * sequence, summed over time, as the output of the bidirectional LSTM; if you want to keep the full sequence instead, leave out the Lambda layer.
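
Putting the two snippets together, a minimal end-to-end sketch might look as follows. The vocab_size value here is a placeholder (use your tokenizer's vocabulary size), and the optimizer/loss choices are my assumptions, not part of the original answer:

from tensorflow.keras import backend as K
from tensorflow.keras import layers as L
from tensorflow.keras.layers import (Bidirectional, Dense, Embedding, Input,
                                     LSTM, TimeDistributed)
from tensorflow.keras.models import Model

vocab_size = 10000  # assumption: replace with your vocabulary size

visible = Input(shape=(251,))
embed = Embedding(vocab_size, 50)(visible)
bilstm = Bidirectional(LSTM(units=25, return_sequences=True))(embed)

# attention block from the answer above
attention = TimeDistributed(Dense(1, activation='tanh'))(bilstm)
attention = L.Softmax(axis=1)(attention)
context = L.Multiply()([attention, bilstm])
att = L.Lambda(lambda x: K.sum(x, axis=1))(context)

predictions = Dense(1, activation='sigmoid')(att)

model = Model(inputs=visible, outputs=predictions)
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

Note that newer tf.keras releases also ship built-in tf.keras.layers.Attention (Luong-style) and tf.keras.layers.AdditiveAttention (Bahdanau-style) layers, which cover the common cases without hand-rolling the weighting.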


I found the code in a Keras GitHub issue. The solution I wrote up comes from a Keras developer (I think?), so unfortunately I can't give proper credit.

You can use a solution like this (it's a Sequential model):
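
The code for that answer is missing from the page. As a stand-in, here is a hedged sketch of what a Sequential version might look like: the same weighted-sum attention wrapped in a custom Layer subclass (AttentionPooling is a hypothetical name, not from the original answer) so it can be stacked like any other layer:

import tensorflow as tf
from tensorflow.keras.layers import Bidirectional, Dense, Embedding, LSTM
from tensorflow.keras.models import Sequential

class AttentionPooling(tf.keras.layers.Layer):
    # hypothetical helper: scores each timestep, softmaxes over time,
    # and returns the attention-weighted sum of the sequence
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.score = Dense(1, activation='tanh')

    def call(self, inputs):
        weights = tf.nn.softmax(self.score(inputs), axis=1)  # (batch, time, 1)
        return tf.reduce_sum(weights * inputs, axis=1)       # (batch, features)

vocab_size = 10000  # assumption: replace with your vocabulary size

model = Sequential([
    tf.keras.Input(shape=(251,)),
    Embedding(vocab_size, 50),
    Bidirectional(LSTM(units=25, return_sequences=True)),
    AttentionPooling(),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])

Wrapping the attention in a single-input, single-output layer is what makes it usable inside Sequential, which otherwise cannot express the two-branch Multiply used in the functional-API version above.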