How do I add an attention layer to a CNN-BLSTM model using Keras?


I am trying to add an attention layer to my model, but every example I find uses an embedding layer and an encoder-decoder mechanism. Is there a way to add attention without them?

# Imports needed for this snippet (tf.keras)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (TimeDistributed, Conv2D, MaxPooling2D,
                                     Flatten, Bidirectional, LSTM, Dense,
                                     Dropout)

def cnn_blsm():
    # CNN feature extractor applied to each timestep
    model = Sequential()
    model.add(TimeDistributed(Conv2D(20, (3, 3), activation='tanh', padding='same'),
                              input_shape=(1, 11, 11, 1)))
    model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
    model.add(TimeDistributed(Conv2D(40, (3, 3), activation='tanh', padding='same')))
    model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
    model.add(TimeDistributed(Conv2D(60, (3, 3), activation='tanh', padding='same')))
    model.add(TimeDistributed(MaxPooling2D(pool_size=(2, 2))))
    model.add(TimeDistributed(Flatten()))
    model.add(Bidirectional(LSTM(40)))
#     model.add(Attention(use_scale=False))   # attempted attention layer
#     model.add(Dense(1, activation='relu'))
    model.add(Dense(320, activation='relu'))
    model.add(Dropout(0.1))
    model.add(Dense(1, activation='sigmoid'))

    return model
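One possible answer sketch, not a definitive fix: you do not need an embedding layer or an encoder-decoder to use attention. A common pattern is to set `return_sequences=True` on the BLSTM so it emits one vector per timestep, apply self-attention over that sequence with `tf.keras.layers.Attention` (query = value), and then pool back to a single vector for the dense head. Because `Attention` takes a list of two inputs, the model below is rewritten with the functional API; the function name `cnn_blstm_with_attention` and the `GlobalAveragePooling1D` pooling choice are my own additions, not from the question.

```python
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.layers import (Input, TimeDistributed, Conv2D, MaxPooling2D,
                                     Flatten, Bidirectional, LSTM, Dense, Dropout,
                                     Attention, GlobalAveragePooling1D)

def cnn_blstm_with_attention():
    # Same CNN front end as the question, but using the functional API,
    # since keras.layers.Attention takes two inputs and does not fit Sequential.
    inp = Input(shape=(1, 11, 11, 1))
    x = TimeDistributed(Conv2D(20, (3, 3), activation='tanh', padding='same'))(inp)
    x = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(x)
    x = TimeDistributed(Conv2D(40, (3, 3), activation='tanh', padding='same'))(x)
    x = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(x)
    x = TimeDistributed(Conv2D(60, (3, 3), activation='tanh', padding='same'))(x)
    x = TimeDistributed(MaxPooling2D(pool_size=(2, 2)))(x)
    x = TimeDistributed(Flatten())(x)

    # return_sequences=True keeps one vector per timestep for attention to weight
    seq = Bidirectional(LSTM(40, return_sequences=True))(x)

    # Self-attention: the sequence attends to itself (query = value)
    att = Attention(use_scale=False)([seq, seq])

    # Collapse the attended sequence back to a single vector
    pooled = GlobalAveragePooling1D()(att)

    out = Dense(320, activation='relu')(pooled)
    out = Dropout(0.1)(out)
    out = Dense(1, activation='sigmoid')(out)
    return Model(inp, out)
```

Note that with `input_shape=(1, 11, 11, 1)` there is only a single timestep, so the attention weights are trivially 1 and the layer has no effect; attention only becomes meaningful once the first dimension holds more than one frame.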