Arrays: How to fix a mismatch of array shapes between layers

I'm building a reinforcement-learning DNN (a DQN), but I get the following error when feeding data to the model:
ValueError: Error when checking target: expected dense_2 to have 2 dimensions, but got array with shape (64, 4, 1).

My input has shape (1, 139) and I use a minibatch of 64, which makes each batch (64, 1, 139).
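
To make the shapes concrete, this is roughly what the arrays look like (the names and the use of zeros are purely illustrative):

import numpy as np

# Each state is a (1, 139) sequence; a minibatch of 64 states is stacked
# into the (64, 1, 139) batch that gets fed to the model.
states = np.zeros((64, 1, 139))     # model input batch
targets = np.zeros((64, 4, 1))      # the target array that triggers the ValueError
print(states.shape, targets.shape)  # (64, 1, 139) (64, 4, 1)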

Here is the model summary:

 Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_1 (LSTM)                (None, 1, 128)            137216    
_________________________________________________________________
dropout_1 (Dropout)          (None, 1, 128)            0         
_________________________________________________________________
batch_normalization_1 (Batch (None, 1, 128)            512       
_________________________________________________________________
lstm_2 (LSTM)                (None, 1, 128)            131584    
_________________________________________________________________
dropout_2 (Dropout)          (None, 1, 128)            0         
_________________________________________________________________
batch_normalization_2 (Batch (None, 1, 128)            512       
_________________________________________________________________
dense_1 (Dense)              (None, 1, 32)             4128      
_________________________________________________________________
dropout_3 (Dropout)          (None, 1, 32)             0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 32)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 4)                 132       
=================================================================
Total params: 274,084
Trainable params: 273,572
Non-trainable params: 512
_________________________________________________________________
None
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_3 (LSTM)                (None, 1, 128)            137216    
_________________________________________________________________
dropout_4 (Dropout)          (None, 1, 128)            0         
_________________________________________________________________
batch_normalization_3 (Batch (None, 1, 128)            512       
_________________________________________________________________
lstm_4 (LSTM)                (None, 1, 128)            131584    
_________________________________________________________________
dropout_5 (Dropout)          (None, 1, 128)            0         
_________________________________________________________________
batch_normalization_4 (Batch (None, 1, 128)            512       
_________________________________________________________________
dense_3 (Dense)              (None, 1, 32)             4128      
_________________________________________________________________
dropout_6 (Dropout)          (None, 1, 32)             0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 32)                0         
_________________________________________________________________
dense_4 (Dense)              (None, 4)                 132       
=================================================================
Total params: 274,084
Trainable params: 273,572
Non-trainable params: 512
_________________________________________________________________
None

Shouldn't the Flatten layer make it a 2-D array? Any ideas? :-/
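
For reference, here is a minimal sketch of how the model was presumably built, reconstructed from the summary above (the dropout rates, activations, and compile settings are assumptions; the layer sizes match the parameter counts in the summary):

from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout, Flatten, BatchNormalization

model = Sequential()
model.add(LSTM(128, input_shape=(1, 139), return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())
model.add(LSTM(128, return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())
model.add(Dense(32, activation='relu'))
model.add(Dropout(0.2))
model.add(Flatten())
model.add(Dense(4, activation='linear'))  # 4 actions -> output shape (None, 4)
model.compile(loss='mse', optimizer='adam')
model.summary()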

This line doesn't make sense:

model.add(Flatten())

coming after a Dense layer. I believe you should put it right after your second LSTM instead, shouldn't you?
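
In other words, something along these lines (a sketch based on the reconstructed model above, so the layer arguments are still assumptions):

from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout, Flatten, BatchNormalization

model = Sequential()
model.add(LSTM(128, input_shape=(1, 139), return_sequences=True))
model.add(Dropout(0.2))
model.add(BatchNormalization())
model.add(LSTM(128, return_sequences=True))
model.add(Flatten())                  # Flatten right after the second LSTM
model.add(Dropout(0.2))
model.add(BatchNormalization())
model.add(Dense(32, activation='relu'))
model.add(Dropout(0.2))
model.add(Dense(4, activation='linear'))
model.compile(loss='mse', optimizer='adam')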

 Model: "sequential_1"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_1 (LSTM)                (None, 1, 128)            137216    
_________________________________________________________________
dropout_1 (Dropout)          (None, 1, 128)            0         
_________________________________________________________________
batch_normalization_1 (Batch (None, 1, 128)            512       
_________________________________________________________________
lstm_2 (LSTM)                (None, 1, 128)            131584    
_________________________________________________________________
dropout_2 (Dropout)          (None, 1, 128)            0         
_________________________________________________________________
batch_normalization_2 (Batch (None, 1, 128)            512       
_________________________________________________________________
dense_1 (Dense)              (None, 1, 32)             4128      
_________________________________________________________________
dropout_3 (Dropout)          (None, 1, 32)             0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 32)                0         
_________________________________________________________________
dense_2 (Dense)              (None, 4)                 132       
=================================================================
Total params: 274,084
Trainable params: 273,572
Non-trainable params: 512
_________________________________________________________________
None
Model: "sequential_2"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_3 (LSTM)                (None, 1, 128)            137216    
_________________________________________________________________
dropout_4 (Dropout)          (None, 1, 128)            0         
_________________________________________________________________
batch_normalization_3 (Batch (None, 1, 128)            512       
_________________________________________________________________
lstm_4 (LSTM)                (None, 1, 128)            131584    
_________________________________________________________________
dropout_5 (Dropout)          (None, 1, 128)            0         
_________________________________________________________________
batch_normalization_4 (Batch (None, 1, 128)            512       
_________________________________________________________________
dense_3 (Dense)              (None, 1, 32)             4128      
_________________________________________________________________
dropout_6 (Dropout)          (None, 1, 32)             0         
_________________________________________________________________
flatten_2 (Flatten)          (None, 32)                0         
_________________________________________________________________
dense_4 (Dense)              (None, 4)                 132       
=================================================================
Total params: 274,084
Trainable params: 273,572
Non-trainable params: 512
_________________________________________________________________
None