Python simple LSTM error: Error when checking input: expected lstm_20_input to have shape (None, 10, 3) but got array with shape (1, 64, 3)
Tags: python, tensorflow, keras, lstm, forecasting

I have a batch generator function that is not feeding the correct batch shape to my LSTM. When I test the function, it appears to return the correct shape [n_samples, n_timesteps, n_features], but fitting the model raises the error above. I checked the function by looping over the generator and inspecting the batch shapes, and they return the correct number of samples, timesteps, and so on.
import numpy as np

from keras.models import Sequential
from keras.layers import Dense, LSTM, TimeDistributed, RepeatVector

def batch_generator(x_train_scaled, y_train_scaled, batch_size, sequence_length):
    """
    Generator function to develop sequential batches of data.
    Args:
        batch_size: number of sequences per batch.
        sequence_length: number of timesteps per sequence.
    """
    # Infinite loop.
    while True:
        # Allocate a new array for the batch of input-signals.
        x_shape = (batch_size, sequence_length, num_x_signals)
        x_batch = np.zeros(shape=x_shape, dtype=np.float16)
        # Allocate a new array for the batch of output-signals.
        y_shape = (batch_size, sequence_length, 1)
        y_batch = np.zeros(shape=y_shape, dtype=np.float16)
        for i in range(batch_size):
            # Copy the sequences of data starting at this index.
            x_batch[i] = x_train_scaled[i:i+sequence_length]
            y_batch[i] = y_train_scaled[i:i+sequence_length]
        yield (x_batch, y_batch)
# test function
batch_size = 10
sequence_length = 10
batch_gen = batch_generator(x_train_scaled, y_train_scaled,
                            batch_size=batch_size,
                            sequence_length=sequence_length)
x_batch, y_batch = next(batch_gen)
# test that this returns the correct shapes (10, 10, 3) and (10, 10, 1)
print(x_batch.shape)
print(y_batch.shape)
def build_model(generator, n_outputs):
    # define encoder/decoder architecture, use a TimeDistributed layer
    model = Sequential()
    model.add(LSTM(10, activation='relu',
                   input_shape=(x_batch.shape[1], x_batch.shape[2])))
    model.add(RepeatVector(n_outputs))
    model.add(LSTM(10, activation='relu', return_sequences=True))
    model.add(TimeDistributed(Dense(5, activation='relu')))
    model.add(TimeDistributed(Dense(1)))
    model.compile(loss='mse', optimizer='adam')
    # fit network
    model.fit_generator(generator=generator,
                        epochs=20,
                        steps_per_epoch=10,
                        validation_data=validation_data,
                        verbose=1)
    return model
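Since x_train_scaled and y_train_scaled are not shown, here is a minimal standalone sketch of the same generator with dummy arrays (the data sizes here are assumptions) that can be run to confirm the batch shapes the model should receive:

```python
import numpy as np

# Dummy stand-ins for the scaled training data (sizes are assumptions).
n_samples, num_x_signals = 100, 3
x_train_scaled = np.random.rand(n_samples, num_x_signals)
y_train_scaled = np.random.rand(n_samples, 1)

def batch_generator(x_train_scaled, y_train_scaled, batch_size, sequence_length):
    # Yield batches of shape (batch_size, sequence_length, n_features) forever.
    while True:
        x_batch = np.zeros((batch_size, sequence_length, num_x_signals),
                           dtype=np.float16)
        y_batch = np.zeros((batch_size, sequence_length, 1), dtype=np.float16)
        for i in range(batch_size):
            # Copy the sequence of data starting at this index.
            x_batch[i] = x_train_scaled[i:i + sequence_length]
            y_batch[i] = y_train_scaled[i:i + sequence_length]
        yield x_batch, y_batch

x_batch, y_batch = next(batch_generator(x_train_scaled, y_train_scaled, 10, 10))
print(x_batch.shape)  # (10, 10, 3)
print(y_batch.shape)  # (10, 10, 1)
```

The same shape check can be run on whatever is passed as validation_data; the (1, 64, 3) in the error message suggests that object yields a different batch shape than the training generator does.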