Python Keras: dimension and shape problems
I am trying to build an architecture similar to the one in this example:

However, with my own data I keep running into dimension problems, and I have not found a good resource that explains how to manage dimensions when using your own data (rather than MNIST or the default datasets).

Context: I am trying the architecture mentioned above on text images; as a first attempt, say, with 2000 of them. For the labels I decided on one-hot encoding. These are the data characteristics:

Image fixed shape: (2000, 208, 352, 1) # black and white
one_hot label size: (2000, 346, 1) # 2000 samples and 346 classes; the last value makes it a 3-dimensional array because softmax apparently requires it

Now the code:
# Imports assumed from the Keras 2 functional API
from keras.models import Model
from keras.layers import (Input, Dense, Activation, Conv2D, MaxPooling2D,
                          Reshape, GRU, add, concatenate)
from keras.optimizers import SGD

nb_classes = 346
max_lin, max_col = (208, 352)
input_shape = ( max_lin, max_col, 1)
conv_filters = 16
kernel_size = (3, 3)
pool_size = 2
time_dense_size = 32
rnn_size = 512
act = 'relu'
input_data = Input(name='the_input', shape=input_shape)
inner = Conv2D(conv_filters, kernel_size, padding='same',
               activation=act, name='CONV2D_1')(input_data)
inner = MaxPooling2D(pool_size=(pool_size, pool_size),
                     name='MXPOOL2D_1')(inner)
# Note: this second block reads from input_data again (not inner), so only
# one conv+pool is effectively applied on the path to the output, which is
# why the feature map before the reshape is (104, 176, 16).
inner = Conv2D(conv_filters, kernel_size, padding='same',
               activation=act, name='CONV2D_2')(input_data)
inner = MaxPooling2D(pool_size=(pool_size, pool_size),
                     name='MXPOOL2D_2')(inner)
# This is my problem: I don't really know how to reshape it for my data.
# I chose (104, 2816) because other values didn't work; I found that the
# layer before is (104, 176, 16) = (104, 176*16) = (104, 2816). Other
# values give me ValueError: total size of new array must be unchanged
conv_to_rnn_dims = (104, 2816)
inner = Reshape(target_shape=conv_to_rnn_dims, name='reshape')(inner)
inner = Dense(time_dense_size, activation=act, name='dense1')(inner)
gru_1 = GRU(rnn_size, return_sequences=True, kernel_initializer='he_normal', name='gru1')(inner)
gru_1b = GRU(rnn_size, return_sequences=True, go_backwards=True, kernel_initializer='he_normal', name='gru1_b')(inner)
gru1_merged = add([gru_1, gru_1b])
gru_2 = GRU(rnn_size, return_sequences=True, kernel_initializer='he_normal', name='gru2')(gru1_merged)
gru_2b = GRU(rnn_size, return_sequences=True, go_backwards=True, kernel_initializer='he_normal', name='gru2_b')(gru1_merged)
gru_conc = concatenate([gru_2, gru_2b])
print("GruCOnc: ",gru_conc.shape)
inner = Dense(nb_classes, kernel_initializer='he_normal',
              name='DENSE_2')(gru_conc)
print("2ndDense: ",inner.shape)
y_pred = Activation('softmax',name='softmax')(inner)
print(y_pred.shape)
model = Model(inputs=input_data, outputs=y_pred)
print(model.summary())
sgd = SGD(lr=0.02, decay=1e-6, momentum=0.9, nesterov=True, clipnorm=5)
model.compile(loss='categorical_crossentropy',optimizer=sgd)
model.fit(train_data, train_label, batch_size=10, epochs=2, verbose=1)
score = model.evaluate(x_test, y_test, verbose=1)
print(score)
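The reshape bookkeeping in the comment above can be sanity-checked with plain NumPy (a zero array stands in for the Keras feature map here):

```python
import numpy as np

# Feature map shape before the Reshape layer, as noted in the comment
# above: (104, 176, 16) after one conv + one 2x2 pooling on 208 x 352.
fmap = np.zeros((104, 176, 16))

# (104, 176, 16) -> (104, 176 * 16) = (104, 2816): total size unchanged.
seq = fmap.reshape(104, 176 * 16)
print(seq.shape)  # (104, 2816)

# Any target shape with a different total number of elements fails,
# which is the NumPy analogue of the "total size of new array must be
# unchanged" error mentioned in the comment.
try:
    fmap.reshape(104, 2000)
except ValueError as e:
    print(e)
```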
After running the code I get:
ValueError: Error when checking target: expected softmax to have shape (None, 104, 346) but got array with shape (2000, 346, 1)
So the big question here is: what is that 104? 346 is clearly the number of classes, but the other value leaves me completely lost.

Thanks, everyone, for reading my question.
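Where the 104 in the error message comes from can be seen with a minimal NumPy sketch (one sample, dummy zero weights; a matrix product stands in for Keras's Dense, which only maps the last axis of a 3D tensor):

```python
import numpy as np

# Sizes taken from the question's model.
timesteps, rnn_size, n_classes = 104, 512, 346

# After Reshape((104, 2816)), every later layer is applied per timestep,
# so the 104 axis survives all the way to softmax. Hypothetical GRU
# output for one sample: one vector per timestep, doubled by the
# concatenate([gru_2, gru_2b]).
gru_out = np.zeros((timesteps, rnn_size * 2))        # (104, 1024)

# Dense(346) maps only the feature axis: (104, 1024) @ (1024, 346).
w = np.zeros((rnn_size * 2, n_classes))              # dummy weights
logits = gru_out @ w
print(logits.shape)  # (104, 346) -> with the batch axis: (None, 104, 346)
```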
conv_to_rnn_dims = (104, 2816)
This value is made up. From what I can see, you are trying to feed the CNN output into a Dense layer, but the last layer of your CNN is a MaxPooling, which produces a 2D output (per sample). You should use Flatten for this connection. Let's look at this example:
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense

model = Sequential()
model.add(Conv2D(16, (3, 3), padding="same", input_shape=(208, 352, 1)))
# Produces 2000 x 208 x 352 x 16
model.add(Conv2D(32, (3, 3), activation="tanh", padding="valid"))
# Produces 2000 x 206 x 350 x 32 ("valid" trims the borders)
model.add(Flatten())
# Produces 2000 x 2307200
model.add(Dense(100, activation="sigmoid"))
# Produces 2000 x 100
After the Dense layer you should use a Reshape to get the output ready for the GRU. You now have 100 timesteps to read, so you should reshape with model.add(Reshape((100, 1))), which makes the network's output 2000 x 100 x 1. That you can safely feed into a GRU layer. And since this is a classification problem ending in a Dense layer, the target shape should be 2000 x 346, so the final Dense layer should have 346 nodes.
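Putting the answer together, the whole pipeline can be traced with a small hand-rolled shape calculator (the helper functions below are illustrative stand-ins, not Keras API, and assume 3x3 kernels and stride 1 as in the example):

```python
# Trace output shapes through the answer's pipeline for 208 x 352 x 1 input.
def conv2d(shape, filters, kernel=3, padding="same"):
    """Output shape of a stride-1 Conv2D on an (h, w, c) input."""
    h, w, _ = shape
    if padding == "valid":
        h, w = h - kernel + 1, w - kernel + 1
    return (h, w, filters)

def flatten(shape):
    """Number of features after Flatten."""
    h, w, c = shape
    return h * w * c

shape = (208, 352, 1)
shape = conv2d(shape, 16, padding="same")    # (208, 352, 16)
shape = conv2d(shape, 32, padding="valid")   # (206, 350, 32)
flat = flatten(shape)                        # Flatten: 2307200 features
dense1 = 100                                 # Dense(100)
reshaped = (100, 1)                          # Reshape((100, 1)) -> GRU-ready
final = 346                                  # final Dense matches the targets
print(shape, flat, reshaped, final)
```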
OK, I see it now: I need a Flatten before my Dense layer, and then I can do the Reshape for the GRU layer. Thank you for your help!