Different results from TensorFlow and Keras
I get different results from TensorFlow and Keras with the same network structure. The loss function looks like:
class MaskedMultiCrossEntropy(object):
    def loss(self, y_true, y_pred):
        vec = tf.nn.softmax_cross_entropy_with_logits(logits=y_pred, labels=y_true, dim=1)
        mask = tf.equal(y_true[:, 0, :], -1)
        zer = tf.zeros_like(vec)
        loss = tf.where(mask, x=zer, y=vec)
        return loss
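To make the masking behavior concrete, here is a minimal NumPy sketch of what this loss computes. The shapes (batch, num_classes, num_annotators) and the convention that an annotator who skipped a sample is marked with -1 are assumptions inferred from the code above:

```python
import numpy as np

def masked_multi_cross_entropy_np(y_true, y_pred):
    """NumPy sketch of the masked loss above.
    Assumed shapes: (batch, num_classes, num_annotators); class axis = 1."""
    # numerically stable log-softmax over the class axis
    z = y_pred - y_pred.max(axis=1, keepdims=True)
    log_softmax = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # cross-entropy per (sample, annotator) pair
    vec = -(y_true * log_softmax).sum(axis=1)
    # annotators who did not label a sample are marked with -1
    mask = y_true[:, 0, :] == -1
    return np.where(mask, 0.0, vec)

# two samples, three classes, two annotators; uniform logits
y_pred = np.zeros((2, 3, 2))
y_true = np.zeros((2, 3, 2))
y_true[0, 1, 0] = 1      # annotator 0 labels sample 0 as class 1
y_true[:, :, 1] = -1     # annotator 1 labeled nothing
loss = masked_multi_cross_entropy_np(y_true, y_pred)
print(loss)  # the masked column (annotator 1) is exactly zero
```

With uniform logits the unmasked entry equals log(3), and the masked annotator's entries are zeroed out rather than contributing a garbage value.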
The network uses a layer called CrowdsClassification, implemented in Keras. I then build the network via:
x = Dense(128, input_shape=(input_dim,), activation='relu')(inputs)
x = Dropout(0.5)(x)
x = Dense(N_CLASSES)(x)
x = Activation("softmax")(x)
crowd = CrowdsClassification(num_classes, num_oracles, conn_type="MW")
x = crowd(x)
Training the model with Keras:
model = Model(inputs=inputs, outputs=x)
model.compile(optimizer='adam', loss=loss)
model.fit(inputs, true_class, epochs=100, shuffle=False, verbose=2, validation_split=0.1)
Training the model with TensorFlow:
optimizer = tf.train.AdamOptimizer(learning_rate=0.01, beta1=0.9, beta2=0.999)
opt_op = optimizer.minimize(loss, global_step=global_step)
sess.run(tf.global_variables_initializer())
for epoch in range(100):
    sess.run([loss, opt_op], feed_dict=train_feed_dict)
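Note that the two loops are not equivalent in the number of optimizer steps: Keras's fit takes one Adam step per mini-batch (default batch_size=32, with 10% of the data held out by validation_split), while the TensorFlow loop above runs sess.run(opt_op) exactly once per epoch on the whole feed. (The learning rates also differ: the string 'adam' in Keras defaults to 0.001, while the TF code uses 0.01.) A quick arithmetic sketch with hypothetical sizes:

```python
import math

# hypothetical sizes, for illustration only
n_samples, batch_size, epochs = 1000, 32, 100

# Keras model.fit: one optimizer step per mini-batch,
# training on 90% of the data (validation_split=0.1)
keras_steps = epochs * math.ceil((n_samples * 0.9) / batch_size)

# the TF loop above: one sess.run(opt_op) per epoch
tf_steps = epochs * 1

print(keras_steps, tf_steps)  # 2900 vs 100
```

A 29x difference in update count alone can easily produce very different final weights, independent of the loss function.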
TensorFlow produces wrong predictions. The problem seems to come from the loss function, i.e., TensorFlow cannot backpropagate the masked loss. Can anyone give some advice? Thanks a lot.

Comment: Have you tried using the default parameters for Adam?
Reply: Yes, I tried that, but it doesn't work.
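On the "cannot backpropagate the masked loss" suspicion: tf.where routes the gradient of the selected branch to each position, so masked entries receive the gradient of the zero branch, i.e. exactly 0, which is usually the intended behavior. A NumPy sketch of the expected gradient of softmax cross-entropy with respect to the logits, (softmax - y_true), under the same mask (shapes as assumed above):

```python
import numpy as np

def masked_grad(y_true, y_pred):
    """Expected gradient of the masked loss w.r.t. the logits:
    (softmax - y_true) over the class axis, zeroed where
    y_true[:, 0, :] == -1, mirroring what tf.where backpropagates."""
    z = y_pred - y_pred.max(axis=1, keepdims=True)
    softmax = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    grad = softmax - y_true
    # broadcast the (batch, annotators) mask over the class axis
    mask = (y_true[:, 0, :] == -1)[:, None, :]
    return np.where(mask, 0.0, grad)

y_pred = np.random.randn(2, 3, 2)
y_true = np.zeros((2, 3, 2))
y_true[0, 1, 0] = 1      # annotator 0 labels sample 0 as class 1
y_true[:, :, 1] = -1     # annotator 1 labeled nothing
g = masked_grad(y_true, y_pred)
print(g[:, :, 1])  # all zeros: masked annotators contribute no gradient
```

If the masked positions really do get zero gradient, the discrepancy is more likely explained by the optimizer settings and step counts noted above than by the masking itself.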