Optimization: loss doesn't change in a network with a linear layer after the softmax

In this network I am trying to compute the weighted sum of the outputs of layer_2 (the softmax layer). The final linear layer does not take part in training.

Does anyone know why the loss stops changing after the first epoch?

import numpy as np
import tensorflow as tf  # TensorFlow 1.x API

learning_rate = 0.01  # not defined in the original snippet; placeholder value

# Inputs and Placeholders
x = tf.placeholder(tf.float32, shape=[None, 30])
y_ = tf.placeholder(tf.float32)

# Inference
W_1 = tf.Variable(tf.zeros([30,50]))
b_1 = tf.Variable(tf.zeros([50]))
layer_1 = tf.sigmoid(tf.add(tf.matmul(x, W_1), b_1))

W_2 = tf.Variable(tf.zeros([50,25]))
b_2 = tf.Variable(tf.zeros([25]))
layer_1_value = tf.add(tf.matmul(layer_1, W_2), b_2)
layer_2 = tf.nn.softmax(layer_1_value)

# W_3 is a fixed (non-trainable) weight vector: 55, 65, ..., 295.
w_linear = np.array([item + 5 for item in range(50, 300, 10)])  # xrange is Python 2 only
W_3 = tf.Variable(w_linear, trainable=False)
y = tf.reduce_sum(tf.multiply(tf.to_float(W_3), layer_2))  # weighted sum over all softmax outputs in the batch

# Loss
mean_square = tf.losses.mean_squared_error(y_, y)
loss = tf.reduce_mean(mean_square, name='square_mean')

# Training
tf.summary.scalar('loss', loss)
optimizer = tf.train.GradientDescentOptimizer(learning_rate)
global_step = tf.Variable(0, name='global_step', trainable=False)
train_op = optimizer.minimize(loss, global_step=global_step)
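
For context on what y represents: layer_2 is a probability distribution over 25 bins, and W_3 holds the 25 fixed bin values (55, 65, ..., 295), so each example contributes the expected bin value under its softmax distribution. A small NumPy sketch of that per-example computation (the uniform distribution here is illustrative, not taken from the model):

# Illustrative only: expected value of the fixed bin values under one softmax output.
import numpy as np

bin_values = np.array([v + 5 for v in range(50, 300, 10)], dtype=np.float64)  # 55, 65, ..., 295
probs = np.full(25, 1.0 / 25)      # the uniform distribution softmax produces for all-equal logits
print(np.sum(bin_values * probs))  # 175.0, the mean of the bin values

This is in fact the situation at initialization: with all-zero weights the logits are all equal, so every example starts out contributing 175.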

Comments:

"What is the purpose of putting a layer after the softmax? That is very unusual. Could you set that layer to trainable=True and see whether it learns anything?"

(Asker) "But the last layer should not change. Each node of the softmax layer represents a value range, and we are trying to get the final value from those ranges."

"No... I mean, just try it and see whether the loss decreases. If it does not, the error is probably somewhere else."

(Asker) "I just replaced W_1 = tf.Variable(tf.zeros([30,50])) with W_1 = tf.Variable(tf.random_normal([30,50], stddev=0.35)) and it works now. I guess I should not have used zeros as the initial values."

Training log from the original zero-initialized version:
('Epoch:0001', 'cost=2499180.068965517')
('Epoch:0002', 'cost=2335760.384482760')
('Epoch:0003', 'cost=2335760.384482760')
('Epoch:0004', 'cost=2335760.384482760')
('Epoch:0005', 'cost=2335760.384482760')
('Epoch:0006', 'cost=2335760.384482760')
('Epoch:0007', 'cost=2335760.384482760')
('Epoch:0008', 'cost=2335760.384482760')
('Epoch:0009', 'cost=2335760.384482760')
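
As the last comment notes, the culprit is the all-zero initialization: every hidden unit then starts out identical and receives identical gradient updates, so the symmetry is never broken and training stalls after the first adjustment. A minimal sketch of the fix (stddev=0.35 is taken from the comment; initializing W_2 randomly as well is an assumption, since the comment only mentions W_1):

# Break the symmetry of all-zero weights with random initialization.
W_1 = tf.Variable(tf.random_normal([30, 50], stddev=0.35))
b_1 = tf.Variable(tf.zeros([50]))
layer_1 = tf.sigmoid(tf.add(tf.matmul(x, W_1), b_1))

# Assumption: the comment only mentions W_1, but the same symmetry argument applies to W_2.
W_2 = tf.Variable(tf.random_normal([50, 25], stddev=0.35))
b_2 = tf.Variable(tf.zeros([25]))
layer_2 = tf.nn.softmax(tf.add(tf.matmul(layer_1, W_2), b_2))

The biases can stay at zero; it is the weight matrices that must differ across units for the gradients to differ.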