Python: will the weights be initialized again when the next batch comes in?


This is my fully connected layer; I use it in this graph:

def weight_variable(shape, l2_reg_lambda=None, l1_reg_lambda=None):
  regularizer = None
  if l2_reg_lambda:
    regularizer = tf.contrib.layers.l2_regularizer(l2_reg_lambda)
  elif l1_reg_lambda:
    regularizer = tf.contrib.layers.l1_regularizer(l1_reg_lambda)
  return tf.get_variable('weight', shape,
                         initializer=tf.random_normal_initializer(stddev=0.1),
                         regularizer=regularizer)


def bias_variable(shape):
  return tf.get_variable('bias', shape, initializer=tf.constant_initializer(0.1))


def full_connect(inputs, num_units, activation=None, name='full_connect'):
  with tf.variable_scope(name):
    shape = [inputs.get_shape()[-1], num_units]
    weight = weight_variable(shape)
    bias = bias_variable(shape[-1])
    outputs = tf.matmul(inputs, weight) + bias
    if activation == "relu":
      outputs = tf.nn.relu(outputs)
    elif activation == "tanh":
      outputs = tf.tanh(outputs)
    elif activation == "sigmoid":
      outputs = tf.nn.sigmoid(outputs)
    return outputs
When the next batch of data is fed in, are the weights created in weight_variable initialized again, or are they drawn from the random normal distribution only the first time?
Thanks.

Variable initialization has nothing to do with the input data.


Once the graph is built, your variables are allocated and initialized. After that, the graph is static and does not change; initialization happens only once.

Variable values live only in the session (tf.Session), not in the graph. As long as you keep the same session alive between batches, the weights are not re-initialized.
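The point above can be sketched without TensorFlow at all: values live in a session-like store, are drawn once at initialization, and are only updated afterwards. (`Session`, `initialize`, and `train_step` here are toy stand-ins invented for illustration, not TF APIs.)

```python
import random

class Session:
    """Toy stand-in for tf.Session: variable values live here, not in the graph."""
    def __init__(self):
        self.values = {}

def initialize(sess, shape, seed=0):
    # Runs once, like tf.global_variables_initializer(): draw from N(0, 0.1).
    rng = random.Random(seed)
    sess.values['weight'] = [rng.gauss(0.0, 0.1) for _ in range(shape)]

def train_step(sess, batch):
    # Each batch only *updates* the existing values; nothing is re-drawn.
    w = sess.values['weight']
    sess.values['weight'] = [wi + 0.01 * b for wi, b in zip(w, batch)]

sess = Session()
initialize(sess, shape=3)
w0 = list(sess.values['weight'])
train_step(sess, [1.0, 1.0, 1.0])
train_step(sess, [1.0, 1.0, 1.0])
# After two "batches" every weight has drifted by exactly 0.02 from its
# initial draw -- it was updated, never re-initialized.
```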


As E_net4 commented, initialization must be done explicitly inside the session.

Note, though, that the variable initialization step must be explicit.
nn_layers.full_connect_(self.wide_deep_embed, config.num_classes, activation='sigmoid', name='output_layer')