TensorFlow: am I using tf.get_variable() correctly?
From what I have read, it is recommended to always use tf.get_variable(...), although this seems a bit cumbersome when I try to implement a network.

For example:
def create_weights(shape, name='weights',
                   initializer=tf.random_normal_initializer(0, 0.1)):
    weights = tf.get_variable(name, shape, initializer=initializer)
    print("weights created named: {}".format(weights.name))
    return weights
def LeNet(in_units, keep_prob):
    # define the network
    with tf.variable_scope("conv1"):
        conv1 = conv(in_units, create_weights([5, 5, 3, 32]), create_bias([32]))
        pool1 = maxpool(conv1)
    with tf.variable_scope("conv2"):
        conv2 = conv(pool1, create_weights([5, 5, 32, 64]), create_bias([64]))
        pool2 = maxpool(conv2)
    # reshape the network to feed it into the fully connected layers
    with tf.variable_scope("flatten"):
        flatten = tf.reshape(pool2, [-1, 1600])
        flatten = dropout(flatten, keep_prob)
    with tf.variable_scope("fc1"):
        fc1 = fc(flatten, create_weights([1600, 120]), biases=create_bias([120]))
        fc1 = dropout(fc1, keep_prob)
    with tf.variable_scope("fc2"):
        fc2 = fc(fc1, create_weights([120, 84]), biases=create_bias([84]))
    with tf.variable_scope("logits"):
        logits = fc(fc2, create_weights([84, 43]), biases=create_bias([43]))
    return logits
Every time I call create_weights I also have to wrap it in tf.variable_scope(...). Moreover, if I want to change the conv1 weights to shape [7, 7, 3, 32] instead of [5, 5, 3, 32], I have to restart the kernel, because the variable already exists. If I use tf.Variable(...) instead, I don't run into either of these problems.
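To make the difference concrete, here is a minimal sketch of the two behaviors (written against the tf.compat.v1 API so it also runs under TensorFlow 2, which is an assumption about the setup): tf.Variable silently uniquifies a clashing name, while tf.get_variable consults the variable store and raises a ValueError when the variable already exists and reuse was not requested.

```python
import tensorflow as tf

# run in graph mode so the variable store / name uniquification applies
tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

# tf.Variable never reuses: a name clash is resolved by renaming
a = tf.compat.v1.Variable(tf.zeros([2]), name="weights")
b = tf.compat.v1.Variable(tf.zeros([2]), name="weights")
print(a.name, b.name)  # weights:0 weights_1:0

# tf.get_variable checks the variable store: creating the same
# variable a second time (without reuse) raises a ValueError
with tf.compat.v1.variable_scope("conv1"):
    w = tf.compat.v1.get_variable("weights", [5, 5, 3, 32])

err = None
try:
    with tf.compat.v1.variable_scope("conv1"):
        tf.compat.v1.get_variable("weights", [7, 7, 3, 32])
except ValueError as e:
    err = e
print("second get_variable call raised:", type(err).__name__)
```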
Am I using tf.variable_scope(...) incorrectly? It seems that you cannot change anything that already exists inside a variable scope, so a previously defined variable can only be changed after restarting the kernel. (In fact, you then create a new variable, because the previous one has been deleted.)
That is just my guess... I would be grateful if someone could give a detailed answer.
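For reference, a minimal sketch of the two workarounds that seem to address both annoyances (again using the tf.compat.v1 API, as an assumption): reuse=tf.compat.v1.AUTO_REUSE makes get_variable return the existing variable instead of raising, and tf.compat.v1.reset_default_graph() clears the graph's variable store, so a shape can be changed without restarting the kernel.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()
tf.compat.v1.reset_default_graph()

# AUTO_REUSE: create the variable on the first call, return the
# existing one on subsequent calls instead of raising an error
with tf.compat.v1.variable_scope("conv1", reuse=tf.compat.v1.AUTO_REUSE):
    w1 = tf.compat.v1.get_variable("weights", [5, 5, 3, 32])
with tf.compat.v1.variable_scope("conv1", reuse=tf.compat.v1.AUTO_REUSE):
    w2 = tf.compat.v1.get_variable("weights", [5, 5, 3, 32])
print(w1 is w2)  # the very same variable object is returned

# To change the shape, clear the graph (and with it the variable
# store) instead of restarting the kernel
tf.compat.v1.reset_default_graph()
with tf.compat.v1.variable_scope("conv1"):
    w3 = tf.compat.v1.get_variable("weights", [7, 7, 3, 32])
print(w3.shape.as_list())
```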