Machine learning: how do I know whether to wrap ops in name_scope or variable_scope for TensorBoard (with example)?

When grouping operations for visualization, there is a big difference between the graphs produced by the two programs below. tf.name_scope gives two rnn outputs, one inside the name_scope and one outside it, whereas tf.variable_scope gives a much cleaner representation. How do I know whether I have to wrap something in a variable_scope rather than a name_scope (apart from the obvious get_variable case)?
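
For reference, here is a minimal TF 1.x sketch (separate from the two programs below; the scope names ns and vs are purely illustrative) of how the two scope types treat tf.get_variable versus ordinary ops:

import tensorflow as tf

with tf.name_scope("ns"):
    v1 = tf.get_variable("v1", shape=[1])  # name_scope is ignored by get_variable
    a = tf.add(v1, 1.0, name="add")        # but op names are prefixed: "ns/add"

with tf.variable_scope("vs"):
    v2 = tf.get_variable("v2", shape=[1])  # variable_scope prefixes the variable: "vs/v2"
    b = tf.add(v2, 1.0, name="add")        # and the op as well: "vs/add"

print(v1.name, a.name)  # v1:0 ns/add:0
print(v2.name, b.name)  # vs/v2:0 vs/add:0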


Edit: the two code listings below were identical because of a copy-paste error; this has been fixed. The syntactic difference between them is still minimal.
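
# Variant 1: dynamic_rnn is wrapped in tf.name_scope("alias")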
import tensorflow as tf

with tf.name_scope("bongo"):
    x = tf.placeholder(tf.float32, [1, None, 50], name='myxxxx')
    lstm = tf.contrib.rnn.BasicLSTMCell(77)
    drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=0.5)
    cell = tf.contrib.rnn.MultiRNNCell([drop] * 3)

    initial_state = cell.zero_state(33, tf.float32)

with tf.name_scope("alias"):
    outputs, final_state = tf.nn.dynamic_rnn(
        cell,
        x, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    fw = tf.summary.FileWriter('/tmp/testmodel/1', sess.graph)
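
# Variant 2: same model, but dynamic_rnn is wrapped in tf.variable_scope("alias")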
import tensorflow as tf

with tf.name_scope("bongo"):
    x = tf.placeholder(tf.float32, [1, None, 50], name='myxxxx')
    lstm = tf.contrib.rnn.BasicLSTMCell(77)
    drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=0.5)
    cell = tf.contrib.rnn.MultiRNNCell([drop] * 3)

    initial_state = cell.zero_state(33, tf.float32)

with tf.variable_scope("alias"):
    outputs, final_state = tf.nn.dynamic_rnn(
        cell,
        x, dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    fw = tf.summary.FileWriter('/tmp/testmodel/1', sess.graph)
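
One way to see where the difference comes from (a quick check added for illustration, not part of the original listings) is to print the variable and output names after building either variant; dynamic_rnn creates its weights via get_variable, so only the variable_scope version prefixes them with alias/, which matches the extra rnn box TensorBoard draws for the name_scope version:

# Quick inspection of where the rnn weights and ops ended up
for v in tf.global_variables():
    print("variable:", v.name)
print("rnn output op:", outputs.name)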