
Tensorflow: tf.random_normal gives different results with the same initial seed


I want to create a reusable random tensor x and assign the same tensor to a variable y. That means they should hold the same values during Session.run(). But that is not the case. So why is y not equal to x?

Update: after applying sess.run(x) and sess.run(y) several times in a row, I confirmed that x changes on every run while y stays stable. Why?

import tensorflow as tf

x = tf.random_normal([3], seed = 1)
y = tf.Variable(initial_value = x) # expect y get the same random tensor as x

diff = tf.subtract(x, y)
avg = tf.reduce_mean(diff)

sess = tf.InteractiveSession()
sess.run(y.initializer)

print('x0:', sess.run(x))
print('y0:', sess.run(y))
print('x1:', sess.run(x))
print('y1:', sess.run(y))
print('x2:', sess.run(x))
print('y2:', sess.run(y))
print('diff:', sess.run(diff))
print('avg:', sess.run(avg)) # expected as 0.0

sess.close()
Output: the tensor x changes on every sess.run(x).

The real reason is that x = tf.random_normal(seed=initial_seed) evolves on every sess.run(), but if you restart the script you get the same sequence of tensors x0, x1, x2 again. The seed fixes the sequence of draws, not a single value; the documentation for tf.set_random_seed provides some explanation of how random seeds work.
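This is how any seeded pseudo-random generator behaves: the seed pins the sequence of draws, not one repeated value. A minimal pure-Python sketch of the same idea, using random.Random as a stand-in for the seeded op:

```python
import random

# Two generators seeded identically model restarting the script:
# they produce the same sequence of draws.
g1 = random.Random(1)
g2 = random.Random(1)

seq1 = [g1.random() for _ in range(3)]
seq2 = [g2.random() for _ in range(3)]
print(seq1 == seq2)        # True: same seed -> same sequence

# Within one run, successive draws still differ: the seed fixes the
# sequence of values, not a single repeated value -- just like
# repeated sess.run(x) on a seeded tf.random_normal op.
print(seq1[0] != seq1[1])  # True
```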

To guarantee the same x after every run, we would need to re-initialize it. I am not sure there is a proper way to do that for my case, but we can make x a variable and initialize it with a fixed seed; both tf.get_variable and tf.Variable work. I think this fits my problem.
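Conceptually, this fix works because a variable evaluates its initializer exactly once and then serves a cached value on every read. A toy model of that behavior (ToyVariable is a hypothetical class for illustration, not the TensorFlow API):

```python
import random

class ToyVariable:
    """Toy sketch: evaluate the initializer once, cache the result."""
    def __init__(self, initializer):
        self._initializer = initializer
        self._value = None

    def initialize(self):
        self._value = self._initializer()  # the single draw happens here

    def read(self):
        return self._value                 # cached value on every read

rng = random.Random(1)
y = ToyVariable(lambda: [rng.random() for _ in range(3)])
y.initialize()
print(y.read() == y.read())  # True: reads never re-run the initializer
```

This is why y stays stable across sess.run(y) calls while the bare random op keeps drawing new values.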

Here is my final code. It works:

import tensorflow as tf

initializer = tf.random_normal_initializer(seed = 1)
x = tf.get_variable(name = 'x', shape = [3], dtype = tf.float32, initializer = initializer)
y = tf.Variable(initial_value = x)

diff = tf.subtract(x, y)
avg = tf.reduce_mean(diff)

sess = tf.InteractiveSession()
sess.run(tf.global_variables_initializer())

print('x0:', sess.run(x))
print('y0:', sess.run(y))

print('x1:', sess.run(x))
print('y1:', sess.run(y))

print('x2:', sess.run(x))
print('y2:', sess.run(y))

print('diff:', sess.run(diff))
print('avg:', sess.run(avg))
sess.close()

x0: [-0.8113182   1.4845988   0.06532937]
y0: [-0.8113182   1.4845988   0.06532937]
x1: [-0.8113182   1.4845988   0.06532937]
y1: [-0.8113182   1.4845988   0.06532937]
x2: [-0.8113182   1.4845988   0.06532937]
y2: [-0.8113182   1.4845988   0.06532937]
diff: [0. 0. 0.]
avg: 0.0

Please try sess.run(tf.global_variables_initializer()) instead of your initialization. Does that help? Maybe also set the tensor length to 2 so you can compare the two at a glance. @Lau sess.run(tf.global_variables_initializer()) is the same as sess.run(y.initializer). Apply sess.run(tf.equal(x, y)) and you get a tensor that is all False. Try it. OK, sorry, I have now tested some code. tf.random_normal draws from the normal distribution on every call; it is meant for cases where you want fresh noise each time. If the value should be static, use a variable instead: tf.get_variable("test", shape=[3], initializer=tf.random_normal_initializer())