
Python: TensorFlow placeholder for a tf.Variable


The code below throws a "No gradients" error.

I believe this is because my loss function depends on placeholders rather than variables, but in my training function I feed a variable's value into the placeholder.


Is there a way to create a placeholder for a variable?
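The error can be reproduced with a minimal sketch (an assumption about the questioner's setup, using TF1-style graph mode via `tf.compat.v1`): when the loss is built only from placeholders, there is no gradient path from the loss to any trainable variable, so the optimizer has nothing to update.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF1-style graph mode

# A trainable variable that the loss does NOT depend on (hypothetical setup)
w = tf.Variable(1.0, dtype=tf.float64)

# Loss built entirely from a placeholder, as in the question
x = tf.compat.v1.placeholder(tf.float64, shape=(3,))
loss = tf.reduce_mean(x)

# No gradient path from the loss to any trainable variable
grads = tf.gradients(loss, [w])
print(grads)  # [None] -- this is why minimize() complains about missing gradients
```

Feeding a variable's *value* through `feed_dict` does not help: the placeholder is a graph input, so backpropagation stops there and never reaches the variable.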

I think the problem is that there are no trainable variables in your model. Backpropagation adjusts trainable parameters based on the loss, and unfortunately there are none here.

self.x1 = tf.placeholder(tf.float64)
self.x2 = tf.placeholder(tf.float64)
self.x3 = tf.placeholder(tf.float64)

# Cosine similarity between x1 and x2
self.cos1_denom = tf.norm(self.x1, axis=0) * tf.norm(self.x2, axis=0)
self.cos1 = tf.matmul(self.x1, self.x2, transpose_b=True) / self.cos1_denom
# Cosine similarity between x1 and x3 (note: the denominator must use x3, not x2)
self.cos2_denom = tf.norm(self.x1, axis=0) * tf.norm(self.x3, axis=0)
self.cos2 = tf.matmul(self.x1, self.x3, transpose_b=True) / self.cos2_denom
# Pull x1 towards x2 and push it away from x3
self.loss = tf.reduce_mean(self.cos2) - tf.reduce_mean(self.cos1)
self.optimizer = tf.train.AdamOptimizer(learning_rate=self.eta).minimize(self.loss)
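One way to make the loss trainable is to declare `x1`, `x2`, `x3` as `tf.Variable` instead of `tf.placeholder`. A minimal sketch of that fix, assuming row vectors of shape `(1, 4)` and a learning rate of 0.01 (both are illustrative choices, not from the question):

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF1-style graph mode

# Hypothetical initial values; the (1, 4) shape is an assumption for illustration
rng = np.random.RandomState(0)
x1 = tf.Variable(rng.randn(1, 4), dtype=tf.float64)  # trainable, unlike a placeholder
x2 = tf.Variable(rng.randn(1, 4), dtype=tf.float64)
x3 = tf.Variable(rng.randn(1, 4), dtype=tf.float64)

# Same loss structure as in the question: pull x1 towards x2, push away from x3
cos1 = tf.matmul(x1, x2, transpose_b=True) / (tf.norm(x1) * tf.norm(x2))
cos2 = tf.matmul(x1, x3, transpose_b=True) / (tf.norm(x1) * tf.norm(x3))
loss = tf.reduce_mean(cos2) - tf.reduce_mean(cos1)

# Now there IS a gradient path from the loss to trainable variables
train_op = tf.compat.v1.train.AdamOptimizer(learning_rate=0.01).minimize(loss)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    loss_before = sess.run(loss)
    for _ in range(200):
        sess.run(train_op)
    loss_after = sess.run(loss)
```

If the vectors must still be fed from outside, another common pattern is to keep them as `tf.Variable` and assign new values with `variable.load(value, sess)` or an explicit `tf.compat.v1.assign` op before each training step.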