Python: How to assign a new value to a TensorFlow constant?


I am loading a TensorFlow model from a .pb file. I want to change the weights of all the layers. I am able to extract the weights, but I cannot change them.

I converted the graph_def into a TensorFlow graph, but even then I cannot assign new values to the weights, because the weights are stored in tensors of type "Const".

I get the following error:

AttributeError: 'Tensor' object has no attribute 'assign'

Please suggest a way to solve this problem. Thanks in advance.

Here is one way you can achieve something like that. You want to replace some constant operations with variables initialized to the values of those operations, so you can first extract those constant values and then build a new graph in which variables are initialized with those values and mapped in place of the constants. See the example below.

import tensorflow as tf

# Example graph
with tf.Graph().as_default():
    inp = tf.placeholder(tf.float32, [None, 3], name='Input')
    w = tf.constant([[1.], [2.], [3.]], tf.float32, name='W')
    out = tf.squeeze(inp @ w, 1, name='Output')
    gd = tf.get_default_graph().as_graph_def()

# Extract weight values
with tf.Graph().as_default():
    w, = tf.graph_util.import_graph_def(gd, return_elements=['W:0'])
    # Get the constant weight values
    with tf.Session() as sess:
        w_val = sess.run(w)
    # Alternatively, since it is a constant,
    # you can get the values from the operation attribute directly
    w_val = tf.make_ndarray(w.op.get_attr('value'))

# Make new graph
with tf.Graph().as_default():
    # Make variables initialized with stored values
    w = tf.Variable(w_val, name='W')
    init_op = tf.global_variables_initializer()
    # Import graph
    inp, out = tf.graph_util.import_graph_def(
        gd, input_map={'W:0': w},
        return_elements=['Input:0', 'Output:0'])
    # Change value operation
    w_upd = w[2].assign([5.])
    # Test
    with tf.Session() as sess:
        sess.run(init_op)
        print(sess.run(w))
        # [[1.]
        #  [2.]
        #  [3.]]
        sess.run(w_upd)
        print(sess.run(w))
        # [[1.]
        #  [2.]
        #  [5.]]
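For context on the original error: `Tensor` objects (including constants) are immutable and have no `assign` method; only `tf.Variable` supports in-place updates. A minimal standalone sketch of that distinction (not the asker's .pb model; written in TF 2.x eager style, unlike the graph-mode answer above):

```python
import tensorflow as tf

# A constant tensor is immutable: it has no assign() method,
# which is why calling c.assign(...) raises AttributeError.
c = tf.constant([[1.], [2.], [3.]])
print(hasattr(c, 'assign'))   # False

# A tf.Variable holds mutable state and supports assign().
v = tf.Variable([[1.], [2.], [3.]])
v.assign([[1.], [2.], [5.]])
print(v.numpy()[2, 0])        # 5.0
```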