Python tf.keras GradientTape: getting gradients with respect to the input
TensorFlow version: TensorFlow 2.1. I want to get the gradients with respect to the input, not with respect to the trainable weights. I adjusted the example from … accordingly.

However, this does not work, because grads = tape.gradient(loss_value, model.inputs) returns [None]. Is this intended behavior? If so, what is the recommended way to get the gradients with respect to the input?

To make it work, two things need to be added:

1. Convert the input image to a tf.Variable
2. Use tape.watch to record gradients with respect to the desired variable
image = tf.Variable(input)

# Create the optimizer once, outside the loop, so its internal state
# (e.g. Adam's moment estimates) persists across iterations.
optimizer = tf.keras.optimizers.Adam(learning_rate=0.001)

for iteration in range(400):
    with tf.GradientTape() as tape:
        tape.watch(image)
        # Run the forward pass of the model. The operations applied
        # to the input are recorded on the GradientTape.
        prediction = model(image, training=False)  # Logits for this minibatch
        # Compute the loss value for this minibatch.
        loss_value = loss_fun(target, prediction)

    # Use the gradient tape to retrieve the gradients of the loss
    # with respect to the input image (not the trainable weights).
    grads = tape.gradient(loss_value, image)

    # Run one step of gradient descent by updating the input image
    # to minimize the loss.
    optimizer.apply_gradients(zip([grads], [image]))
    print('Iteration {}'.format(iteration))
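As a side note: if you only need the input gradients (for example, for a saliency map) and do not intend to update the input with an optimizer, wrapping the input in a tf.Variable is not required — tape.watch on a plain tensor is enough. A minimal self-contained sketch, using an assumed toy Dense model in place of the question's model and loss:

```python
import tensorflow as tf

# Assumed toy stand-ins for the question's model, target, and loss_fun.
model = tf.keras.Sequential([tf.keras.layers.Dense(3)])
loss_fun = tf.keras.losses.MeanSquaredError()

x = tf.constant([[1.0, 2.0, 3.0, 4.0]])       # plain tensor, not a tf.Variable
target = tf.constant([[0.0, 1.0, 0.0]])

with tf.GradientTape() as tape:
    tape.watch(x)                              # explicitly record ops on x
    prediction = model(x, training=False)
    loss_value = loss_fun(target, prediction)

grads = tape.gradient(loss_value, x)           # same shape as x, not None
print(grads.shape)
```

Variables are watched by the tape automatically, which is why the loop above works once the image is a tf.Variable; tape.watch is what makes the same thing work for a constant tensor.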