Python: how to compute second derivatives (the diagonal of the Hessian) in TensorFlow 2.0

I want to compute the second derivative (the diagonal of the Hessian) with respect to every component of every variable in TensorFlow 2.0. I would also like to be able to run this inside a tf.function.

I got it working in eager mode on Google Colab (with a small test as well):

But this implementation is slow, and I could not get it to work with tf.function. Does anyone have a better approach? Is there a way to take these gradients without referencing each individual component of a tensor inside the tape? Thanks in advance.

Here is the answer for TF 1.x. My TF 2.0 eager-mode implementation, with the small test, is below:

%tensorflow_version 2.x  # for colab
import tensorflow as tf

x = tf.Variable([[1.], [2.]])
z = tf.Variable([[3., 4.]])

with tf.GradientTape(persistent=True) as tape:
  with tf.GradientTape() as tape2:
    y = (z @ x)**2

  grads = tape2.gradient(y, [x, z])
  # We want references to each component of our Variables in-order
  # This needs to be done in the gradient tape, otherwise, we can't take
  # gradients w.r.t. each component individually
  grads_list = [list(tf.reshape(grad, [-1])) for grad in grads]

second_derivatives_list = []
for grad_list, var in zip(grads_list, [x, z]):
  # The gradient with respect to var has the same shape as var,
  # so we pick out the component that corresponds to the entry in grad_list
  temp = tf.stack([
    tf.reshape(tape.gradient(g, var), [-1])[i] for i, g in enumerate(grad_list)
  ])
  second_derivatives_list.append(tf.reshape(temp, var.shape))

del tape

assert list(second_derivatives_list[0].numpy().transpose()[0]) == [18., 32.]
assert list(second_derivatives_list[1].numpy()[0]) == [2., 8.]
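
For reference, the expected values in the asserts can be checked by hand. With $y = (zx)^2$ and $zx = 3\cdot 1 + 4\cdot 2 = 11$:

$\partial y / \partial x_i = 2(zx)\,z_i$, so $\partial^2 y / \partial x_i^2 = 2 z_i^2 = (18, 32)$
$\partial y / \partial z_i = 2(zx)\,x_i$, so $\partial^2 y / \partial z_i^2 = 2 x_i^2 = (2, 8)$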
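
For what it's worth, one possible alternative that avoids indexing individual gradient components inside the tape is to take the Jacobian of the gradient with the outer tape and keep only its diagonal. This is only a sketch, not from the original post: the helper name diag_hessians is made up, it relies on GradientTape.jacobian (available in TF 2.x), and it materializes the full Hessian block of each variable before extracting the diagonal.

import tensorflow as tf

x = tf.Variable([[1.], [2.]])
z = tf.Variable([[3., 4.]])

@tf.function
def diag_hessians():
  with tf.GradientTape(persistent=True) as outer:
    with tf.GradientTape() as inner:
      y = (z @ x)**2
    grads = inner.gradient(y, [x, z])

  diags = []
  for g, v in zip(grads, [x, z]):
    n = v.shape.num_elements()
    # The Jacobian of the gradient w.r.t. v has shape v.shape + v.shape;
    # flatten it to [n, n] and keep only the diagonal (the second derivatives).
    hess = tf.reshape(outer.jacobian(g, v), [n, n])
    diags.append(tf.reshape(tf.linalg.diag_part(hess), v.shape))
  return diags

d2x, d2z = diag_hessians()  # should match [18., 32.] and [2., 8.] above

With variables this small the extra work is negligible, but for large variables computing the full Hessian block only to keep its diagonal is expensive, which is the usual trade-off with this approach.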