Python: Get gradients with Keras / TensorFlow 2.0


I would like to track the gradients in TensorBoard. However, since a session run statement is no longer a thing and the write_grads argument of tf.keras.callbacks.TensorBoard has been removed, I would like to know how to keep track of the gradients during training with Keras / TensorFlow 2.0.

My current approach is to create a new callback class for this, but without success. Maybe someone else knows how to accomplish this kind of advanced stuff.

The code created for testing is shown below; it runs into errors both when printing the gradient values to the console and when writing them to TensorBoard:

import tensorflow as tf
from tensorflow.python.keras import backend as K

mnist = tf.keras.datasets.mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.models.Sequential([
  tf.keras.layers.Flatten(input_shape=(28, 28)),
  tf.keras.layers.Dense(128, activation='relu', name='dense128'),
  tf.keras.layers.Dropout(0.2),
  tf.keras.layers.Dense(10, activation='softmax', name='dense10')
])

model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])


class GradientCallback(tf.keras.callbacks.Callback):
    console = True

    def on_epoch_end(self, epoch, logs=None):
        weights = [w for w in self.model.trainable_weights if 'dense' in w.name and 'bias' in w.name]
        loss = self.model.total_loss
        optimizer = self.model.optimizer
        gradients = optimizer.get_gradients(loss, weights)
        for t in gradients:
            if self.console:
                print('Tensor: {}'.format(t.name))
                print('{}\n'.format(K.get_value(t)[:10]))
            else:
                tf.summary.histogram(t.name, data=t)


file_writer = tf.summary.create_file_writer("./metrics")
file_writer.set_as_default()

# write_grads has been removed
tensorboard_cb = tf.keras.callbacks.TensorBoard(histogram_freq=1, write_grads=True)
gradient_cb = GradientCallback()

model.fit(x_train, y_train, epochs=5, callbacks=[gradient_cb, tensorboard_cb])
  • Printing the initial bias gradients to the console (console flag = True) results in:
    AttributeError: 'Tensor' object has no attribute 'numpy'
  • Writing to TensorBoard (console flag = False) results in:
    TypeError: Using a tf.Tensor as a Python bool is not allowed. Use if t is not None: instead of if t: to test if a tensor is defined, and use TensorFlow ops such as tf.cond to execute subgraphs conditioned on the value of a tensor.
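Both failure modes come from optimizer.get_gradients producing symbolic graph tensors, which have no eager value in TF 2.x. For comparison (this sketch is mine, not from the question), here is a minimal callback that computes the gradients eagerly with tf.GradientTape instead; the toy model, the explicit MSE loss, and the random batch are stand-ins for the MNIST setup above:

```python
import numpy as np
import tensorflow as tf

class EagerGradientCallback(tf.keras.callbacks.Callback):
    """Computes gradients with tf.GradientTape instead of
    optimizer.get_gradients, so the results are eager tensors
    and .numpy() works."""
    def __init__(self, x_batch, y_batch):
        super().__init__()
        self.x_batch = tf.convert_to_tensor(x_batch)
        self.y_batch = tf.convert_to_tensor(y_batch)
        self.grads = None

    def on_epoch_end(self, epoch, logs=None):
        with tf.GradientTape() as tape:
            y_pred = self.model(self.x_batch, training=False)
            # explicit loss function; trainable weights are watched automatically
            loss = tf.reduce_mean(
                tf.keras.losses.mean_squared_error(self.y_batch, y_pred))
        self.grads = tape.gradient(loss, self.model.trainable_weights)
        for w, g in zip(self.model.trainable_weights, self.grads):
            # g is an eager tensor here, so .numpy() is fine
            print('Gradient for {}: {}'.format(w.name, g.numpy().ravel()[:3]))

# tiny stand-in model and data
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                             tf.keras.layers.Dense(3, name='dense3')])
model.compile(optimizer='adam', loss='mse')
x = np.random.rand(8, 4).astype('float32')
y = np.random.rand(8, 3).astype('float32')
cb = EagerGradientCallback(x, y)
model.fit(x, y, epochs=1, verbose=0, callbacks=[cb])
```

The key difference from the question's callback is that the forward pass and loss are recomputed inside the tape, rather than reusing model.total_loss, which is a graph tensor.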

To compute the gradients of the loss with respect to the weights, use

with tf.GradientTape() as tape:
    loss = model(model.trainable_weights)

tape.gradient(loss, model.trainable_weights)
This is (arguably badly) documented.

We do not need tape.watch on the variables, because trainable parameters are watched by default.

As a function, it can be written as

def gradient(model, x):
    x_tensor = tf.convert_to_tensor(x, dtype=tf.float32)
    with tf.GradientTape() as t:
        t.watch(x_tensor)
        loss = model(x_tensor)
    return t.gradient(loss, x_tensor).numpy()
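Note that this helper differentiates the model output with respect to the input x, not the weights, which is why the explicit t.watch is needed: plain tensors are not watched automatically, only trainable variables are. A small usage sketch with a made-up toy model:

```python
import numpy as np
import tensorflow as tf

def gradient(model, x):
    # gradient of the model output w.r.t. the *input* x
    x_tensor = tf.convert_to_tensor(x, dtype=tf.float32)
    with tf.GradientTape() as t:
        t.watch(x_tensor)  # inputs are plain tensors, so they must be watched
        loss = model(x_tensor)
    return t.gradient(loss, x_tensor).numpy()

# toy model: 4 inputs -> 1 output
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)),
                             tf.keras.layers.Dense(1)])
x = np.random.rand(2, 4).astype('float32')
g = gradient(model, x)
print(g.shape)  # gradients have the same shape as the input batch: (2, 4)
```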
See here:

richardwth wrote a child class of TensorBoard.

I adapted it as follows:

class ExtendedTensorBoard(tf.keras.callbacks.TensorBoard):
    def _log_gradients(self, epoch):
        writer = self._get_writer(self._train_run_name)

        with writer.as_default(), tf.GradientTape() as g:
            # here we use validation data to calculate the gradients;
            # `val_dataset` is assumed to be a tf.data.Dataset of (features, labels)
            features, y_true = list(val_dataset.batch(100).take(1))[0]

            y_pred = self.model(features)  # forward-propagation
            loss = self.model.compiled_loss(y_true=y_true, y_pred=y_pred)  # calculate loss
            gradients = g.gradient(loss, self.model.trainable_weights)  # back-propagation

            # In eager mode, grads does not have name, so we get names from model.trainable_weights
            for weights, grads in zip(self.model.trainable_weights, gradients):
                tf.summary.histogram(
                    weights.name.replace(':', '_') + '_grads', data=grads, step=epoch)

        writer.flush()

    def on_epoch_end(self, epoch, logs=None):
        # This function overwrites the on_epoch_end in tf.keras.callbacks.TensorBoard
        # but we do need to run the original on_epoch_end, so here we use the super function.
        super(ExtendedTensorBoard, self).on_epoch_end(epoch, logs=logs)

        if self.histogram_freq and epoch % self.histogram_freq == 0:
            self._log_gradients(epoch)

  • Shouldn't it be: with tf.GradientTape() as tape: loss = model(input)? model() returns predictions, not a loss.
  • Err, can we merge this with keras.model.fit and keras.callbacks? Thanks!
  • Is this function supposed to be called from a callback? If it is called from on_epoch_end, what is x?
  • Shouldn't it be 'dense' in w.name or 'bias' in w.name?