How to access the internal variables of a custom loss function in the TensorFlow Keras API?

Tags: tensorflow, keras

I am trying to add an L1 loss on the batch normalization scale factors.

For example, take a simple MNIST classifier:

import tensorflow as tf
from tensorflow import keras

inputs = keras.Input(shape=(28, 28, 1))
conv_1 = keras.layers.Conv2D(
    32, kernel_size=(3, 3), padding='same', activation=tf.nn.relu)(inputs)
bn_1 = keras.layers.BatchNormalization()(conv_1)
conv_2 = keras.layers.Conv2D(
    32, kernel_size=(3, 3), padding='same', activation=tf.nn.relu)(bn_1)
bn_2 = keras.layers.BatchNormalization()(conv_2)
conv_3 = keras.layers.Conv2D(
    32, kernel_size=(3, 3), padding='same', activation=tf.nn.relu)(bn_2)
bn_3 = keras.layers.BatchNormalization()(conv_3)
conv_4 = keras.layers.Conv2D(
    32, kernel_size=(3, 3), padding='same', activation=tf.nn.relu)(bn_3)
bn_4 = keras.layers.BatchNormalization()(conv_4)
conv_5 = keras.layers.Conv2D(
    10, kernel_size=(3, 3), padding='same')(bn_4)
bn_5 = keras.layers.BatchNormalization()(conv_5)
gap = keras.layers.GlobalAveragePooling2D()(bn_5)
outputs = keras.layers.Activation('softmax')(gap)

model = keras.Model(inputs=inputs, outputs=outputs)
My goal is to use the scale factors of the bn_* layers to rank the relative importance of each convolutional filter channel, so that I can prune the unimportant ones.


Is there a way to do this?

If you keep a reference to each BatchNormalization layer object, e.g.
bn_1_layer = keras.layers.BatchNormalization()
, you can then use
bn_1_layer.variables
to read its internal variables, including the scale factor gamma.
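To make that concrete, here is a minimal sketch of the idea (the 1e-4 penalty weight and the single-BN-layer model are illustrative assumptions, not from the question): keep the layer reference, read its per-channel scale factor gamma, rank channels by |gamma|, and attach an L1 penalty on gamma to the model's loss via model.add_loss.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Keep a reference to the layer object so its variables stay reachable.
inputs = keras.Input(shape=(28, 28, 1))
conv_1 = keras.layers.Conv2D(32, (3, 3), padding='same',
                             activation=tf.nn.relu)(inputs)
bn_1_layer = keras.layers.BatchNormalization()
bn_1 = bn_1_layer(conv_1)
gap = keras.layers.GlobalAveragePooling2D()(bn_1)
outputs = keras.layers.Dense(10, activation='softmax')(gap)
model = keras.Model(inputs=inputs, outputs=outputs)

# gamma is the per-channel scale factor; rank channels by |gamma|.
gamma = bn_1_layer.gamma                  # shape (32,), one value per filter
importance = np.abs(gamma.numpy())
ranking = np.argsort(importance)[::-1]    # most important channel first

# An L1 penalty on gamma can be attached to the training loss
# (passing a callable defers evaluation to each training step):
model.add_loss(lambda: 1e-4 * tf.reduce_sum(tf.abs(bn_1_layer.gamma)))
```

Alternatively, Keras's BatchNormalization layer accepts a gamma_regularizer argument, so passing keras.layers.BatchNormalization(gamma_regularizer=keras.regularizers.l1(1e-4)) should achieve the same L1 penalty without a manual add_loss call.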