Keras unexpectedly found an instance of type BatchNormalization. Expected a symbolic tensor instance


I am getting an error while implementing a residual network in Keras. Below is the code that gives the error (the error comes from the first line of the final step in the function definition):

Load the packages:

import numpy as np
import tensorflow as tf
from keras import layers
from keras.layers import Input, Add, Concatenate, Dense, Activation, ZeroPadding2D, BatchNormalization, Flatten, Conv2D, AveragePooling2D, MaxPooling2D, GlobalMaxPooling2D
from keras.models import Model, load_model
from keras.preprocessing import image
from keras.utils import layer_utils
from keras.utils.data_utils import get_file
from keras.applications.imagenet_utils import preprocess_input
import pydot
from IPython.display import SVG
from keras.utils.vis_utils import model_to_dot
from keras.utils import plot_model
from resnets_utils import *
from keras.initializers import glorot_uniform
import scipy.misc
from matplotlib.pyplot import imshow
%matplotlib inline

import keras.backend as K
K.set_image_data_format('channels_last')
K.set_learning_phase(1)
Define the function (the first line of the "final step" is what gives the error):
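The identity_block follows the standard Coursera ResNet assignment layout. A rough sketch of the function is shown below so the traceback can be read in context; the Conv2D arguments and the conv_name_base / bn_name_base naming are assumptions inferred from the traceback and the printed tensor name, and the marked line is the one the error ultimately traces back to:

def identity_block(X, f, filters, stage, block):
    # Sketch of the Coursera-style identity block; relies on the Keras layers
    # and glorot_uniform imported above.
    conv_name_base = 'res' + str(stage) + block + '_branch'
    bn_name_base = 'bn' + str(stage) + block + '_branch'
    F1, F2, F3 = filters

    X_shortcut = X  # save the input for the shortcut path

    # First component of the main path
    X = Conv2D(F1, (1, 1), strides=(1, 1), padding='valid', name=conv_name_base + '2a',
               kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3, name=bn_name_base + '2a')(X)
    X = Activation('relu')(X)

    # Second component of the main path
    X = Conv2D(F2, (f, f), strides=(1, 1), padding='same', name=conv_name_base + '2b',
               kernel_initializer=glorot_uniform(seed=0))(X)
    X = BatchNormalization(axis=3, name=bn_name_base + '2b')(X)
    X = Activation('relu')(X)

    # Third component of the main path
    X = Conv2D(F3, (1, 1), strides=(1, 1), padding='valid', name=conv_name_base + '2c',
               kernel_initializer=glorot_uniform(seed=0))(X)
    print(f'before BatchNormalization: X={X}')
    X = BatchNormalization(axis=3, name=bn_name_base + '2c')   # <-- the layer is created but never called on a tensor
    print(f'after  BatchNormalization: X={X}')

    # Final step: Add shortcut value to main path, and pass it through a RELU activation (≈2 lines)
    X = Add()([X_shortcut, X])
    X = Activation('relu')(X)

    return X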

Call / test the function above:

tf.reset_default_graph()

with tf.Session() as test:
    np.random.seed(1)
    A_prev = tf.placeholder("float", [3, 4, 4, 6])
    X = np.random.randn(3, 4, 4, 6)
    A = identity_block(A_prev, f = 2, filters = [2, 4, 6], stage = 1, block = 'a')
    test.run(tf.global_variables_initializer())
    out = test.run([A], feed_dict={A_prev: X, K.learning_phase(): 0})
    print("out = " + str(out[0][1][1][0]))
Here are the print messages and the error message:

before BatchNormalization: X=Tensor("res1a_branch2c/BiasAdd:0", shape=(3, 4, 4, 6), dtype=float32)
after  BatchNormalization: X=<keras.layers.normalization.BatchNormalization object at 0x7f169c6d9668>

ValueError: Unexpectedly found an instance of type `<class 'keras.layers.normalization.BatchNormalization'>`. Expected a symbolic tensor instance.
Below is the full log (in case it is needed):

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/opt/conda/lib/python3.6/site-packages/keras/engine/topology.py in assert_input_compatibility(self, inputs)
    424             try:
--> 425                 K.is_keras_tensor(x)
    426             except ValueError:

/opt/conda/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py in is_keras_tensor(x)
    399                           tf.SparseTensor)):
--> 400         raise ValueError('Unexpectedly found an instance of type `' + str(type(x)) + '`. '
    401                          'Expected a symbolic tensor instance.')

ValueError: Unexpectedly found an instance of type `<class 'keras.layers.normalization.BatchNormalization'>`. Expected a symbolic tensor instance.

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
<ipython-input-6-b3d1050f50dc> in <module>()
      5     A_prev = tf.placeholder("float", [3, 4, 4, 6])
      6     X = np.random.randn(3, 4, 4, 6)
----> 7     A = identity_block(A_prev, f = 2, filters = [2, 4, 6], stage = 1, block = 'a')
      8     test.run(tf.global_variables_initializer())
      9     out = test.run([A], feed_dict={A_prev: X, K.learning_phase(): 0})

<ipython-input-5-013941ce79d6> in identity_block(X, f, filters, stage, block)
     43 
     44     # Final step: Add shortcut value to main path, and pass it through a RELU activation (≈2 lines)
---> 45     X = Add()([X_shortcut,X])
     46     X = Activation('relu')(X)
     47 

/opt/conda/lib/python3.6/site-packages/keras/engine/topology.py in __call__(self, inputs, **kwargs)
    556                 # Raise exceptions in case the input is not compatible
    557                 # with the input_spec specified in the layer constructor.
--> 558                 self.assert_input_compatibility(inputs)
    559 
    560                 # Collect input shapes to build layer.

/opt/conda/lib/python3.6/site-packages/keras/engine/topology.py in assert_input_compatibility(self, inputs)
    429                                  'Received type: ' +
    430                                  str(type(x)) + '. Full input: ' +
--> 431                                  str(inputs) + '. All inputs to the layer '
    432                                  'should be tensors.')
    433 

ValueError: Layer add_1 was called with an input that isn't a symbolic tensor. Received type: <class 'keras.layers.normalization.BatchNormalization'>. Full input: [<tf.Tensor 'Placeholder:0' shape=(3, 4, 4, 6) dtype=float32>, <keras.layers.normalization.BatchNormalization object at 0x7f169c6d9668>]. All inputs to the layer should be tensors.

I think I am missing something in the final step of the function definition, but I cannot figure out why it errors. Could a Keras expert here help me out?

Always remember to pass the tensor into the layer:

print(f'before BatchNormalization: X={X}');
#X = BatchNormalization(axis=3,name=bn_name_base+'2c')    # <--- INCORRECT
X = BatchNormalization(axis=3,name=bn_name_base+'2c')(X)  # <--- CORRECT
print(f'after  BatchNormalization: X={X}');
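A quick way to see the difference (a minimal check, reusing the Keras 2.x imports from the question): constructing BatchNormalization gives a Layer object, while calling that layer on a tensor gives the symbolic tensor that Add() expects:

from keras.layers import Input, BatchNormalization
import keras.backend as K

inp = Input(shape=(4, 4, 6))

bn_layer = BatchNormalization(axis=3)   # a BatchNormalization Layer object, not a tensor
bn_out = bn_layer(inp)                  # calling the layer on a tensor returns a symbolic tensor

print(type(bn_layer))             # <class 'keras.layers.normalization.BatchNormalization'>
print(K.is_keras_tensor(bn_out))  # True
# K.is_keras_tensor(bn_layer) would raise the very ValueError shown in the traceback above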

The X = BatchNormalization(axis=3, name=bn_name_base+'2c') line comes from the Coursera project, where BatchNormalization is deliberately placed before the 'relu' step: applied after 'relu', BatchNormalization would introduce negative numbers again.

What a small world! I made exactly the same mistake at this very point. Thanks for asking this question! Arigato.

Thank you, you are absolutely right! I did have the (X) after BatchNormalization the first time, but somehow dropped it while fixing other naive bugs. I'm new to Stack Overflow, how can I accept your answer as the answer to my question? ^_^ Also, I didn't copy the return X part of the function definition, since it isn't the part causing the error. My bad, I see it now and have accepted your answer. Thanks again!

@Jonathan No problem, welcome to Stack Overflow. A missing return X will cause problems later (try it and see). Also, as a pro tip, run X.__dict__ on the various X = ... definitions to see the object attributes and how they differ; it helps with learning and debugging (a short example is sketched after these comments).

OK Luca, agreed. I guess I hit ctrl+X instead of ctrl+C while typing the Add() part, maybe (or who knows ^ ^), and the (X) disappeared. As for return X: my original post was very short, and I edited it several times to make the question clearer. return X wasn't in my first edit because I didn't include the function-call part, which clearly isn't what causes the error; later I added the function-call part but forgot to add return X back. I haven't programmed for a while, so I really need to sharpen my programming skills again. lol
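As a minimal illustration of that __dict__ tip (same Keras 2.x / TF 1.x setup assumed): the instance attributes make it obvious whether X ended up as a layer or as a tensor:

from keras.layers import Input, BatchNormalization

inp = Input(shape=(4, 4, 6))

layer_obj = BatchNormalization(axis=3)   # X = BatchNormalization(...)      -> Layer object
tensor_out = layer_obj(inp)              # X = BatchNormalization(...)(inp) -> symbolic tensor

# The layer's attributes are layer configuration (name, axis, trainable, ...),
# while the tensor's attributes are graph bookkeeping (the op that produced it, dtype, ...).
print(sorted(layer_obj.__dict__.keys()))
print(sorted(tensor_out.__dict__.keys()))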