How can I expose a submodel's layers in the parent model's summary in Keras (Python)?

Right now, I have a model named model1:
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_3 (InputLayer) (None, 101, 101, 1) 0
__________________________________________________________________________________________________
up_sampling2d_2 (UpSampling2D) (None, 202, 202, 1) 0 input_3[0][0]
__________________________________________________________________________________________________
zero_padding2d_36 (ZeroPadding2 (None, 256, 256, 1) 0 up_sampling2d_2[0][0]
__________________________________________________________________________________________________
conv2d_3 (Conv2D) (None, 256, 256, 3) 6 zero_padding2d_36[0][0]
__________________________________________________________________________________________________
u-resnet34 (Model) (None, 256, 256, 1) 24453178 conv2d_3[0][0]
__________________________________________________________________________________________________
input_4 (InputLayer) (None, 1, 1, 1) 0
__________________________________________________________________________________________________
cropping2d_2 (Cropping2D) (None, 202, 202, 1) 0 u-resnet34[1][0]
__________________________________________________________________________________________________
lambda_3 (Lambda) (None, 1, 1, 1) 0 input_4[0][0]
__________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D) (None, 101, 101, 1) 0 cropping2d_2[0][0]
__________________________________________________________________________________________________
lambda_4 (Lambda) (None, 101, 101, 1) 0 lambda_3[0][0]
__________________________________________________________________________________________________
concatenate_10 (Concatenate) (None, 101, 101, 2) 0 max_pooling2d_2[0][0]
lambda_4[0][0]
__________________________________________________________________________________________________
conv2d_14 (Conv2D) (None, 101, 101, 1) 3 concatenate_10[0][0]
==================================================================================================
Total params: 24,453,187
Trainable params: 24,437,821
Non-trainable params: 15,366
_____________________________________
The u-resnet34 layer is another model that contains many more layers. I can print its summary, and I can freeze whichever of its layers I want.
When I freeze u-resnet34's layers and print its summary, I can see its trainable parameter count decrease accordingly.
However, even though I am freezing the submodel's layers inside model1, model1's trainable parameter count does not decrease.
How do I freeze u-resnet34's layers so that this is reflected in model1's trainable parameters?
EDIT: Here is my code:
# https://github.com/qubvel/segmentation_models
from segmentation_models import Unet
from keras.models import Model
from keras.layers import Input, Cropping2D, Conv2D
inputs = Input((256, 256, 3))
resnetmodel = Unet(backbone_name='resnet34', encoder_weights='imagenet', input_shape=(256, 256, 3), activation=None)
outputs = resnetmodel(inputs)
outputs = Cropping2D(cropping=((27, 27), (27, 27)) ) (outputs)
outputs = Conv2D(1, (1, 1), activation='sigmoid') (outputs)
model = Model(inputs=inputs, outputs=outputs)
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()
This produces:
Total params: 24,453,180
Trainable params: 24,437,814
Non-trainable params: 15,366
Then I freeze u-resnet34's layers and print resnetmodel.summary(), which outputs:
Total params: 24,453,178
Trainable params: 0
Non-trainable params: 24,453,178
Finally:
model.summary()
which outputs:
Total params: 48,890,992
Trainable params: 24,437,814
Non-trainable params: 24,453,178
Let's take ResNet50 as an example:
from keras.models import Model
from keras.layers import Input, Dense
from keras.applications.resnet50 import ResNet50
res = ResNet50()
res.summary()
#....
#Total params: 25,636,712
#Trainable params: 25,583,592
#Non-trainable params: 53,120
The ResNet model has many parameters to train.
Let's use it as a layer inside another model:
x = Input((224,224,3))
y = res(x)
y = Dense(10)(y)
model = Model(x, y)
model.summary()
#.....
#Total params: 25,646,722
#Trainable params: 25,593,602
#Non-trainable params: 53,120
Now freeze the ResNet's layers:
for layer in res.layers:
    layer.trainable = False
res.summary()
# ....
#Total params: 25,636,712
#Trainable params: 0
#Non-trainable params: 25,636,712
This is also reflected in the model that uses the ResNet:
model.summary()
#.....
#Total params: 25,646,722
#Trainable params: 10,010
#Non-trainable params: 25,636,712
So freezing the inner model's layers should be reflected in the outer model.
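The propagation can be illustrated without Keras at all. The sketch below is a plain-Python mock, not the Keras API: it only shows the idea that a model is itself a layer, so counting trainable parameters recurses into nested models, and freezing layers inside the inner model changes the outer model's count.

```python
# Plain-Python mock (NOT the Keras API) of how trainable-parameter
# counts recurse through nested models.

class MockLayer:
    def __init__(self, n_params):
        self.n_params = n_params
        self.trainable = True

    def count_trainable(self):
        return self.n_params if self.trainable else 0

class MockModel(MockLayer):
    """A model is itself a layer, so it can be nested in another model."""
    def __init__(self, layers):
        self.layers = layers
        self.trainable = True

    def count_trainable(self):
        if not self.trainable:
            return 0
        return sum(layer.count_trainable() for layer in self.layers)

inner = MockModel([MockLayer(100), MockLayer(50)])   # plays the role of u-resnet34
outer = MockModel([inner, MockLayer(10)])            # plays the role of model1

print(outer.count_trainable())        # 160: everything trainable

for layer in inner.layers:            # freeze the inner model's layers
    layer.trainable = False

print(outer.count_trainable())        # 10: the freeze propagates upward
```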
EDIT:
If you compiled the model before freezing the layers, you need to compile it again.

Comments:
- First you mention that when you freeze u-resnet34's layers, it is reflected in the model summary. Then you mention that it isn't. Which one is correct, or am I missing something?
- There are two summaries: one for the u-resnet34 model, and another for model1, which contains u-resnet34.
- In both cases, could you add the code you use for freezing the layers?
- The first and last summaries both belong to model, yet their total parameter counts differ. How is that possible? I was wondering about that myself, but I think you can reproduce the issue there as well.
- Thanks for your answer. Your code works here, but mine still doesn't. I posted my code in the question so you can understand it better. The only significant difference between my code and yours is the compile call; after compiling the model, it stops working: model.compile(optimizer='adam', loss='binary_crossentropy')
- Thanks to your code I got it: you need to compile the model again to update the summary. I think your answer is correct; please add that to it.