
Python: Change MobileNet dropout after loading


I'm working on a transfer-learning problem. When I create a new model from MobileNet, I set a dropout rate:

from tensorflow.keras.applications.mobilenet import MobileNet
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense

base_model = MobileNet(weights='imagenet', include_top=False, input_shape=(200,200,3), dropout=.15)
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(10, activation='softmax')(x)
I use `model_checkpoint_callback` to save the model during training. As training runs, I see where overfitting starts and adjust the number of frozen layers and the learning rate. Can I also adjust the dropout rate when I load the saved model again?

I saw this approach, but there are no actual Dropout layers in MobileNet, so

for layer in model.layers:
    if hasattr(layer, 'rate'):
        print(layer.name)
        layer.rate = 0.5

does nothing.
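A quick way to confirm the problem (a minimal sketch, assuming TF 2.x): build MobileNet without pretrained weights, so nothing is downloaded, and count its Dropout layers. With `include_top=False` the `dropout` argument has no layer to attach to, because MobileNet only places a Dropout layer in the classification top.

```python
import tensorflow as tf
from tensorflow.keras.applications.mobilenet import MobileNet
from tensorflow.keras.layers import Dropout

# weights=None avoids downloading; the graph structure is the same
base_model = MobileNet(weights=None, include_top=False,
                       input_shape=(32, 32, 3), dropout=0.15)

# count the Dropout layers in the truncated network
n_dropout = sum(isinstance(layer, Dropout) for layer in base_model.layers)
print(n_dropout)  # 0 -- there is nothing for layer.rate to change
```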

You used to have to clone the model for a new dropout rate to take effect. I haven't tried it recently.

# This code allows you to change the dropout
# Load model from .json
model.load_weights(filenameToModelWeights)  # load weights
model.layers[-2].rate = 0.04  # layers[-2] is my dropout layer; rate is the dropout attribute
model = keras.models.clone_model(model)  # if I do not clone, the new rate is never used; weights are re-initialized now
model.load_weights(filenameToModelWeights)  # load weights again
model.predict(x)
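Here is a small self-contained sketch of the clone trick above, using a toy model with hypothetical layer names so it runs without any saved weights. `clone_model` rebuilds each layer from its config, which is why the edited rate takes effect in the clone.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(4,)),
    layers.Dense(8, activation='relu'),
    layers.Dropout(0.15, name='drop'),
    layers.Dense(2, activation='softmax'),
])
saved_weights = model.get_weights()  # stand-in for weights loaded from disk

model.get_layer('drop').rate = 0.5   # edit the attribute...
model = models.clone_model(model)    # ...then clone so the new rate is used
model.set_weights(saved_weights)     # cloning re-initialized weights; restore them

print(model.get_layer('drop').rate)  # 0.5
```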
Credit for this goes to the original answer.

If the model has no dropout layers to begin with, as with Keras's pretrained MobileNet, you have to add them yourself. Here is one way to do it.

To add a single layer:

def insert_single_layer_in_keras(model, layer_name, new_layer):
    layers = [l for l in model.layers]

    x = layers[0].output
    for i in range(1, len(layers)):
        x = layers[i](x)
        # add layer afterward
        if layers[i].name == layer_name:
            x = new_layer(x)

    new_model = Model(inputs=layers[0].input, outputs=x)
    return new_model

To add layers systematically:

def insert_layers_in_model(model, layer_common_name, new_layer):
    import re

    layers = [l for l in model.layers]
    x = layers[0].output
    layer_config = new_layer.get_config()
    base_name = layer_config['name']
    layer_class = type(new_layer)  # was type(dropout_layer), which referenced an undefined global
    for i in range(1, len(layers)):
        x = layers[i](x)
        match = re.match(".+" + layer_common_name + "+", layers[i].name)
        # add layer afterward
        if match:
            layer_config['name'] = base_name + "_" + str(i)  # no duplicate names, could be done different
            layer_copy = layer_class.from_config(layer_config)
            x = layer_copy(x)

    new_model = Model(inputs=layers[0].input, outputs=x)
    return new_model
Run it like this:

import tensorflow as tf
from tensorflow.keras.applications.mobilenet import MobileNet
from tensorflow.keras.layers import Dropout
from tensorflow.keras.models import Model

base_model = MobileNet(weights='imagenet', include_top=False, input_shape=(192, 192, 3), dropout=.15)

dropout_layer = Dropout(0.5)
# add single layer after last dropout
mobile_net_with_dropout = insert_single_layer_in_keras(base_model, "conv_pw_13_bn", dropout_layer)
# systematically add layers after any batchnorm layer
mobile_net_with_multi_dropout = insert_layers_in_model(base_model, "bn", dropout_layer)
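To check the systematic insertion without downloading ImageNet weights, here is a minimal self-contained version that applies the same helper (restated so the snippet runs on its own) to a tiny functional model. The layer names (`block1_bn` etc.) are made up for the example.

```python
import re
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, BatchNormalization, Dropout
from tensorflow.keras.models import Model

def insert_layers_in_model(model, layer_common_name, new_layer):
    layers = [l for l in model.layers]
    x = layers[0].output
    layer_config = new_layer.get_config()
    base_name = layer_config['name']
    layer_class = type(new_layer)
    for i in range(1, len(layers)):
        x = layers[i](x)
        # insert a fresh copy of the new layer after each matching layer
        if re.match(".+" + layer_common_name + "+", layers[i].name):
            layer_config['name'] = base_name + "_" + str(i)  # avoid duplicate names
            x = layer_class.from_config(layer_config)(x)
    return Model(inputs=layers[0].input, outputs=x)

# toy linear model with two batchnorm layers
inp = Input(shape=(8, 8, 3))
x = Conv2D(4, 3, padding='same', name='block1_conv')(inp)
x = BatchNormalization(name='block1_bn')(x)
x = Conv2D(4, 3, padding='same', name='block2_conv')(x)
x = BatchNormalization(name='block2_bn')(x)
toy = Model(inp, x)

with_dropout = insert_layers_in_model(toy, "bn", Dropout(0.25))
n_dropout = sum(isinstance(l, Dropout) for l in with_dropout.layers)
print(n_dropout)  # one Dropout after each of the two batchnorm layers
```

Note that this rebuild-the-graph approach only works for linear (non-branching) architectures, which is fine for MobileNet's backbone.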

As an aside, you should definitely experiment, but for a small network like MobileNet you are unlikely to want extra regularization on top of batch norm.

As I mentioned, `if hasattr(layer, 'rate')` finds nothing, so there are no layers with a `rate` attribute.

@theastronomist If there are no dropout layers to begin with, you have to add them; see the edit for one way to do it.