
Batch Normalization in Transfer Learning (Python)

Tags: python, tensorflow, keras, transfer-learning, batch-normalization

I am currently doing transfer learning with the MobileNetV2 architecture. I have added a few dense layers on top, before the classifier. Should I add BatchNormalization between these layers?

# Imports assumed from tensorflow.keras (they were not shown in the original snippet)
from tensorflow.keras.applications import MobileNetV2
from tensorflow.keras.layers import GlobalAveragePooling2D, Dense, BatchNormalization, Dropout

# MobileNetV2 backbone pre-trained on ImageNet, without its classifier head
base_model = MobileNetV2(weights='imagenet', include_top=False, input_shape=(200, 200, 3))
x = base_model.output
x = GlobalAveragePooling2D(name="Class_pool")(x)
x = Dense(512, activation='relu')(x)
x = BatchNormalization()(x)
x = Dropout(.4)(x)
x = Dense(1024, activation='relu')(x)
x = BatchNormalization()(x)
x = Dropout(.4)(x)
x = Dense(512, activation='relu')(x)
x = BatchNormalization()(x)
x = Dropout(.4)(x)
x = Dense(512, activation='relu')(x)
x = BatchNormalization()(x)
x = Dense(20, activation='softmax')(x)
I have trained this network before without these batch-normalization layers and had a hard time getting good accuracy. After trying many combinations of learning rates and frozen layers, I have only been partly successful, so I am hoping this will help. Can too many BatchNormalization layers be harmful to the network?
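
The snippet in the question stops at the output layer, so for completeness here is a minimal, hypothetical sketch of how the head could be attached to the base, the pre-trained weights frozen, and the model compiled; the optimizer, learning rate, and loss below are illustrative assumptions, not part of the original post.

from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

# Freeze the pre-trained MobileNetV2 weights so only the new dense head is trained at first
for layer in base_model.layers:
    layer.trainable = False

# 'x' is the softmax output built in the snippet above
model = Model(inputs=base_model.input, outputs=x)
model.compile(optimizer=Adam(learning_rate=1e-3),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
# model.fit(train_data, epochs=..., validation_data=val_data)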

Batch normalization helps with covariate shift, and that is a good thing for your network when you are training on new data in batches. There is nothing wrong with the BatchNormalization layers; just place each one after a layer's activation.
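
As an illustration of that placement (this helper is not from the original answer; its name and layer widths are arbitrary), each dense block would follow the pattern Dense with activation, then BatchNormalization, then Dropout:

from tensorflow.keras.layers import Dense, BatchNormalization, Dropout

# Hypothetical helper: activation first (inside Dense), then normalization, then dropout
def dense_block(inputs, units, dropout_rate=0.4):
    h = Dense(units, activation='relu')(inputs)
    h = BatchNormalization()(h)
    return Dropout(dropout_rate)(h)

# e.g. x = dense_block(x, 512)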