How do I code a residual block into a basic CNN using tensorflow.keras?
I built a basic CNN model with the tensorflow.keras library:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Activation, MaxPooling2D, Flatten, Dense

model = Sequential()
# First Layer
model.add(Conv2D(64, (3,3), input_shape = (IMG_SIZE,IMG_SIZE,1)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size = (3,3)))
# Second Layer
model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size = (3,3)))
# Third Layer
model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size = (3,3)))
# Fourth Layer
model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size = (3,3)))
# Fifth Layer
model.add(Conv2D(64, (3,3)))
model.add(Activation("relu"))
model.add(MaxPooling2D(pool_size = (3,3)))
model.add(Flatten())
# Sixth Layer
model.add(Dense(64))
model.add(Activation("relu"))
# Seventh Layer
model.add(Dense(1))
model.add(Activation('sigmoid'))
Now I want to add a skip connection between the second and fourth layers to implement a residual block with the tensorflow.keras library.
How should I modify the code to implement such a residual block? The residual block of the architecture looks like this:
You need to use the functional API, because the Sequential model is too limited for this. A residual block can be implemented in Keras as follows:
from tensorflow.keras import layers
def resblock(x, kernelsize, filters):
    # Residual branch: two convolutions with 'same' padding so the
    # spatial dimensions match the input for the Add() below
    fx = layers.Conv2D(filters, kernelsize, activation='relu', padding='same')(x)
    fx = layers.BatchNormalization()(fx)
    fx = layers.Conv2D(filters, kernelsize, padding='same')(fx)
    # Skip connection: add the input back onto the residual branch
    out = layers.Add()([x, fx])
    out = layers.ReLU()(out)
    out = layers.BatchNormalization()(out)
    return out
The BatchNormalization() layers are not strictly required, but they are a solid option for improving accuracy. Also note that x must have the same number of channels as the filters argument, otherwise the Add() layer will fail.
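To show how this plugs into the original architecture, here is a minimal sketch that rebuilds the first two convolution/pooling stages with the functional API and inserts the residual block after them. The value IMG_SIZE = 96 is an assumption for illustration (the original value is not given), and the shallower tail is a simplification of the original five-stage stack:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def resblock(x, kernelsize, filters):
    # Residual branch: 'same' padding keeps spatial dims equal to the input
    fx = layers.Conv2D(filters, kernelsize, activation='relu', padding='same')(x)
    fx = layers.BatchNormalization()(fx)
    fx = layers.Conv2D(filters, kernelsize, padding='same')(fx)
    out = layers.Add()([x, fx])   # skip connection
    out = layers.ReLU()(out)
    out = layers.BatchNormalization()(out)
    return out

IMG_SIZE = 96  # assumed input size for this sketch

inputs = layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1))
# First layer
x = layers.Conv2D(64, (3, 3), activation='relu')(inputs)
x = layers.MaxPooling2D(pool_size=(3, 3))(x)
# Second layer
x = layers.Conv2D(64, (3, 3), activation='relu')(x)
x = layers.MaxPooling2D(pool_size=(3, 3))(x)
# Residual block: x already has 64 channels, matching filters=64,
# so the Add() inside resblock is shape-compatible
x = resblock(x, (3, 3), 64)
x = layers.Flatten()(x)
x = layers.Dense(64, activation='relu')(x)
outputs = layers.Dense(1, activation='sigmoid')(x)

model = Model(inputs, outputs)
```

Because the residual block preserves both the spatial dimensions and the channel count of its input, you can drop it between any two stages of the network where the channel count equals filters.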