Python: how can I convert a neural network built with Model() to Sequential()?
I built a fully connected neural network, a denoising autoencoder, with TensorFlow 2.0. The structure is 128-64-32-64-128 (neurons per layer), five layers in total, mapping 1000 inputs to 1000 outputs. I built it with the Model() (functional) API; the corresponding model.summary() is:
Model: "model_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) (None, 1000) 0
_________________________________________________________________
dense_1 (Dense) (None, 128) 128128
_________________________________________________________________
dense_2 (Dense) (None, 64) 8256
_________________________________________________________________
dense_3 (Dense) (None, 32) 2080
_________________________________________________________________
dense_4 (Dense) (None, 64) 2112
_________________________________________________________________
dense_5 (Dense) (None, 128) 8320
_________________________________________________________________
dense_6 (Dense) (None, 1000) 129000
=================================================================
Total params: 277,896
Trainable params: 277,896
Non-trainable params: 0
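The original Model()-based code is not shown in the question, but a plausible functional-API reconstruction can be inferred from the summary above. This is a sketch, not the asker's actual code: the activations (relu for hidden layers, sigmoid for the output) are assumptions carried over from the Sequential rewrite later in the question.

```python
import tensorflow as tf

# Hypothetical reconstruction of the Model()-based network,
# inferred from the model.summary() output above.
inputs = tf.keras.layers.Input(shape=(1000,))
x = tf.keras.layers.Dense(128, activation=tf.nn.relu)(inputs)
x = tf.keras.layers.Dense(64, activation=tf.nn.relu)(x)
x = tf.keras.layers.Dense(32, activation=tf.nn.relu)(x)   # bottleneck / code layer
x = tf.keras.layers.Dense(64, activation=tf.nn.relu)(x)
x = tf.keras.layers.Dense(128, activation=tf.nn.relu)(x)
outputs = tf.keras.layers.Dense(1000, activation=tf.nn.sigmoid)(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
```

This reproduces the 277,896 total parameters shown in the summary.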
I want to rewrite this network with Sequential(), because a Python library I need to use requires a Sequential model. My rewritten code:

input_size_1000 = 1000
hidden_size_128 = 128
hidden_size_64 = 64
code_size_32 = 32
output_size_1000 = 1000

# using Sequential()
model = tf.keras.models.Sequential()
# model.add(tf.keras.layers.Input(input_size_1000,))
model.add(tf.keras.layers.Dense(hidden_size_128, input_dim=1000, activation=tf.nn.relu))
model.add(tf.keras.layers.Dense(hidden_size_64, activation=tf.nn.relu))
model.add(tf.keras.layers.Dense(code_size_32, activation=tf.nn.relu))
model.add(tf.keras.layers.Dense(hidden_size_64, activation=tf.nn.relu))
model.add(tf.keras.layers.Dense(hidden_size_128, activation=tf.nn.relu))
model.add(tf.keras.layers.Dense(output_size_1000, activation=tf.nn.sigmoid))
model.compile(optimizer='adam', loss='mse', metrics=['mse'])
The corresponding model.summary() is:

Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 128) 128128
_________________________________________________________________
dense_1 (Dense) (None, 64) 8256
_________________________________________________________________
dense_2 (Dense) (None, 32) 2080
_________________________________________________________________
dense_3 (Dense) (None, 64) 2112
_________________________________________________________________
dense_4 (Dense) (None, 128) 8320
_________________________________________________________________
dense_5 (Dense) (None, 1000) 129000
=================================================================
Total params: 277,896
Trainable params: 277,896
Non-trainable params: 0
_________________________________________________________________
None
I would like to know why there is no input layer in the rewritten model's summary (I added input_dim=1000). I first added model.add(tf.keras.layers.Input(input_size_1000,)), but it did not work. I then commented out that line and added input_dim=1000 instead, and there is still no input layer. By the way, is there anything wrong with my rewrite? The library I am going to use has no Input layer, but it does provide a Flatten layer. Does removing the input layer have any effect? What if I use a Flatten layer instead of an Input layer (my data is a one-dimensional list)? (The library's sample flattens MNIST images before the Dense layers.) Thanks.

You must specify the input dimension of the first Dense layer; in your Sequential model that is the input_dim argument. Use print(model.inputs) to see the input tensor. If you use a Flatten layer instead, you must specify the Flatten layer's input_shape so that the input dimension of the next (Dense) layer is known, e.g. tf.keras.layers.Flatten(input_shape=[24, 24]).
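As a quick check of the behaviour described above, here is a minimal sketch: a Sequential model's summary() simply does not print an InputLayer row, but the input tensor still exists and can be inspected through model.inputs and model.input_shape.

```python
import tensorflow as tf

# A minimal Sequential model; the input is declared via input_dim
# on the first Dense layer, so no separate InputLayer row appears
# in model.summary().
model = tf.keras.models.Sequential([
    tf.keras.layers.Dense(128, input_dim=1000, activation=tf.nn.relu),
    tf.keras.layers.Dense(1000, activation=tf.nn.sigmoid),
])

model.summary()            # lists only the two Dense layers
print(model.inputs)        # the input tensor, shape (None, 1000)
print(model.input_shape)   # (None, 1000)
```

So the missing InputLayer row is purely cosmetic: the Sequential model is equivalent to the functional one, and the network still receives 1000-dimensional inputs.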
Comment: It looks like the Input layer is commented out — is that the case in your code?

Reply: Sorry, I didn't explain it clearly. The commented-out line makes no difference to the code's model.summary(), so I tried adding the Input layer back, but it still had no effect. I have also added this as a footnote to the question.
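Regarding the Flatten-instead-of-Input question: for one-dimensional data a Flatten layer with an explicit input_shape works as a drop-in first layer. This is a sketch under the assumption that the data really is a flat vector of length 1000; in that case Flatten is a no-op reshape that only serves to fix the input dimension for the next Dense layer.

```python
import tensorflow as tf

# Using Flatten as the first layer instead of an Input layer.
# For 1-D data of length 1000, Flatten(input_shape=(1000,)) changes
# nothing about the values; it just declares the input dimension
# so the following Dense layer can be built.
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(1000,)),
    tf.keras.layers.Dense(128, activation=tf.nn.relu),
])
```

The first Dense layer ends up with the same 1000 * 128 + 128 = 128,128 parameters as in the original network, so swapping Input for Flatten has no effect on the model itself.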