Where are the hidden layers in Keras?


I'm fairly new to autoencoders. I got this code from Keras. I'd like to know whether my comments in the code are correct:

import keras
from keras import layers

input_img = keras.Input(shape=(784,)) # input
encoded = layers.Dense(128, activation='relu')(input_img) # is this a hidden layer?
encoded = layers.Dense(64, activation='relu')(encoded) # is this a hidden layer?
encoded = layers.Dense(32, activation='relu')(encoded) # is this a hidden layer?

decoded = layers.Dense(64, activation='relu')(encoded) # is this a hidden layer?
decoded = layers.Dense(128, activation='relu')(decoded) # is this a hidden layer?
decoded = layers.Dense(784, activation='sigmoid')(decoded) # output

Could you explain this in a bit more detail, if possible? Thanks.

A hidden layer is any layer that sits between the input layer and the output layer. So all of these are hidden layers in your network:

encoded = layers.Dense(128, activation='relu')(input_img)
encoded = layers.Dense(64, activation='relu')(encoded)
encoded = layers.Dense(32, activation='relu')(encoded)
decoded = layers.Dense(64, activation='relu')(encoded)
decoded = layers.Dense(128, activation='relu')(decoded)

In an autoencoder, one hidden layer is of special interest: the "bottleneck" hidden layer, which forces the network to learn a compressed representation of the original input. In your example the compression goes from 784 dimensions down to 32, and the bottleneck hidden layer is:

encoded = layers.Dense(32, activation='relu')(encoded)


Can there be more than one hidden layer? Yes. I've edited the answer accordingly.
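To make the layer roles concrete, here is a minimal sketch (assuming TensorFlow/Keras is installed) that wires the layers from the question into a `Model`. The variable names `bottleneck` and `encoder` are my own additions for illustration; building a second `Model` that stops at the bottleneck layer lets you read out the 32-dimensional compressed codes directly.

```python
import keras
from keras import layers

input_img = keras.Input(shape=(784,))                    # input layer
x = layers.Dense(128, activation='relu')(input_img)      # hidden layer
x = layers.Dense(64, activation='relu')(x)               # hidden layer
bottleneck = layers.Dense(32, activation='relu')(x)      # hidden layer (bottleneck)
x = layers.Dense(64, activation='relu')(bottleneck)      # hidden layer
x = layers.Dense(128, activation='relu')(x)              # hidden layer
output_img = layers.Dense(784, activation='sigmoid')(x)  # output layer

# The full autoencoder maps 784 inputs back to 784 outputs.
autoencoder = keras.Model(input_img, output_img)

# A separate model ending at the bottleneck exposes the compressed codes.
encoder = keras.Model(input_img, bottleneck)
print(encoder.output_shape)  # (None, 32): 784 inputs compressed to 32
```

Everything between `input_img` and `output_img` counts as a hidden layer; the bottleneck is simply the narrowest of them.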