Python: How do I migrate a dense layer from TensorFlow 1 to TensorFlow 2?
How can I migrate this layer to TF2?
observations = tf.placeholder(tf.float32, [None, OBSERVATIONS_SIZE])
h = tf.layers.dense(
    observations,
    units=hidden_layer_size,
    activation=tf.nn.relu,
    kernel_initializer=tf.contrib.layers.xavier_initializer()
)
I found that placeholders are now replaced by Input, and I used TF2's Dense layer.
I tried:
observations = tf.keras.Input(
    shape=[None, OBSERVATIONS_SIZE],
    dtype=tf.float32
)
h = tf.keras.layers.Dense(
    observations,
    units=hidden_layer_size,
    activation='relu',
    kernel_initializer='glorot_uniform'
)
If I run this, I get the following error:
TypeError: __init__() got multiple values for argument 'units'
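The error comes from the constructor signature: `Dense`'s first positional parameter is `units`, so passing the input tensor positionally and then `units=` again binds `units` twice. A quick way to check this (a minimal sketch, assuming TensorFlow is installed):

```python
import inspect
import tensorflow as tf

# Dense's constructor takes `units` first, not the input tensor, so
# Dense(observations, units=...) assigns `observations` to `units` and
# the keyword `units=` then raises "multiple values for argument 'units'".
sig = inspect.signature(tf.keras.layers.Dense.__init__)
print(list(sig.parameters)[:3])  # ['self', 'units', 'activation']
```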
In this case, how should I use the placeholder/Input?

Answer: Keras layers are not used like tf.layers. They are callables: instead of passing the tensor as the first argument, you construct the layer first and then call it on the tensor, so it should be:
observations = tf.keras.Input(
    shape=(OBSERVATIONS_SIZE,),  # batch dimension is implicit, unlike tf.placeholder
    dtype=tf.float32
)
h = tf.keras.layers.Dense(
    units=hidden_layer_size,
    activation='relu',
    kernel_initializer='glorot_uniform'  # Keras equivalent of xavier_initializer
)(observations)
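Putting it together, a minimal runnable sketch of the migrated code; the concrete values for OBSERVATIONS_SIZE and hidden_layer_size below are illustrative, not from the original question:

```python
import numpy as np
import tensorflow as tf

OBSERVATIONS_SIZE = 4    # example value; substitute your own
hidden_layer_size = 16   # example value; substitute your own

# Input replaces tf.placeholder; the batch dimension is implicit.
observations = tf.keras.Input(shape=(OBSERVATIONS_SIZE,), dtype=tf.float32)
h = tf.keras.layers.Dense(
    units=hidden_layer_size,
    activation='relu',
    kernel_initializer='glorot_uniform',  # Keras name for Xavier/Glorot init
)(observations)
model = tf.keras.Model(inputs=observations, outputs=h)

# Instead of feeding a placeholder through session.run, call the model directly:
batch = np.random.rand(8, OBSERVATIONS_SIZE).astype(np.float32)
out = model(batch)
print(out.shape)  # (8, 16)
```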