
Python: Setting up the Adam optimizer in TensorFlow


I was reading a paper in which the author uses the following settings: $\text{learning rate} = 10^{-3}$, $\text{momentum} = 0.9$, $\beta = 0.01$, $\lambda = 1$, $\epsilon = 10^{-4}$.

However, TensorFlow has the following settings for Adam:

tf.keras.optimizers.Adam(learning_rate=0.001, beta_1=0.9, beta_2=0.999, epsilon=1e-07, amsgrad=False,
                          name='Adam', **kwargs)

My question is: how do I set $\text{momentum}$ and $\lambda$?

You can check it in the documentation:

import tensorflow as tf

opt = tf.keras.optimizers.Adam(learning_rate=0.1)
var1 = tf.Variable(10.0)
loss = lambda: (var1 ** 2) / 2.0     # d(loss)/d(var1) == var1
step_count = opt.minimize(loss, [var1]).numpy()
# The first step is `-learning_rate*sign(grad)`
var1.numpy()                         # ~9.9 after one Adam step
beta_1 and beta_2 are the exponential decay rates for the moment estimates; check the following:


The first moment depends on beta_1: $m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t$
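
So the paper's momentum = 0.9 most likely maps to beta_1 = 0.9, which is already TensorFlow's default. A minimal sketch of the first-moment update above, assuming the moment starts at zero as it does in Adam, with made-up gradient values:

import numpy as np

beta_1 = 0.9                             # the paper's momentum = 0.9, assumed to map to beta_1
m = 0.0                                  # first moment, initialized to zero as in Adam
for g in np.array([1.0, 1.0, 1.0]):      # a few made-up gradient values
    m = beta_1 * m + (1 - beta_1) * g    # exponential moving average of the gradients
    print(m)                             # 0.1, 0.19, 0.271: beta_1 controls how fast m tracks g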
What is lambda (for Adam)? Well, I'm not sure; I was checking a previous answer about it on Stack Overflow: . Does that help you find the answer?
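
If the paper's $\lambda = 1$ is an L2 / weight-decay coefficient, which is a common convention but only an assumption here, Keras expresses it through a kernel regularizer on the layers rather than through the Adam constructor. A sketch under that assumption, with the paper's remaining values plugged into Adam:

import tensorflow as tf

# Assumed mapping of the paper's hyperparameters (not confirmed by the paper):
# momentum -> beta_1, lambda -> L2 penalty on the weights.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=1e-3,   # paper: learning rate = 10^-3
    beta_1=0.9,           # paper: momentum = 0.9
    epsilon=1e-4,         # paper: epsilon = 10^-4
)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        10,
        kernel_regularizer=tf.keras.regularizers.l2(1.0),  # paper: lambda = 1, assumed L2
    ),
])
model.compile(optimizer=optimizer, loss="mse")

Depending on your TensorFlow version, a decoupled-weight-decay variant (AdamW) may also be available, for example in TensorFlow Addons.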