
Neural network: gradient clipping with stochastic gradient descent


I am training a recurrent neural network and I want to apply gradient clipping. I am using SGD. Can I apply clipping to the sum of the gradients computed over a mini-batch?

Clipping the sum of the gradients is not effective. You should clip each gradient individually.

Here is a quick code snippet for gradient clipping in TensorFlow:

import tensorflow as tf  # TF 1.x graph-mode API

# Assumes `loss` and `learning_rate` are already defined in the graph.
max_grad_norm = 20.0  # threshold on the global norm of all gradients
grads = tf.gradients(loss, tf.trainable_variables())
# Rescale all gradients jointly so their global norm is at most max_grad_norm.
grads, _ = tf.clip_by_global_norm(grads, max_grad_norm)  # gradient clipping
grads_and_vars = list(zip(grads, tf.trainable_variables()))
optimizer = tf.train.AdamOptimizer(learning_rate)
train_op = optimizer.apply_gradients(grads_and_vars)
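
A note on the design choice: tf.clip_by_global_norm scales all gradients by a single common factor, so the update direction is preserved, and the clipping is applied to the aggregated mini-batch gradient before the parameter update. If you instead want to clip each gradient tensor's norm individually, as the answer above suggests, a minimal sketch using tf.clip_by_norm (under the same assumptions: TF 1.x graph mode, with loss, learning_rate, and max_grad_norm defined as above) could look like this:

# Clip the norm of each gradient tensor separately instead of the global norm.
grads = tf.gradients(loss, tf.trainable_variables())
clipped = [tf.clip_by_norm(g, max_grad_norm) if g is not None else None
           for g in grads]
grads_and_vars = list(zip(clipped, tf.trainable_variables()))
train_op = tf.train.AdamOptimizer(learning_rate).apply_gradients(grads_and_vars)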