Python: Is my learning rate actually changing?

I'm trying to adjust the learning rate of a gradient descent algorithm, and I want to confirm that my changes to learning_rate actually have an effect on my theano training function.
Example code:
#set up the updates
for param in params:
    updates.append((param, param - learning_rate*T.grad(cost, param)))

#set up the training function
train = theano.function(inputs=[index], outputs=[cost], updates=updates,
                        givens={x: self.X[index:index+mini_batch_size, :]})

#run through the minibatches
for epoch in range(n_epochs):
    for row in range(0, self.m, mini_batch_size):
        cost = train(row)
    #occasionally adjust the learning rate
    learning_rate = learning_rate / 2.0
Will this do what I want, and how can I confirm it?
Edit:
Based on this little test, it doesn't seem to be working:
import theano as th
import theano.tensor

x = th.tensor.dscalar()
rate = 5.0
f = th.function(inputs=[x], outputs=2*x*rate)
print(f(10))
>> 100.0
rate = 0.0
print(f(10))
>> 100.0
What's the correct way to do this?

The problem is that your code compiles the learning rate into the computation graph as a constant. If you want to change the rate, you need to use a Theano variable to represent it in the computation graph, and then provide a value when the function is executed. This can be done in two ways:
import theano as th
import theano.tensor

# Original version (changing rate doesn't affect theano function output)
x = th.tensor.dscalar()
rate = 5.0
f = th.function(inputs=[x], outputs=2*x*rate)
print(f(10))
rate = 0.0
print(f(10))

# New version using an input value
x = th.tensor.dscalar()
rate = th.tensor.scalar()
f = th.function(inputs=[x, rate], outputs=2*x*rate)
print(f(10, 5.0))
print(f(10, 0.0))

# New version using a shared variable with manual update
x = th.tensor.dscalar()
rate = th.shared(5.0)
f = th.function(inputs=[x], outputs=2*x*rate)
print(f(10))
rate.set_value(0.0)
print(f(10))

# New version using a shared variable with automatic update
x = th.tensor.dscalar()
rate = th.shared(5.0)
updates = [(rate, rate / 2.0)]
f = th.function(inputs=[x], outputs=2*x*rate, updates=updates)
print(f(10))
print(f(10))
print(f(10))
print(f(10))
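If I'm reading Theano's semantics right, updates are applied after the outputs have been computed, so the four calls in the automatic-update variant should print something like:

100.0
50.0
25.0
12.5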
Inspired by @Daniel Renshaw's answer, you could try the following:
learning_rate = theano.shared(0.01)

for param in params:
    updates.append((param, param - learning_rate*T.grad(cost, param)))

#set up the training function
train = theano.function(inputs=[index], outputs=[cost], updates=updates,
                        givens={x: self.X[index:index+mini_batch_size, :]})

#run through the minibatches
for epoch in range(n_epochs):
    for row in range(0, self.m, mini_batch_size):
        cost = train(row)
    #occasionally adjust the learning rate
    learning_rate.set_value(learning_rate.get_value() / 2)
Basically, you use a shared variable and manually update it on each iteration.
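If you'd rather not call set_value by hand, the decay itself can be compiled into the graph, following the automatic-update variant above. Here's a minimal self-contained sketch (the toy expression and the 0.5 decay factor are my own choices, not from the original post):

import theano as th
import theano.tensor

x = th.tensor.dscalar()
learning_rate = th.shared(0.01)

# the decay is expressed as a Theano update, applied after each call
updates = [(learning_rate, learning_rate * 0.5)]
step = th.function(inputs=[x], outputs=x * learning_rate, updates=updates)

print(step(10))                    # uses rate 0.01, then halves it
print(step(10))                    # uses rate 0.005
print(learning_rate.get_value())   # 0.0025

Unlike the per-epoch set_value in the loop above, this decays the rate on every call, so in a real training loop you'd fold the update into whichever function you call once per epoch.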