How do I know the correct learning rate for GradientDescentOptimizer in Tensorflow?

tensorflow, machine-learning, deep-learning, gradient-descent

I am confused about the learning rate of the gradient descent optimizer in Tensorflow.

So, suppose I am trying to predict the next value from this data:

x_data = [5, 10, 15, 20, 25, 30, 35, 40]
y_data = [2, 4, 6, 8, 10, 12, 14, 16]

If I choose a learning rate of 0.01, here is my program:

import tensorflow as tf
tf.set_random_seed(777)

# x_data = [5, 10, 15, 20, 25, 30, 35, 40]
# y_data = [2, 4, 6, 8, 10, 12, 14, 16, 18]
x_data = [5, 10, 15, 20, 25, 30, 35, 40]
y_data = [2, 4, 6, 8, 10, 12, 14, 16]

one = tf.Variable(tf.random_normal([1]))
two = tf.Variable(tf.random_normal([1]))

hypo = x_data * one + two
cost = tf.reduce_mean(tf.square(hypo - y_data))
train = tf.train.GradientDescentOptimizer(0.01).minimize(cost)

ina = tf.global_variables_initializer()
with tf.Session() as tt:
    tt.run(ina)
    for i in range(3000):
        a, b, c, d = tt.run([train, cost, one, two])
        if i % 10 == 0:
            print(c, d)
Then I get this output, which runs off to inf (this is my second confusion: why does it go to infinity?):

[-20.48267746] [-1.6179111]
[-1.06335529e+12] [-3.75422935e+10]
[-5.40660918e+22] [-1.90883086e+21]
[-2.74898110e+33] [-9.70541703e+31]
[nan] [nan]
[nan] [nan]
[nan] [nan]
[nan] [nan]
....   ....
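For what it's worth, the blow-up can be reproduced without Tensorflow at all. The sketch below is my own NumPy illustration (not the question's program): it applies the same gradient-descent update by hand. The gradient of the mean squared error scales with x**2, and x here runs up to 40, so the effective step on the slope is roughly lr * 2 * mean(x**2) ≈ lr * 1275, which is only stable when lr is below about 2/1275 ≈ 0.0016. That is why 0.01 overshoots further on every step until the values overflow to inf and then nan, while 0.001 converges.

```python
# Minimal NumPy sketch (my own illustration, not the question's TF code) of the
# same gradient-descent update on cost = mean((w*x + b - y)**2).
import numpy as np

x = np.array([5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
y = np.array([2, 4, 6, 8, 10, 12, 14, 16], dtype=float)

def run_gd(lr, steps=300):
    """Plain gradient descent starting from w = b = 0."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        err = w * x + b - y
        w -= lr * np.mean(2.0 * err * x)  # d(cost)/dw
        b -= lr * np.mean(2.0 * err)      # d(cost)/db
    return w, b

print(run_gd(0.01))   # overshoots and grows each step: ends in inf/nan
print(run_gd(0.001))  # stable: heads toward w = 0.4, b = 0
```

The data itself satisfies y = 0.4 * x exactly, so the stable run heads toward slope 0.4 and intercept 0, matching the values the 0.001 run converges to.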
But if I choose a learning rate of 0.001, then I get the correct output:

[-0.06046534] [ 0.90016752]
[ 0.43103883] [-0.87918627]
[ 0.43091267] [-0.87557721]
[ 0.4307858] [-0.87198305]
[ 0.43065941] [-0.86840361]
[ 0.43053356] [-0.8648389]
[ 0.43040821] [-0.86128885]
[ 0.43028343] [-0.85775328]
[ 0.43015912] [-0.85423231]
[ 0.43003532] [-0.85072571]
[ 0.429912] [-0.84723359]
[ 0.42978922] [-0.84375578]
[ 0.42966694] [-0.84029222]
[ 0.42954516] [-0.83684289]
[ 0.42942387] [-0.8334077]
[ 0.42930311] [-0.82998663]
[ 0.4291828] [-0.82657957]
[ 0.42906302] [-0.82318658]
[ 0.42894369] [-0.81980747]
[ 0.4288249] [-0.81644231]
[ 0.42870659] [-0.81309086]
[ 0.42858875] [-0.80975318]
[ 0.42847139] [-0.80642921]
[ 0.42835453] [-0.80311882]
[ 0.42823812] [-0.79982209]
[ 0.42812222] [-0.79653889]
[ 0.42800677] [-0.7932691]
[ 0.42789182] [-0.79001278]
[ 0.42777732] [-0.78676981]
[ 0.42766327] [-0.78354019]
[ 0.42754975] [-0.78032386]
[ 0.42743665] [-0.77712065]
[ 0.42732403] [-0.77393067]
[ 0.42721185] [-0.77075368]
[ 0.42710015] [-0.76758981]
[ 0.4269889] [-0.76443887]
[ 0.42687812] [-0.76130092]
[ 0.42676777] [-0.75817585]
[ 0.42665792] [-0.75506359]
[ 0.42654848] [-0.75196409]
[ 0.42643949] [-0.74887735]
[ 0.42633098] [-0.74580324]
[ 0.42622289] [-0.74274176]
[ 0.42611524] [-0.73969287]
[ 0.42600802] [-0.73665649]
[ 0.42590126] [-0.73363262]
[ 0.42579496] [-0.73062116]
[ 0.42568904] [-0.72762191]
[ 0.42558363] [-0.72463512]
[ 0.42547861] [-0.72166055]
[ 0.425374] [-0.7186982]
[ 0.42526984] [-0.71574789]
[ 0.4251661] [-0.71280998]
[ 0.42506284] [-0.70988399]
[ 0.42495993] [-0.70696992]
[ 0.42485747] [-0.70406777]
[ 0.42475539] [-0.70117754]
[ 0.42465383] [-0.69829923]
[ 0.42455259] [-0.69543284]
[ 0.42445183] [-0.69257832]
[ 0.42435145] [-0.68973517]
[ 0.4242515] [-0.68690395]
[ 0.42415196] [-0.68408424]
[ 0.4240528] [-0.6812762]
[ 0.42395407] [-0.67847955]
[ 0.42385572] [-0.67569441]
[ 0.42375779] [-0.6729207]
[ 0.42366028] [-0.67015845]
[ 0.42356315] [-0.66740751]
[ 0.42346644] [-0.66466784]
[ 0.42337012] [-0.66193944]
[ 0.42327416] [-0.65922225]
[ 0.42317864] [-0.65651619]
[ 0.42308348] [-0.65382123]
[ 0.42298874] [-0.65113741]
[ 0.42289436] [-0.6484645]
[ 0.42280039] [-0.64580262]
[ 0.42270681] [-0.6431517]
[ 0.42261356] [-0.64051157]
[ 0.42252076] [-0.63788235]
[ 0.42242831] [-0.63526386]
[ 0.42233622] [-0.6326561]
[ 0.42224455] [-0.63005906]
[ 0.42215326] [-0.62747276]
[ 0.42206231] [-0.62489706]
[ 0.42197174] [-0.62233192]
[ 0.42188156] [-0.61977726]
[ 0.42179173] [-0.61723322]
[ 0.42170227] [-0.61469954]
[ 0.42161322] [-0.6121763]
[ 0.42152449] [-0.60966337]
[ 0.4214361] [-0.60716075]
[ 0.42134812] [-0.60466844]
[ 0.42126048] [-0.60218632]
[ 0.42117321] [-0.5997144]
[ 0.42108631] [-0.59725261]
[ 0.42099974] [-0.59480095]
[ 0.42091355] [-0.5923593]
[ 0.42082772] [-0.58992773]
[ 0.42074221] [-0.58750612]
[ 0.42065707] [-0.58509439]
[ 0.42057225] [-0.58269262]
[ 0.42048782] [-0.58030075]
[ 0.42040369] [-0.57791865]
[ 0.42031991] [-0.57554632]
[ 0.42023653] [-0.57318377]
[ 0.42015347] [-0.57083094]
[ 0.42007077] [-0.5684877]
[ 0.41998836] [-0.56615406]
[ 0.41990632] [-0.56383008]
[ 0.41982457] [-0.56151563]
[ 0.41974318] [-0.55921066]
[ 0.41966218] [-0.55691516]
[ 0.41958147] [-0.55462909]
[ 0.4195011] [-0.55235237]
[ 0.41942102] [-0.55008501]
[ 0.41934133] [-0.54782701]
[ 0.41926193] [-0.54557824]
[ 0.41918284] [-0.54333872]
[ 0.4191041] [-0.54110831]
[ 0.41902569] [-0.53888714]
[ 0.41894755] [-0.5366751]
[ 0.41886982] [-0.53447211]
[ 0.41879237] [-0.53227806]
[ 0.41871524] [-0.53009313]
[ 0.41863838] [-0.52791727]
[ 0.41856188] [-0.52575034]
[ 0.41848567] [-0.52359205]
[ 0.41840979] [-0.52144271]
[ 0.41833425] [-0.51930231]
[ 0.41825897] [-0.51717055]
[ 0.41818401] [-0.51504761]
[ 0.41810936] [-0.51293337]
[ 0.41803503] [-0.51082784]
[ 0.417961] [-0.50873089]
[ 0.41788727] [-0.50664258]
[ 0.41781384] [-0.50456285]
[ 0.4177407] [-0.50249171]
[ 0.4176679] [-0.50042903]
[ 0.41759539] [-0.49837482]
[ 0.41752315] [-0.49632904]
[ 0.41745123] [-0.49429163]
[ 0.41737959] [-0.4922626]
[ 0.41730824] [-0.49024191]
[ 0.41723716] [-0.48822951]
[ 0.41716644] [-0.4862254]
[ 0.41709596] [-0.48422945]
[ 0.41702577] [-0.48224172]
[ 0.41695589] [-0.48026216]
[ 0.4168863] [-0.47829071]
[ 0.41681695] [-0.47632736]
[ 0.41674796] [-0.47437206]
[ 0.41667923] [-0.47242478]
[ 0.41661072] [-0.47048554]
[ 0.41654253] [-0.46855426]
[ 0.41647464] [-0.46663091]
[ 0.41640702] [-0.46471542]
[ 0.41633967] [-0.4628078]
[ 0.41627261] [-0.46090803]
[ 0.41620579] [-0.45901603]
[ 0.41613927] [-0.4571318]
....   ....
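The gap between 0.001 working and 0.01 diverging comes down to the scale of the inputs. One common remedy (my own suggestion, not something stated in the question) is to normalize x before training; with smaller inputs the gradients shrink, and the larger learning rate becomes stable again. A minimal NumPy sketch, assuming simple max-scaling:

```python
# Hedged follow-up sketch (my suggestion, not from the question): rescaling the
# inputs shrinks the gradients, so a learning rate of 0.01 becomes stable.
import numpy as np

x = np.array([5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
y = np.array([2, 4, 6, 8, 10, 12, 14, 16], dtype=float)
x_scaled = x / x.max()  # now in [0.125, 1.0]; mean(x**2) drops from ~637 to ~0.4

w, b = 0.0, 0.0
for _ in range(3000):
    err = w * x_scaled + b - y
    w -= 0.01 * np.mean(2.0 * err * x_scaled)
    b -= 0.01 * np.mean(2.0 * err)

# Undo the scaling to read off the slope on the original axis (close to 0.4):
print(w / x.max(), b)
```

The same trick is why preprocessing steps like min-max scaling or standardization make learning-rate choices far less fragile in practice.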