PyTorch fails to converge when approximating a square function with a linear model
I am trying to learn some PyTorch and referred to this discussion, where the author provides a minimal piece of code showing how to use PyTorch to recover an unknown linear function corrupted by random noise. That code runs fine for me. However, when I change the function so that the target is t = X^2, the parameters do not seem to converge:
import torch
import torch.nn as nn
import torch.optim as optim
from torch.autograd import Variable
# Let's make some data for a regression on t = x^2.
A = 3.1415926
b = 2.7189351
error = 0.1
N = 100 # number of data points
# Data
X = Variable(torch.randn(N, 1))
# (noisy) Target values that we want to learn.
t = X * X + Variable(torch.randn(N, 1) * error)
# Creating a model, making the optimizer, defining loss
model = nn.Linear(1, 1)
optimizer = optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.MSELoss()
# Run training
niter = 50
for _ in range(niter):
    optimizer.zero_grad()
    predictions = model(X)
    loss = loss_fn(predictions, t)
    loss.backward()
    optimizer.step()
    print("-" * 50)
    print("error = {}".format(loss.item()))
    print("learned A = {}".format(list(model.parameters())[0].item()))
    print("learned b = {}".format(list(model.parameters())[1].item()))
When I execute this code, the new A and b parameters appear to be random and do not converge. I thought this should converge, because you can approximate any function with a slope and an offset. My theory is that I am not using PyTorch correctly.
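For intuition (this is my own aside, not part of the question or answer): the best linear fit to t = x^2 can be computed in closed form with ordinary least squares. For standard-normal X the optimal slope is roughly E[x^3]/E[x^2] = 0 and the optimal intercept is roughly E[x^2] = 1, so even a perfectly trained linear model ends up nearly flat rather than tracking the parabola. A quick NumPy check of this claim:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000
X = rng.standard_normal(N)
t = X ** 2  # target the linear model is asked to fit

# Closed-form ordinary least squares for t ≈ slope * X + intercept.
slope = np.cov(X, t, bias=True)[0, 1] / np.var(X)
intercept = t.mean() - slope * X.mean()

print(f"slope     = {slope:.3f}")      # close to 0
print(f"intercept = {intercept:.3f}")  # close to E[x^2] = 1
```

So the "random" parameters the questioner sees are SGD wandering around a fit whose slope carries no information about the curvature of x^2.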
Can someone identify what is wrong with my

t = X * X + Variable(torch.randn(N, 1) * error)

line of code?

You cannot fit a quadratic polynomial with a linear function. You cannot expect anything better than random here (since you are drawing random samples from the polynomial). What you can do is use two inputs, x and x^2, and fit on both of them:
model = nn.Linear(2, 1) # you have 2 inputs now
X_input = torch.cat((X, X**2), dim=1) # have 2 inputs per entry
# ...
predictions = model(X_input) # 2 inputs -> 1 output
loss = loss_fn(predictions, t)
# ...
# learning t = c*x^2 + a*x + b
print("learned a = {}".format(list(model.parameters())[0][0, 0].item()))
print("learned c = {}".format(list(model.parameters())[0][0, 1].item()))
print("learned b = {}".format(list(model.parameters())[1][0].item()))
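Putting the answer's fragments together, here is a self-contained sketch of the two-feature approach (hyperparameters and variable names such as `a_learned` are my own choices, not from the answer); it should recover c ≈ 1, a ≈ 0, b ≈ 0 for t = x^2:

```python
import torch
import torch.nn as nn
import torch.optim as optim

torch.manual_seed(0)
N = 200
error = 0.1

X = torch.randn(N, 1)
t = X * X + torch.randn(N, 1) * error  # noisy t = x^2

X_input = torch.cat((X, X ** 2), dim=1)  # two features per sample: [x, x^2]
model = nn.Linear(2, 1)
optimizer = optim.SGD(model.parameters(), lr=0.05)
loss_fn = nn.MSELoss()

for _ in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X_input), t)
    loss.backward()
    optimizer.step()

weight, bias = model.weight.detach(), model.bias.detach()
a_learned = weight[0, 0].item()  # coefficient of x, should approach 0
c_learned = weight[0, 1].item()  # coefficient of x^2, should approach 1
b_learned = bias[0].item()       # offset, should approach 0
print(f"a = {a_learned:.3f}, c = {c_learned:.3f}, b = {b_learned:.3f}")
```

The model is still linear in its parameters; adding x^2 as an explicit feature is what lets a `nn.Linear` layer represent the quadratic.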