How do I get gradient descent (GD) to converge in my MATLAB code? It clearly shows a matrix dimension error

There is a dimension error in my gradient-descent code. The code should converge as the number of iterations increases, with Y_prediction getting closer to Y. Theta_0 and Theta_1 should be updated on each iteration, but MATLAB reports a dimension error on the Y_prediction line.
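For reference, the updates the loop below is meant to implement, for the model $y = \theta_1 x + \theta_0$ with mean-squared-error cost, are the standard gradient-descent steps:

$$
\theta_0 \leftarrow \theta_0 - \alpha \,\frac{1}{n}\sum_{j=1}^{n}\bigl(\hat{y}_j - y_j\bigr),
\qquad
\theta_1 \leftarrow \theta_1 - \alpha \,\frac{1}{n}\sum_{j=1}^{n} x_j\bigl(\hat{y}_j - y_j\bigr),
\quad \text{where } \hat{y}_j = \theta_1 x_j + \theta_0 .
$$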

  Theta_0 = 0
  Theta_1 = 0
  learning_rate = 0.001

  X = [2000 2001 2002 2003 2004 2005 2006 2007 2008 2009 2010 2011 2012 2013] % Year
  Y = [2.00 2.500 2.900 3.147 4.515 4.903 5.365 5.704 6.853 7.971 8.561 10.00 11.280 12.900] % Price

  n = length(X)

  for i = 1:100  

  Y_prediction = Theta_1.*X + Theta_0                                  % Y = mx + c

  Derivative_Theta_0 = (1/n)*sum(Y_prediction - Y)
  Derivative_Theta_1 = (1/n)*sum(X.*(Y_prediction - Y))

  Theta_0(i+1) = Theta_0(i) - learning_rate*Derivative_Theta_0
  Theta_1(i+1) = Theta_1(i) - learning_rate*Derivative_Theta_1

  end 

You probably want

Y_prediction = Theta_1(i).*X + Theta_0(i)

Even then, it will not converge: the raw X values are around 2000, so the gradient for Theta_1 is enormous and, at this learning rate, the updates blow up unless X is rescaled. The data set looks linear in its relationship, so plain linear regression should clearly be able to handle it. One question is why your code does not run at all; the other is why the math does not converge. I have answered the first one below.
Theta_0 = 0
Theta_1 = 0
learning_rate = 0.001

X = [2000 2001 2002 2003 2004 2005 2006 2007 2008 2009 2010 2011 2012 2013] % Year
Y = [2.00 2.500 2.900 3.147 4.515 4.903 5.365 5.704 6.853 7.971 8.561 10.00 11.280 12.900] % Price
X = X - 2000   % shift the years so the feature values are small; with the raw years the Theta_1 gradient is huge and the updates diverge
n = length(X)

for i = 1:100000  

  Y_prediction = Theta_1*X + Theta_0;

  Derivative_Theta_0 = (1/n)*sum(Y_prediction - Y);
  Derivative_Theta_1 = (1/n)*sum(X.*(Y_prediction - Y));

  Theta_0 = Theta_0 - learning_rate*Derivative_Theta_0;
  Theta_1 = Theta_1 - learning_rate*Derivative_Theta_1;

end 
% Linear Regression
Y_prediction = Theta_1*X + Theta_0;
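
As a quick sanity check (not part of the original answer), the converged Theta_0 and Theta_1 can be compared with MATLAB's built-in least-squares fit on the same shifted X; polyfit is used here only to confirm that gradient descent has landed near the closed-form solution:

% Sanity check (assumes it runs right after the loop above, with the shifted X still in scope)
p = polyfit(X, Y, 1);            % closed-form least-squares fit: p(1) = slope, p(2) = intercept
fprintf('GD:      slope = %.4f, intercept = %.4f\n', Theta_1, Theta_0);
fprintf('polyfit: slope = %.4f, intercept = %.4f\n', p(1), p(2));

plot(X, Y, 'o', X, Y_prediction, '-')    % data points vs. fitted line
xlabel('Years since 2000'), ylabel('Price'), legend('data', 'GD fit')

With the shifted years and 100000 iterations, the two fits should agree closely.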