
Gradient descent for logistic regression in MATLAB


Here is the code:

function [theta] = LR(D)
% D is the data matrix: class labels in column 1, feature variables in the rest

% Decompose D into X (features) and C (labels)
C = D(:,1);
C = C';                 % 1 x n row vector of labels
X = D(:,2:end);         % n x d feature matrix
alpha = .00001;         % learning rate

theta_new = .001 .* ones(1, size(X,2));   % one weight per feature (was hard-coded to 34)
for count = 1:100000
    % gradient ascent step on the log-likelihood
    theta_new = theta_new + alpha*(C - sigmoid(X*theta_new')')*X;
    llr = sum(LLR((X*theta_new').*(C')))  % no semicolon: prints llr each iteration
end
theta = theta_new;      % return the learned weights (was assigned to unused thetaopt)

end


function a = LLR(z)
a = log(1.0 + exp(-z));
end

function a = sigmoid(z)
a = 1.0 ./ (1.0 + exp(-z));
end
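For reference, the update rule in the loop above is plain gradient ascent on the Bernoulli log-likelihood. A minimal NumPy translation of that loop (the function name `lr_gradient_ascent` and the synthetic-data usage are illustrative, not from the original post):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_gradient_ascent(X, C, alpha=1e-5, iters=1000):
    """Gradient ascent mirroring the MATLAB loop.
    X: (n, d) feature matrix; C: (n,) labels assumed in {0, 1}."""
    theta = 0.001 * np.ones(X.shape[1])
    for _ in range(iters):
        # gradient of the log-likelihood: (C - sigmoid(X theta)) @ X
        theta = theta + alpha * (C - sigmoid(X @ theta)) @ X
    return theta
```

On a small linearly separable example, the learned weights separate the classes:

```python
X = np.array([[1.0, 2.0], [1.0, -2.0], [1.0, 3.0], [1.0, -3.0]])
C = np.array([1.0, 0.0, 1.0, 0.0])
theta = lr_gradient_ascent(X, C, alpha=0.1, iters=500)
```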

My problem is that the log-likelihood first decreases and then starts increasing. Is this an issue with the gradient descent algorithm or with my code?

It looks like there may be a problem with your objective function.

If the labels (C) are in {0,1}, then you should use the loss

C.*LLR(X*theta') + (1-C).*(LLR(X*theta') + X*theta')

If the labels are in {-1,1}, the loss should be

LLR(C.*X*theta')

You appear to be using only the first part of the first type of loss function.
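The two loss forms in this answer can be checked numerically. With LLR(z) = log(1 + exp(-z)), both forms reduce to the standard binary cross-entropy; the sketch below verifies this on a few example scores (the variable names are illustrative):

```python
import numpy as np

def llr(z):
    return np.log1p(np.exp(-z))  # log(1 + e^{-z}), numerically stable form

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

s = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])   # example scores X @ theta
c = np.array([0.0, 1.0, 1.0, 0.0, 1.0])     # labels in {0, 1}
y = 2 * c - 1                                # same labels mapped to {-1, 1}

loss01 = c * llr(s) + (1 - c) * (llr(s) + s)   # {0,1} form from the answer
loss_pm = llr(y * s)                            # {-1,1} form from the answer
xent = -(c * np.log(sigmoid(s)) + (1 - c) * np.log(1 - sigmoid(s)))

print(np.allclose(loss01, loss_pm), np.allclose(loss01, xent))  # True True
```

This also shows why the question's code misbehaves: summing only C.*LLR(X*theta') drops the (1-C) term, so the quantity being printed is not the actual objective being ascended.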

Comments:

Do you mean the labels D(:,1) are 0/1 or -1/1?

Could you show us the weights (theta_new), the gradient, and llr at every 10 iterations? It seems the model may be overfitting, since you run 100,000 iterations, which is a lot. You should stop once the gradient falls below some threshold (e.g. 1e-4). You could also try regularization; with L2 regularization, grad = (C - sigmoid(X*theta_new')')*X + theta_new.
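The commenter's two suggestions (a gradient-norm stopping rule and an L2 penalty) can be sketched as below. Note one sign caveat: when *maximizing* the log-likelihood, the L2 penalty term enters the gradient with a minus sign, not the plus sign written in the comment. The function name `lr_l2` and parameter `lam` are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lr_l2(X, C, alpha=1e-3, lam=0.1, tol=1e-4, max_iters=100000):
    """Gradient ascent with an L2 penalty and early stopping.
    Penalty gradient is subtracted because we ascend the log-likelihood."""
    theta = np.zeros(X.shape[1])
    for _ in range(max_iters):
        grad = (C - sigmoid(X @ theta)) @ X - lam * theta
        theta = theta + alpha * grad
        if np.linalg.norm(grad) < tol:   # stop once the gradient is small
            break
    return theta
```

The penalty also bounds the weights on separable data, where unregularized gradient ascent would push them toward infinity.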