How do I implement a neural network with a hidden layer in MATLAB?

I am trying to train a 3-input, 1-output neural network (one input layer, one hidden layer and one output layer) that classifies quadratic functions in MATLAB. The stages I am trying to implement are: the feed-forward stage, $x_i^{out} = f(s_i)$ with $s_i = \sum_j w_{ij} x_j^{in}$, and the weight-update stage, $w_{ij}^{new} = w_{ij}^{old} - \epsilon\,\delta_i^{out} x_j^{in}$, where $x$ is an input vector, $w$ a weight and $\epsilon$ the learning rate.
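For a single hidden layer with $f(s)=\tanh(s)$, the back-propagated deltas that the update rule above needs can be sketched as follows (this matches the deltas used in the code below, where the output delta drops the $1-\tanh^2(s^{out})$ factor that a strict squared-error gradient would carry):

$$
\delta^{out} = \tanh(s^{out}) - T, \qquad
\delta_j^{hid} = \bigl(1 - \tanh^2(s_j^{hid})\bigr)\, w_j^{(2)}\,\delta^{out},
$$

with the corresponding updates

$$
w_j^{(2)} \leftarrow w_j^{(2)} - \epsilon\,\tanh(s_j^{hid})\,\delta^{out}, \qquad
w_{kj}^{(1)} \leftarrow w_{kj}^{(1)} - \epsilon\, x_k\, \delta_j^{hid}.
$$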

I am having trouble coding the hidden layer and adding the activation function $f(s)=\tanh(s)$, because the error at the network output does not seem to decrease. Can someone point out what I am implementing wrong?

The inputs are the real coefficients of a quadratic $ax^2+bx+c=0$, and the output should be positive if the quadratic has two real roots and negative if it does not.
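Concretely, the teacher signal is just the sign of the discriminant:

$$
T = \operatorname{sign}(b^2 - 4ac) =
\begin{cases}
+1 & \text{two distinct real roots},\\
-1 & \text{no real roots}.
\end{cases}
$$

(The degenerate case $b^2 - 4ac = 0$ gives $T = 0$, but this occurs with probability zero for continuous random coefficients.)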

nTrain = 100; % training set
nOutput = 1;
nSecondLayer = 7; % size of hidden layer (arbitrary)
trainExamples = rand(4,nTrain); % independent random set of examples
trainExamples(4,:) = ones(1,nTrain);  % set the dummy input to be 1

T = sign(trainExamples(2,:).^2-4*trainExamples(1,:).*trainExamples(3,:)); % The teacher provides this for every example
%The student neuron starts with random weights
w1 = rand(4,nSecondLayer);
w2 = rand(nSecondLayer,nOutput);
nepochs=0;
nwrong = 1;
S1(nSecondLayer,nTrain) = 0;
S2(nOutput,nTrain) = 0; 

while( nwrong>1e-2 )  % more than some small number close to zero
    for i=1:nTrain
        x = trainExamples(:,i);
        S2(:,i) = w2'*S1(:,i);
        deltak = tanh(S2(:,i)) - T(:,i); % back propagate
        deltaj = (1-tanh(S2(:,i)).^2).*(w2*deltak); % back propagate      
        w2 = w2 - tanh(S1(:,i))*deltak'; % updating
        w1 = w1- x*deltaj'; % updating  
    end
   output = tanh(w2'*tanh(w1'*trainExamples));
   dOutput = output-T;
   nwrong = sum(abs(dOutput));
   disp(nwrong)
   nepochs = nepochs+1          
end
nepochs

Thanks

After banging my head against the wall for a few days, I found a small typo. Here is a working solution:

clear
% Set up parameters
nInput = 4; % number of nodes in input
nOutput = 1; % number of nodes in output
nHiddenLayer = 7; % number of nodes in the hidden layer
nTrain = 1000; % size of training set
epsilon = 0.01; % learning rate


% Set up the inputs: random coefficients between -1 and 1
trainExamples = 2*rand(nInput,nTrain)-1;
trainExamples(nInput,:) = ones(1,nTrain);  %set the last input to be 1

% Set up the student neurons for both hidden and the output layers
S1(nHiddenLayer,nTrain) = 0;
S2(nOutput,nTrain) = 0;

% The student neuron starts with random weights from both input and the hidden layers
w1 = rand(nInput,nHiddenLayer);
w2 = rand(nHiddenLayer+1,nOutput);

% Calculate the teacher outputs according to the quadratic formula
T = sign(trainExamples(2,:).^2-4*trainExamples(1,:).*trainExamples(3,:));


% Initialise values for looping
nEpochs = 0;
nWrong = nTrain*0.01;
Wrong = [];
Epoch = [];

while(nWrong >= (nTrain*0.01)) % as long as more than 1% of outputs are wrong
    for i=1:nTrain
        x = trainExamples(:,i);
        S1(1:nHiddenLayer,i) = w1'*x;
        S2(:,i) = w2'*[tanh(S1(:,i));1];
        delta1 = tanh(S2(:,i)) - T(:,i); % back propagate
        delta2 = (1-tanh(S1(:,i)).^2).*(w2(1:nHiddenLayer,:)*delta1); % back propagate       
        w1 = w1 - epsilon*x*delta2'; % update
        w2 = w2 - epsilon*[tanh(S1(:,i));1]*delta1'; % update
    end

    outputNN = sign(tanh(S2));
    delta = outputNN - T; % difference between student and teacher
    nWrong = sum(abs(delta/2));
    nEpochs = nEpochs + 1;
    Wrong = [Wrong nWrong];
    Epoch = [Epoch nEpochs];
end
plot(Epoch,Wrong);
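To check that the network generalizes beyond the training set, here is a minimal sketch (assuming the trained `w1`, `w2`, `nInput` from above are still in the workspace) that generates fresh test examples and measures the misclassification rate:

% Evaluate the trained network on an independent test set
nTest = 1000;
testExamples = 2*rand(nInput,nTest)-1;      % random coefficients in [-1,1]
testExamples(nInput,:) = ones(1,nTest);     % bias input fixed at 1
Ttest = sign(testExamples(2,:).^2 - 4*testExamples(1,:).*testExamples(3,:));

hidden = tanh(w1'*testExamples);                     % hidden-layer activations
outputNN = sign(tanh(w2'*[hidden; ones(1,nTest)]));  % forward pass with bias unit
errorRate = sum(outputNN ~= Ttest)/nTest             % fraction misclassified

Since both weight matrices are vectorized over columns, the whole test set can be pushed through in two matrix products, exactly as the training loop's epoch-end check does with `S2`.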