Neural network with sigmoid


I am trying to implement a neural network with a sigmoid activation function, but the code below does not work. This is the training part of the network; it fails to update the weights correctly. What is wrong with this code?

clc; clear all; close all;
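% 'sigmoid' is assumed here to be a user-supplied helper on the MATLAB path,
% e.g. equivalent to:  sigmoid = @(z) 1./(1 + exp(-z));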
% load train_data1
train_data1=[-0.498800000000000,-0.257500000000000;-0.492800000000000,-0.274300000000000;-0.470300000000000,-0.282600000000000;-0.427400000000000,-0.474000000000000;-0.420400000000000,-0.518000000000000;-0.326300000000000,-1.13230000000000;-0.317300000000000,-0.875300000000000;-0.295000000000000,-1.02770000000000;-0.267600000000000,-0.882800000000000;-0.260500000000000,-0.976500000000000;-0.216100000000000,-0.970400000000000;-0.207000000000000,-0.813800000000000;-0.164000000000000,-0.696600000000000;-0.159900000000000,-0.793300000000000;-0.122000000000000,-0.764400000000000;-0.0729000000000000,-0.435300000000000;-0.00640000000000000,-0.0546000000000000;0.132200000000000,0.710300000000000;0.137100000000000,0.587000000000000;0.160300000000000,0.819200000000000;0.230600000000000,0.989200000000000;0.286800000000000,0.737700000000000;0.334000000000000,0.943500000000000;0.375200000000000,0.688200000000000;0.429700000000000,0.567800000000000];
train_data1 = sortrows(train_data1);
% normalize data to [0,1]
data1=[train_data1];
max1=max(max(data1));
min1=min(min(data1));
train_data2 = (train_data1 - min1) / ( max1 - min1);

x = train_data2(:,1); % train input data
r = train_data2(:,2); % train output data


hidden_neurons = 2;
maxepochcount = 1000;

datacount1 = size(x,1);
% add a bias as an input
bias = ones(datacount1,1);
% x = [x bias];
% read how many inputs
inputcount = size(x,2);
% ---------- data loaded -----------
% ---------- set weights -----------
% set initial random weights
WI = (randn(inputcount,hidden_neurons) - 0.5)/10;
WO = (randn(1,hidden_neurons) - 0.5)/10;
%-----------------------------------
%--- Learning Starts Here! ---------
%-----------------------------------
eta1 = 0.5;
eta2 = eta1/5;
% do a number of epochs
for iter = 1:maxepochcount
% loop through the data
    for j = 1:datacount1
        % read the current sample
        I = x(j,:);
        D = r(j,1);
        % calculate the error for this sample
        H = (sigmoid(I * WI))';
        O = H' * WO';
        error = D-O;
        % adjust weight between hidden & output
        delta_i = O.*(1-O).*(D-O); % D actual, O calculated output
        % Calculate error for each node in layer_(n-1)
        delta_j = H.*(1-H).*(WO.'*delta_i); % H.' is the output of hidden layer
        % Adjust weights in matrices sequentially
        WO = WO + eta2.*delta_i*(H.') % H.' is the output of hidden layer
        WI = WI + eta1.*(delta_j*(I))' % I.' is the inputs

%         % adjust weight between hidden & output
%         delta_HO = error.*eta2 .* hidden_val;
%         WO = WO - delta_HO';
%         % adjust the weights between input & hidden
%         delta_IH = eta1 .* error .* WO' .* (1 - (H .^ 2)) * I;
%         WI = WI - delta_IH';

    end
    O = sigmoid(WO*sigmoid(x * WI)');
%     error(iter) =  (sum(error .^ 2)) ^ 0.5;
    if rem(iter,100)==0     % Every 100 epochs, show how training is doing
     plot(x,O, 'color','red','linewidth',2); hold on;    
     drawnow;
     iter

    end

%  return   
end

Just scale the output values to the range of the activation function: if we use tanh, they have to be scaled to [-1, 1]; for the sigmoid, to [0, 1].
The code then runs fine, but sometimes it needs more epochs.
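
As an illustration, here is a minimal sketch of that scaling in MATLAB (the variable names t_raw, t_scaled, y_scaled, y are illustrative, not taken from the question's code), including the inverse mapping to bring predictions back into the original units:

% scale targets to the range of the activation function
t_raw = [-0.9; -0.2; 0.4; 0.8];            % raw targets, roughly in [-1, 1]
tmin = min(t_raw);  tmax = max(t_raw);

% sigmoid network: targets must lie in [0, 1]
t_scaled = (t_raw - tmin) ./ (tmax - tmin);

% tanh network: targets must lie in [-1, 1]
% t_scaled = 2*(t_raw - tmin) ./ (tmax - tmin) - 1;

% after training, map the network output y_scaled back to the original units
y_scaled = t_scaled;                        % placeholder for the network output
y = y_scaled .* (tmax - tmin) + tmin;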

Have you stepped through the code in the debugger and checked that the values at each step are what you expect? There is a plot call at the end of the loop so you can see the network's result after every epoch.

I use the tanh version of this code and it works fine, but the sigmoid version does not, so I suspect the weight-update part of the code. For a network with a sigmoid activation function, is it necessary to normalize the input and output values to [0, 1]? My data values actually lie between -1 and +1.
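
For reference on the weight-update discussion: the deltas in the question's loop use the sigmoid derivative s(z)*(1 - s(z)). A minimal sketch of that local-gradient computation, assuming the output unit is also passed through the sigmoid (sizes and sample values are illustrative only):

% sigmoid and its derivative, expressed via the activation a = sig(z)
sig  = @(z) 1 ./ (1 + exp(-z));
dsig = @(a) a .* (1 - a);

% forward pass for one sample (1 input, 2 hidden units, illustrative values)
I  = 0.3;                                % input sample
WI = randn(1, 2) / 10;                   % input-to-hidden weights
WO = randn(1, 2) / 10;                   % hidden-to-output weights
D  = 0.7;                                % desired output, already scaled to [0, 1]

H = sig(I * WI).';                       % hidden activations (2x1)
O = sig(WO * H);                         % sigmoid applied to the output as well

% local gradients ("deltas") for the sigmoid case
delta_i = dsig(O) .* (D - O);            % output layer
delta_j = dsig(H) .* (WO.' * delta_i);   % hidden layer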