Neural network training issue in MATLAB
I'm trying to figure out where I'm going wrong, and I'd be very happy if you could help. My problem is this: a function using the Neural Network Toolbox behaves one way when run serially, but when I put it inside a parfor loop everything goes haywire.
>> version
ans =
8.3.0.532 (R2014a)
Here is the function:
function per = neuralTr(tSet,Y,CrossVal,Ycv)
hiddenLayerSize = 94;
redeT = patternnet(hiddenLayerSize);
redeT.input.processFcns = {'removeconstantrows','mapminmax'};
redeT.output.processFcns = {'removeconstantrows','mapminmax'};
redeT.divideFcn = 'dividerand'; % Divide data randomly
redeT.divideMode = 'sample'; % Divide up every sample
redeT.divideParam.trainRatio = 80/100;
redeT.divideParam.valRatio = 10/100;
redeT.divideParam.testRatio = 10/100;
redeT.trainFcn = 'trainscg'; % Scaled conjugate gradient
redeT.performFcn = 'crossentropy'; % Cross-entropy
redeT.trainParam.showWindow = 0; % default is 1
redeT = train(redeT,tSet,Y);
outputs = sim(redeT,CrossVal);
per = perform(redeT,Ycv,outputs);
end
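For context, both the weight initialization in patternnet and the 'dividerand' split draw from MATLAB's global random stream, so a serial run can be made repeatable by seeding that stream before train. A minimal sketch (not the asker's code; x and t stand in for any input/target matrices):

```matlab
% Sketch: fixing the global RNG seed makes patternnet training repeatable,
% since both weight init and the 'dividerand' split use the global stream.
rng(0, 'twister');            % deterministic seed
net = patternnet(94);
net.trainParam.showWindow = 0;
net = train(net, x, t);       % same weights and same data split every run
```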
This is the code I entered:
Data loaded in workspace
whos
Name Size Bytes Class Attributes
CrossVal 282x157 354192 double
Y 2x363 5808 double
Ycv 2x157 2512 double
per 1x1 8 double
tSet 282x363 818928 double
Executing the function serially:
per = neuralTr(tSet,Y,CrossVal,Ycv)
per =
0.90
Starting the parallel pool:
>> parpool local
Starting parallel pool (parpool) using the 'local' profile ... connected to 12 workers.
ans =
Pool with properties:
Connected: true
NumWorkers: 12
Cluster: local
AttachedFiles: {}
IdleTimeout: Inf (no automatic shut down)
SpmdEnabled: true
Initializing in parallel and executing the function 12 times:
per = cell(12,1);
parfor ii = 1 : 12
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per
per =
[0.96]
[0.83]
[0.92]
[1.08]
[0.85]
[0.89]
[1.06]
[0.83]
[0.90]
[0.93]
[0.95]
[0.81]
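Each parfor worker maintains its own random stream, so even with identical inputs the 12 iterations can train on different random splits and initial weights. A quick way to see what generator each worker is using (a sketch, assuming a pool is already open):

```matlab
% Sketch: query each worker's random stream inside parfor.
% Workers get independent streams, which is why iterations can
% produce different performance values on identical data.
parfor ii = 1:4
    s = rng;   % current generator state on this worker
    fprintf('iteration %d: generator %s, seed %u\n', ii, s.Type, s.Seed);
end
```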
Executing again to see whether random initialization produces different values:
per = cell(12,1);
parfor ii = 1 : 12
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per
per =
[0.96]
[0.83]
[0.92]
[1.08]
[0.85]
[0.89]
[1.06]
[0.83]
[0.90]
[0.93]
[0.95]
[0.81]
Edit 1:
Running the function with a plain for loop:
per = cell(12,1);
for ii = 1 : 12
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per
per =
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
[0.90]
Edit 2:
I modified my function and now everything works fine. Maybe the problem occurs when the data is divided in parallel, so I divided the data before sending it to the workers. Thanks a lot!
function per = neuralTr(tSet,Y,CrossVal,Ycv)
indt = 1:round(size(tSet,2) * 0.8) ;
indv = round(size(tSet,2) * 0.8):round(size(tSet,2) * 0.9);
indte = round(size(tSet,2) * 0.9):size(tSet,2);
hiddenLayerSize = 94;
redeT = patternnet(hiddenLayerSize);
redeT.input.processFcns = {'removeconstantrows','mapminmax'};
redeT.output.processFcns = {'removeconstantrows','mapminmax'};
redeT.divideFcn = 'dividerand'; % Divide data randomly
redeT.divideMode = 'sample'; % Divide up every sample
redeT.divideParam.trainRatio = 80/100;
redeT.divideParam.valRatio = 10/100;
redeT.divideParam.testRatio = 10/100;
redeT.trainFcn = 'trainscg'; % Scaled conjugate gradient
redeT.performFcn = 'crossentropy'; % Cross-entropy
redeT.trainParam.showWindow = 0; % default is 1
redeT = train(redeT,tSet,Y);
outputs = sim(redeT,CrossVal);
per = zeros(12,1);
parfor ii = 1 : 12
redes = train(redeT,tSet,Y);
per(ii) = perform(redes,Ycv,outputs);
end
end
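Note that the precomputed indices indt, indv, and indte in Edit 2 are never actually passed to the network, which still uses 'dividerand'. If the goal is a truly fixed split, the matching divide function would be 'divideind'. A sketch of that alternative (an assumption about the intent, not the asker's final code):

```matlab
% Sketch: a fixed, precomputed 80/10/10 split via 'divideind',
% replacing the random 'dividerand' split.
n = size(tSet, 2);
redeT.divideFcn = 'divideind';                          % divide by explicit indices
redeT.divideParam.trainInd = 1:round(0.8*n);
redeT.divideParam.valInd   = round(0.8*n)+1:round(0.9*n);
redeT.divideParam.testInd  = round(0.9*n)+1:n;
```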
Results:
>> per = neuralTr(tSet,Y,CrossVal,Ycv)
per =
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
0.90
Oh!! I think I found it, but I can't test it. In your code you have:
redeT.divideFcn = 'dividerand'; % Divide data randomly
If each worker selects the data randomly, then they are expected to produce different results, aren't they?
Try the following:
per = cell(12,1);
parfor ii = 1 : 12
rng(1); % set the seed for random number generation, so every time the number generated will be the same
per{ii} = neuralTr(tSet,Y,CrossVal,Ycv);
end
per
I'm not sure whether neuralTr sets the seed internally, but give it a try.

As far as I can tell, the function works correctly: in the parfor loop you simply run the serial version 12 times and store all 12 results in per. The reason you get different results is that the data is split randomly each time. Is that the problem?

Hmm, but if I run a loop inside the function, for always gives me 0.90. Is that behavior normal?

You mean when you replace the parfor loop with a for loop? Also, I just noticed that running the parfor loop twice gives the same results each time, so I don't think it's the random initialization. That's really strange.

I added the results of running just a for loop to the question. It's really strange. I'm afraid I don't have an answer yet, but I'll keep thinking about it.

Great answer! I'm using it in my code. Vielen Dank, my friend!