
Python: How can I train a simple linear function with PyBrain?


I just tried this and expected it to learn the simple linear function f(x) = 4x + 1:
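
A minimal sketch of such an attempt (the hidden-layer size, number of epochs, and sample range here are assumptions and may differ from the code actually used):

#!/usr/bin/env python

from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

# Small feed-forward network: 1 input, 2 hidden units, 1 output
net = buildNetwork(1, 2, 1, bias=True)

# (x, f(x)) pairs of the target function f(x) = 4x + 1
ds = SupervisedDataSet(1, 1)
for x in range(10):
    ds.addSample((x,), (4 * x + 1,))

# Train with plain backpropagation
trainer = BackpropTrainer(net, ds)
for epoch in range(100):
    trainer.train()

# Query the trained network
for x in range(10):
    print("f(%i) = %i" % (x, net.activate((x,))[0]))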

But when I run it, I get terribly wrong results:

f(0) = 1962
f(1) = 1962
f(2) = 1962
f(3) = 1962
f(4) = 1962
f(5) = 1962
f(6) = 1962
f(7) = 1962
f(8) = 1962
f(9) = 1962
Why doesn't this work?

Try 2. Code:

Output:

Start training
train-errors: [  827395.411895  755443.286202  722073.904381  748336.584579 
[...]
695939.638106  726953.086185  736527.150008  739789.458146  736074.235677  731222.936020  675937.725009]
valid-errors: [  2479217.507148  915115.526570  703748.266402  605613.979311  592809.132542  686959.683977  612248.174146  
[...]
655606.225724  637762.864477  643013.094767  620825.083765  609063.451602  607935.458244  716839.447374]
([827395.41189463751, 755443.28620243724, 722073.90438077366, 748336.58457926242, 739568.58919456392, 725496.58682491502, 
[...]
637762.86447708646, 643013.09476733557, 620825.08376532339, 609063.45160197129, 607935.45824447344, 716839.44737418776])
Finished training
Test function f(x)=4x+1
f(0) = 1955
f(1) = 1955
f(2) = 1955
f(3) = 1955
f(4) = 1955
f(5) = 1955
f(6) = 1955
f(7) = 1955
f(8) = 1955
f(9) = 1955

A network like this is not usually trained on a function directly. That means you cannot simply take (x, f(x)) pairs of a linear function and train on them. (That can be done with linear regression, however.)
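
For comparison, plain linear regression recovers f(x) = 4x + 1 from such (x, f(x)) pairs directly; a minimal sketch using numpy.polyfit (numpy is assumed to be available):

import numpy as np

# (x, f(x)) samples of the target function f(x) = 4x + 1
xs = np.arange(10, dtype=float)
ys = 4 * xs + 1

# Fit a degree-1 polynomial; polyfit returns [slope, intercept]
slope, intercept = np.polyfit(xs, ys, 1)
print("f(x) = %.2f * x + %.2f" % (slope, intercept))  # prints f(x) = 4.00 * x + 1.00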

Instead, the network has to be trained on clusters of data points, for example:

#!/usr/bin/env python

from random import normalvariate

# Build the network
from pybrain.tools.shortcuts import buildNetwork
net = buildNetwork(2, 1, 1, bias=True)

# Add samples
from pybrain.datasets import SupervisedDataSet
ds = SupervisedDataSet(2, 1)
for i in range(100):
    x = normalvariate(3, 0.6)
    y = normalvariate(2, 1)
    ds.addSample((x, y), (0,))
for i in range(100):
    x = normalvariate(7, 0.5)
    y = normalvariate(1, 0.1)
    ds.addSample((x, y), (1,))

# Train with samples
from pybrain.supervised.trainers import BackpropTrainer
trainer = BackpropTrainer(net, ds, learningrate=0.1, momentum=0.99)

print("Start training")
print(trainer.train())
# Train until the validation error stops improving; returns the training
# and validation error curves
a = trainer.trainUntilConvergence(dataset=ds,
                                  maxEpochs=1000,
                                  verbose=True,
                                  continueEpochs=10,
                                  validationProportion=0.1)

print("Finished training")
print(trainer.train())

# See how the network classifies points on a grid and at the two cluster centres
print("Test classification of the two clusters")
for x in range(-10, 10):
    for y in range(-10, 10):
        print("f(%i, %i) = %i" % (x, y, net.activate((x, y))[0]))
print("f(%i, %i) = %i" % (3, 2, net.activate((3, 2))[0]))
print("f(%i, %i) = %i" % (7, 1, net.activate((7, 1))[0]))

It can be found on my blog.
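
The network's single output is a real value; samples from the first cluster should come out near 0 and samples from the second near 1, so thresholding at 0.5 turns it into a class decision. A small usage sketch, assuming the trained net from the listing above (the helper name classify is made up here):

def classify(point, threshold=0.5):
    # Map the network's real-valued output to a cluster label (0 or 1)
    return 1 if net.activate(point)[0] > threshold else 0

print(classify((3, 2)))  # cluster centred at (3, 2) -> expected 0
print(classify((7, 1)))  # cluster centred at (7, 1) -> expected 1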

I ran your code and after 100 training iterations I still get an error of 671650.61, which looks rather large. Have you tried training the network for more iterations? For example, you could also try the trainUntilConvergence() method with maxEpochs=1000.

@mcwise: Yes, I tried that. The error and the results were similar :-/

Have you tried a smaller learning rate and/or a larger data set? Read this answer for more insight and possible problems. (A sketch combining these suggestions follows below.)

I think my approach is conceptually wrong: I am not training a linear function, I am training a perceptron to distinguish two linearly separable data sets.
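
A minimal sketch of the combination suggested in the comments (smaller learning rate, no momentum, more samples, and trainUntilConvergence with maxEpochs=1000; the concrete values are assumptions):

#!/usr/bin/env python

from random import normalvariate

from pybrain.tools.shortcuts import buildNetwork
from pybrain.datasets import SupervisedDataSet
from pybrain.supervised.trainers import BackpropTrainer

net = buildNetwork(2, 1, 1, bias=True)

# Larger data set: 1000 samples per cluster instead of 100
ds = SupervisedDataSet(2, 1)
for i in range(1000):
    ds.addSample((normalvariate(3, 0.6), normalvariate(2, 1)), (0,))
    ds.addSample((normalvariate(7, 0.5), normalvariate(1, 0.1)), (1,))

# Smaller learning rate and no momentum
trainer = BackpropTrainer(net, ds, learningrate=0.01, momentum=0.0)
trainer.trainUntilConvergence(maxEpochs=1000,
                              continueEpochs=10,
                              validationProportion=0.1)

print(net.activate((3, 2))[0])  # should be close to 0
print(net.activate((7, 1))[0])  # should be close to 1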