Python: how to return the parameters from all iterations of scipy.optimize.minimize


I am optimizing the Rosenbrock function with scipy.optimize.fmin:

import scipy
import numpy as np
def rosen(x):
    """The Rosenbrock function"""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
scipy.optimize.fmin(rosen, x0, full_output=True)
With full_output=True this returns a tuple describing the solution: the parameters that minimize the function, the minimum function value, the number of iterations, and the number of function calls.


However, I would like to be able to plot the values at each step. For example, I want the iteration number along the x-axis and the running minimum along the y-axis.

fmin accepts an optional callback function that is called after each iteration, so you can write a simple callback that records the parameter values at each step:

def save_step(xk):
    # xk is the current parameter vector at this iteration
    global steps
    steps.append(xk)

steps = []
scipy.optimize.fmin(rosen, x0, full_output=True, callback=save_step)
print(np.array(steps)[:10])
Output:

[[ 1.339       0.721       0.824       1.71        1.236     ]
 [ 1.339       0.721       0.824       1.71        1.236     ]
 [ 1.339       0.721       0.824       1.71        1.236     ]
 [ 1.339       0.721       0.824       1.71        1.236     ]
 [ 1.2877696   0.7417984   0.8013696   1.587184    1.3580544 ]
 [ 1.28043136  0.76687744  0.88219136  1.3994944   1.29688704]
 [ 1.28043136  0.76687744  0.88219136  1.3994944   1.29688704]
 [ 1.28043136  0.76687744  0.88219136  1.3994944   1.29688704]
 [ 1.35935594  0.83266045  0.8240753   1.02414244  1.38852256]
 [ 1.30094767  0.80530982  0.85898166  1.0331386   1.45104273]]

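To produce the plot described in the question (iteration number vs. running minimum), you can evaluate rosen at each saved step and take the cumulative minimum. A sketch assuming matplotlib is installed:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, assumed for script use
import matplotlib.pyplot as plt
import scipy.optimize

def rosen(x):
    """The Rosenbrock function"""
    return sum(100.0*(x[1:]-x[:-1]**2.0)**2.0 + (1-x[:-1])**2.0)

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
steps = []
scipy.optimize.fmin(rosen, x0, callback=steps.append)

# function value at each iteration, then the best value seen so far
values = [rosen(s) for s in steps]
running_min = np.minimum.accumulate(values)

plt.plot(running_min)
plt.xlabel('iteration')
plt.ylabel('running minimum of rosen(x)')
plt.savefig('convergence.png')
```

Plotting `values` instead of `running_min` shows the raw (non-monotonic) trajectory of the simplex, which can also be instructive.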