Python scipy minimize returns the initial values


I'm trying to perform a simple minimization with scipy.minimize (a basic example of simulated maximum likelihood). For some reason it just returns the initial values. What am I doing wrong?

Here is my code:

import numpy as np
from scipy.optimize import minimize

# Simulated likelihood function
# Arguments:
# theta: vector representing probabilities
# sims: vector representing uniform simulated data, e.g. [0.43, 0.11, 0.02, 0.97, 0.77]
# dataCounts: vector representing counts of actual data, e.g. [4, 10, 7]
def simLogLikelihood(theta, sims, dataCounts):

        # Categorise sims using theta
        simCounts = np.bincount(theta.cumsum().searchsorted(sims))

        # Calculate probabilities using simulated data
        simProbs = simCounts/simCounts.sum()
        # Calculate likelihood using simulated probabilities and actual data
        logLikelihood = (dataCounts*np.log(simProbs)).sum()

        return -logLikelihood


# Set seed
np.random.seed(121)

# Generate 'true' data
trueTheta = np.array([0.1, 0.4, 0.5])
dataCounts = np.bincount(np.random.choice([0, 1, 2], 1000, p=trueTheta))

# Generate simulated data (random draws from [0, 1))
sims = np.random.random(1000)

# Choose theta to maximise likelihood
thetaStart = np.array([0.33, 0.33, 0.34])
bnds = ((0, 1), (0, 1), (0, 1))
cons = ({'type': 'eq', 'fun': lambda x:  x.sum() - 1.0})

result = minimize(simLogLikelihood, x0=thetaStart, args=(sims, dataCounts), method='SLSQP', bounds=bnds, constraints=cons)
(The bounds in bnds reflect the fact that each probability must lie between 0 and 1. The constraint in cons says that the probabilities must sum to 1.)

If I run this code, result contains:

     fun: 1094.7593617864004
     jac: array([ 0.,  0.,  0.])
 message: 'Optimization terminated successfully.'
    nfev: 5
     nit: 1
    njev: 1
  status: 0
 success: True
       x: array([ 0.33,  0.33,  0.34])

So it performs only one iteration and just returns the probability vector I started with. But it is easy to find another probability vector with a lower objective, for example [0.1, 0.4, 0.5]. What is going wrong?
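For example, evaluating the objective at both points directly (a quick check using the function above, not part of the optimizer run) makes the gap visible:

print(simLogLikelihood(thetaStart, sims, dataCounts))                 # objective at the start point
print(simLogLikelihood(np.array([0.1, 0.4, 0.5]), sims, dataCounts))  # lower, so the start point is not optimal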

Your optimization problem looks very non-smooth (probably because of np.bincount(), but I won't dig into the details), and that is a genuinely bad thing for most optimizers. Since there are also constraints, only two optimizers are left (SLSQP, COBYLA), and both of them assume smoothness.
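A minimal sketch of what that non-smoothness looks like, using the question's simLogLikelihood and thetaStart: perturbations on the scale of the default finite-difference step do not move any sims across a bin edge, so the objective is locally constant.

base = simLogLikelihood(thetaStart, sims, dataCounts)
for i in range(3):
    step = np.zeros(3)
    step[i] = 1e-8                      # roughly the size of the default num-diff step
    print(simLogLikelihood(thetaStart + step, sims, dataCounts) - base)
# typically prints 0.0 three times: no sim changes bins, so the
# numerically estimated gradient is exactly zero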

Adding a print like this:

print(theta, -logLikelihood)

at the end of simLogLikelihood tells you that during numerical differentiation (since you did not provide a gradient), scipy is trying some small perturbations, but the objective does not change at all (non-smooth).
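Concretely, the instrumented function (the same body as in the question, with the suggested print appended) would look like this:

def simLogLikelihood(theta, sims, dataCounts):
    # Categorise sims using theta
    simCounts = np.bincount(theta.cumsum().searchsorted(sims))
    # Calculate probabilities using simulated data
    simProbs = simCounts / simCounts.sum()
    # Calculate likelihood using simulated probabilities and actual data
    logLikelihood = (dataCounts * np.log(simProbs)).sum()
    print(theta, -logLikelihood)   # trace every theta scipy tries and the objective it gets
    return -logLikelihood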

While the numerical differentiation can be tuned to take much bigger steps, I don't think your problem is a good fit for that.

Quick demo (not recommended):

result = minimize(simLogLikelihood, x0=thetaStart, args=(sims, dataCounts),
              method='SLSQP', bounds=bnds, constraints=cons, options={'eps': 1e-2})
                                                             # much bigger num-diff steps

Output:

[ 0.  0.  1.] inf
[ 0.21587719  0.2695045   0.51461833] 1013.80776084
[ 0.23010601  0.28726799  0.48262602] 1012.05516321
[ 0.23627513  0.29496961  0.46875527] 1010.48916647
[ 0.2386537   0.29793905  0.46340726] 1010.13774627
[ 0.23957593  0.29909039  0.46133369] 1009.0850268
[ 0.2397671   0.29932904  0.46090387] 1008.96044271
[ 0.23981532  0.29938924  0.46079545] 1008.96044271
[ 0.23983943  0.29941934  0.46074124] 1008.96044271
[ 0.23985149  0.29943439  0.46071414] 1008.96044271
[ 0.23985751  0.29944192  0.46070058] 1008.96044271
     fun: 1008.960442706361
     jac: array([ 947.81880269,  -52.71300484,    0.        ])
 message: 'Optimization terminated successfully.'
    nfev: 44
     nit: 6
    njev: 5
  status: 0
 success: True
       x: array([ 0.23985751,  0.29944192,  0.46070058])
You can see that in some cases the function returns a non-finite value, which is also very bad.
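The inf in the first line of the trace comes from np.log(0): with theta = [0, 0, 1] every sim lands in the last bin, the other entries of simProbs are zero, and the log-likelihood blows up. A one-line check (my own sketch) reproduces it:

print(simLogLikelihood(np.array([0., 0., 1.]), sims, dataCounts))   # inf: two simulated bins are empty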


So I would strongly recommend trying to formulate something smooth instead of tuning the optimizer.
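One possible direction (my own sketch, not something the answer spells out): for this toy problem the log-likelihood can be written directly in terms of theta, which makes the objective smooth and lets SLSQP converge without any tricks.

# Hypothetical smooth reformulation: use theta itself as the category
# probabilities instead of estimating them from simulated draws.
def smoothNegLogLikelihood(theta, dataCounts):
    theta = np.clip(theta, 1e-12, 1.0)          # guard against log(0) at the bounds
    return -(dataCounts * np.log(theta)).sum()

result = minimize(smoothNegLogLikelihood, x0=thetaStart, args=(dataCounts,),
                  method='SLSQP', bounds=bnds, constraints=cons)
# result.x should end up close to dataCounts / dataCounts.sum()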

Thanks to sascha for noticing my mistaken references to numpy instead of scipy. Those have now been corrected.