Python 3.x: huge deviation from the initial guesses and a high chi-square in a triple-Gaussian fit


I am using this code:

import matplotlib.pyplot as plt
from numpy import exp, loadtxt, pi, sqrt

from lmfit import Model

data = loadtxt('model1d_gauss.dat')
x = data[:, 0]
y = data[:, 1] + 0.25*x - 1.0


def gaussian(x, amp, cen, wid):
    """1-d gaussian: gaussian(x, amp, cen, wid)"""
    return (amp / (sqrt(2*pi) * wid)) * exp(-(x-cen)**2 / (2*wid**2))


def line(x, slope, intercept):
    """a line"""
    return slope*x + intercept


mod = Model(gaussian) + Model(line)
pars = mod.make_params(amp=5, cen=5, wid=1, slope=0, intercept=1)

result = mod.fit(y, pars, x=x)

print(result.fit_report())

plt.plot(x, y, 'bo')
plt.plot(x, result.init_fit, 'k--')
plt.plot(x, result.best_fit, 'r-')
plt.show()
I tried to fit the data with a triple-Gaussian fit instead of the "Gaussian plus line" fit, and modified the code like this:

from numpy import asarray

# convert the lists to arrays so the model functions can do vector math
y = asarray([9, 11, 7, 6, 21, 9, 36, 8, 22, 7, 25, 27, 18, 22, 22, 18, 21, 17, 16, 13, 30, 8, 10, 18, 12, 17, 24, 19, 18, 25, 6, 18, 20, 36, 22, 12, 25, 20, 22, 32, 30, 32, 51, 52, 46, 41, 49, 51, 56, 71, 56, 58, 73, 66, 71, 80, 76, 90, 71, 71, 87, 68, 74, 67, 71, 67, 75, 51, 51, 57, 38, 45, 39, 37, 23, 23, 21, 20, 13, 9, 10, 7, 5, 5, 9, 5, 6, 5, 0])

x = asarray([-4.91, -3.29, -2.5700000000000003, -2.39, -2.21, -1.94, -1.67, -1.4900000000000002, -1.4000000000000004, -1.2200000000000002, -1.1300000000000003, -1.04, -0.8600000000000003, -0.6799999999999997, -0.5, -0.41000000000000014, -0.3200000000000003, -0.23000000000000043, -0.14000000000000057, -0.04999999999999982, 0.040000000000000036, 0.1299999999999999, 0.21999999999999975, 0.3099999999999996, 0.39999999999999947, 0.4900000000000002, 0.5800000000000001, 0.6699999999999999, 0.7599999999999998, 0.8499999999999996, 0.9399999999999995, 1.0299999999999994, 1.12, 1.21, 1.2999999999999998, 1.3899999999999997, 1.4799999999999995, 1.5699999999999994, 1.6600000000000001, 1.75, 1.8399999999999999, 1.9299999999999997, 2.0199999999999996, 2.1099999999999994, 2.1999999999999993, 2.29, 2.38, 2.4699999999999998, 2.5599999999999996, 2.6499999999999995, 2.7399999999999993, 2.83, 2.92, 3.01, 3.0999999999999996, 3.1899999999999995, 3.2799999999999994, 3.369999999999999, 3.459999999999999, 3.549999999999999, 3.6400000000000006, 3.7300000000000004, 3.8200000000000003, 3.91, 4.0, 4.09, 4.18, 4.27, 4.359999999999999, 4.449999999999999, 4.539999999999999, 4.629999999999999, 4.719999999999999, 4.8100000000000005, 4.9, 4.99, 5.08, 5.17, 5.26, 5.35, 5.4399999999999995, 5.529999999999999, 5.619999999999999, 5.709999999999999, 5.799999999999999, 5.98, 6.07, 6.25, 6.609999999999999])


def gaussian1(x, amp1, cen1, wid1):
    """1-d gaussian: gaussian(x, amp, cen, wid)"""
    return (amp1 / (sqrt(2*pi) * wid1)) * exp(-(x-cen1)**2 / (2*wid1**2))


def gaussian2(x, amp2, cen2, wid2):
    """1-d gaussian: gaussian(x, amp, cen, wid)"""
    return (amp2 / (sqrt(2*pi) * wid2)) * exp(-(x-cen2)**2 / (2*wid2**2))


def gaussian3(x, amp3, cen3, wid3):
    """1-d gaussian: gaussian(x, amp, cen, wid)"""
    return (amp3 / (sqrt(2*pi) * wid3)) * exp(-(x-cen3)**2 / (2*wid3**2))


mod = Model(gaussian1) + Model(gaussian2) + Model(gaussian3)
pars = mod.make_params(amp1=23, cen1=-1.5, wid1=3.5,
                       amp2=17, cen2=1.0, wid2=1.5,
                       amp3=80, cen3=3.5, wid3=3.0)

result = mod.fit(y, pars, x=x)

print(result.fit_report())

# plt.plot(x, y, 'bo')
plt.plot(x, result.init_fit, 'k--')
plt.plot(x, result.best_fit, 'r-')
plt.show()
But the problem is that I get a huge deviation from the initial guesses, and a very high chi-square value. I have attached the output below.

[[Model]]
    ((Model(gaussian1) + Model(gaussian2)) + Model(gaussian3))
[[Fit Statistics]]
    # fitting method   = leastsq
    # function evals   = 751
    # data points      = 89
    # variables        = 9
    chi-square         = 3715.94994
    reduced chi-square = 46.4493743
    Akaike info crit   = 350.126040
    Bayesian info crit = 372.523767
##  Warning: uncertainties could not be estimated:
[[Variables]]
    amp1: -7174.13129 (init = 23)
    cen1: -853.048883 (init = -1.5)
    wid1: -84.6651961 (init = 3.5)
    amp2: -189.857626 (init = 17)
    cen2:  3.47343596 (init = 1)
    wid2: -1.02072899 (init = 1.5)
    amp3:  111.911023 (init = 80)
    cen3: -0.65585443 (init = 3.5)
    wid3:  2.37279022 (init = 3)

Please help me reduce the chi-square and keep the fitted parameter values near the initial guesses.

The fit is telling you what a basic plot of the data would show: there are two Gaussians in this data, one with amplitude ~190 centered at 3.5, and another with amplitude ~110 centered at -0.65. There is no third Gaussian.


You may be able to give better guesses, or use a better definition of a Gaussian (say, like the built-in version ;)) that prevents the width from going negative. You could also place bounds on the parameter values, but my guess is that the data will always show two robust peaks and never a robust third peak.

A two-Gaussian fit with your guesses for the parameter values still does not work: chi-square = 3715.94993. If you get a value for chi-square at all, then the fit "worked", at least in some sense (in the programming sense, which is the sense best suited to this forum). If you got a worse fit than you think you should have, that is a different kind of question, and not really a Stack Overflow question. You would certainly have to post a complete example with the complete output.