Eigenvalues in Python: Bug?

Here are two assumptions about the eigenvectors and eigenvalues of square matrices. I believe both of them are correct:

  • If a matrix is symmetric and contains only real numbers, then it is a Hermitian matrix, so all of its eigenvalues should be real, and all components of all of its eigenvectors should be real as well. No complex numbers should show up in the result when computing the eigenvectors and eigenvalues of a Hermitian matrix.

  • The eigenvector computed for a given eigenvalue of a given matrix should always point in a direction that is determined solely by the matrix and the eigenvalue. As long as the algorithm is implemented correctly, the algorithm used to compute it should have no influence on the result.

But neither assumption seems to hold when you use the standard Python libraries to compute eigenvectors and eigenvalues. Do these methods contain bugs?
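
As a quick sanity check of the first assumption on its own, here is a minimal sketch (the matrix B below is just an arbitrary symmetric example, not the matrix from the question): for a real symmetric input, numpy.linalg.eigh returns plain float arrays, so no complex numbers can appear.

    import numpy as np

    # arbitrary real symmetric test matrix (hypothetical example)
    rng = np.random.default_rng(0)
    B = rng.standard_normal((4, 4))
    B = B + B.T

    w, V = np.linalg.eigh(B)            # eigh is meant for symmetric/Hermitian input
    print(w.dtype, V.dtype)             # float64 float64 -> everything stays real
    print(np.allclose(B @ V, V * w))    # True: B v_i = w_i v_i for every column v_i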

There are four different methods for computing the eigenvalues and eigenvectors of a Hermitian matrix:

  1. numpy.linalg.eig
  2. scipy.linalg.eig
  3. numpy.linalg.eigh
  4. scipy.linalg.eigh

#1 and #2 can be used with any square matrix (including Hermitian matrices). #3 and #4 only work for Hermitian matrices. As far as I know, their only purpose is to run faster, but the results should be the same (as long as the input really is Hermitian).

But these four methods produce three different results for the same input. This is the program I used to test all four of them:

    #!/usr/bin/env python3
    
    import numpy as np
    import scipy.linalg as la
    
    A = [
        [19, -1, -1, -1, -1, -1, -1, -1],
        [-1, 19, -1, -1, -1, -1, -1, -1],
        [-1, -1, 19, -1, -1, -1, -1, -1],
        [-1, -1, -1, 19, -1, -1, -1, -1],
        [-1, -1, -1, -1, 19, -1, -1, -1],
        [-1, -1, -1, -1, -1, 19, -1, -1],
        [-1, -1, -1, -1, -1, -1, 19, -1],
        [-1, -1, -1, -1, -1, -1, -1, 19]
    ]
    
    A = np.array(A, dtype=np.float64)
    
    # perturb two symmetric off-diagonal entries by a tiny amount;
    # the matrix stays exactly symmetric
    delta = 1e-12
    A[5,7] += delta
    A[7,5] += delta
    
    if np.array_equal(A, A.T):
        print('input is symmetric')
    else:
        print('input is NOT symmetric')
    
    methods = {
        'np.linalg.eig'  : np.linalg.eig,
        'la.eig'         : la.eig,
        'np.linalg.eigh' : np.linalg.eigh,
        'la.eigh'        : la.eigh
    }
    
    for name, method in methods.items():
    
        print('============================================================')
        print(name)
        print()
    
        eigenValues, eigenVectors = method(A)
        
        # print eigenvalue i followed by row i of the returned eigenvector array
        for i in range(len(eigenValues)):
            print('{0:6.3f}{1:+6.3f}i '.format(eigenValues[i].real, eigenValues[i].imag), end=' |  ')
            line = eigenVectors[i]
            for item in line:
                print('{0:6.3f}{1:+6.3f}i '.format(item.real, item.imag), end='')
            print()
    
        print('---------------------')
    
        # same layout, but only report whether each value has a non-zero imaginary part
        for i in range(len(eigenValues)):
            if eigenValues[i].imag == 0:
                print('real    ', end=' |  ')
            else:
                print('COMPLEX ', end=' |  ')
            line = eigenVectors[i]
            for item in line:
                if item.imag == 0:
                    print('real    ', end='')
                else:
                    print('COMPLEX ', end='')
            print()
    
        print()
    
This is the output it produces:

    input is symmetric
    ============================================================
    np.linalg.eig
    12.000+0.000i |-0.354+0.000i 0.913+0.000i 0.204+0.000i-0.013+0.016i-0.013-0.016i 0.160+0.000i-0.000+0.000i 0.130+0.000i
    20.000+0.000i |-0.354+0.000i-0.183+0.000i 0.208+0.000i 0.379-0.171i 0.379+0.171i-0.607+0.000i 0.000+0.000i-0.138+0.000i
    20.000+0.000i |-0.354+0.000i-0.182+0.000i 0.203+0.000i-0.468-0.048i-0.468+0.048i 0.153+0.000i 0.001+0.000i-0.271+0.000i
    20.000+0.000i |-0.354+0.000i-0.182+0.000i 0.203+0.000i 0.657+0.000i 0.657-0.000i 0.672+0.000i-0.001+0.000i 0.617+0.000i
    20.000-0.000i |-0.354+0.000i-0.182+0.000i 0.203+0.000i-0.276+0.101i-0.276-0.101i-0.361+0.000i 0.001+0.000i-0.644+0.000i
    20.000+0.000i |-0.354+0.000i-0.001+0.000i-0.612+0.000i-0.001+0.000i-0.001-0.000i 0.001+0.000i 0.706+0.000i-0.000+0.000i
    20.000+0.000i |-0.354+0.000i-0.182+0.000i 0.203+0.000i-0.276+0.101i-0.276-0.101i-0.018+0.000i-0.000+0.000i 0.306+0.000i
    20.000+0.000i |-0.354+0.000i-0.001+0.000i-0.612+0.000i-0.001+0.000i-0.001-0.000i 0.001+0.000i-0.708+0.000i 0.000+0.000i
    ---------------------
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    COMPLEX  |  real    real    real    real    real    real    real    real
    COMPLEX  |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    ============================================================
    la.eig
    12.000+0.000i |-0.354+0.000i 0.913+0.000i 0.204+0.000i-0.013+0.016i-0.013-0.016i 0.160+0.000i-0.000+0.000i 0.130+0.000i
    20.000+0.000i |-0.354+0.000i-0.183+0.000i 0.208+0.000i 0.379-0.171i 0.379+0.171i-0.607+0.000i 0.000+0.000i-0.138+0.000i
    20.000+0.000i |-0.354+0.000i-0.182+0.000i 0.203+0.000i-0.468-0.048i-0.468+0.048i 0.153+0.000i 0.001+0.000i-0.271+0.000i
    20.000+0.000i |-0.354+0.000i-0.182+0.000i 0.203+0.000i 0.657+0.000i 0.657-0.000i 0.672+0.000i-0.001+0.000i 0.617+0.000i
    20.000-0.000i |-0.354+0.000i-0.182+0.000i 0.203+0.000i-0.276+0.101i-0.276-0.101i-0.361+0.000i 0.001+0.000i-0.644+0.000i
    20.000+0.000i |-0.354+0.000i-0.001+0.000i-0.612+0.000i-0.001+0.000i-0.001-0.000i 0.001+0.000i 0.706+0.000i-0.000+0.000i
    20.000+0.000i |-0.354+0.000i-0.182+0.000i 0.203+0.000i-0.276+0.101i-0.276-0.101i-0.018+0.000i-0.000+0.000i 0.306+0.000i
    20.000+0.000i |-0.354+0.000i-0.001+0.000i-0.612+0.000i-0.001+0.000i-0.001-0.000i 0.001+0.000i-0.708+0.000i 0.000+0.000i
    ---------------------
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    COMPLEX  |  real    real    real    real    real    real    real    real
    COMPLEX  |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    real     |  real    real    real    COMPLEX COMPLEX real    real    real
    ============================================================
    np.linalg.eigh
    12.000+0.000i |-0.354+0.000i 0.000+0.000i 0.000+0.000i-0.086+0.000i 0.905+0.000i-0.025+0.000i 0.073+0.000i 0.205+0.000i
    20.000+0.000i |-0.354+0.000i 0.000+0.000i-0.374+0.000i 0.149+0.000i-0.236+0.000i-0.388+0.000i 0.682+0.000i 0.206+0.000i
    20.000+0.000i |-0.354+0.000i 0.001+0.000i 0.551+0.000i 0.136+0.000i-0.180+0.000i 0.616+0.000i 0.317+0.000i 0.201+0.000i
    20.000+0.000i |-0.354+0.000i 0.001+0.000i-0.149+0.000i 0.719+0.000i-0.074+0.000i-0.042+0.000i-0.534+0.000i 0.207+0.000i
    20.000+0.000i |-0.354+0.000i-0.005+0.000i 0.505+0.000i-0.386+0.000i-0.214+0.000i-0.556+0.000i-0.274+0.000i 0.203+0.000i
    20.000+0.000i |-0.354+0.000i-0.707+0.000i-0.004+0.000i 0.002+0.000i 0.001+0.000i 0.002+0.000i-0.000+0.000i-0.612+0.000i
    20.000+0.000i |-0.354+0.000i 0
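
One more difference between the two families of routines can be checked directly: the eigh routines return an orthonormal set of eigenvectors (their columns satisfy V^T V = I), while the general eig routines only normalize each eigenvector and make no promise about orthogonality. A minimal sketch, reusing the perturbed matrix from the script above:

    import numpy as np
    import scipy.linalg as la

    A = 20 * np.eye(8) - np.ones((8, 8))   # 19 on the diagonal, -1 elsewhere
    A[5, 7] += 1e-12
    A[7, 5] += 1e-12

    for name, method in [('np.linalg.eig',  np.linalg.eig),
                         ('la.eig',         la.eig),
                         ('np.linalg.eigh', np.linalg.eigh),
                         ('la.eigh',        la.eigh)]:
        w, V = method(A)
        # how far the columns of V are from being orthonormal
        dev = np.max(np.abs(np.conj(V.T) @ V - np.eye(8)))
        print('{0:15s}  max |V*V - I| = {1:.2e}'.format(name, dev))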
    
Eigenvalues of the perturbed matrix, shown with more digits:

    12.000000000000249
    20
    20.00000000000075
    19.999999999999
    20
    20
    20
    20

The corresponding eigenvectors, one eigenvector per column:
    
    0.3535533905932847   0.9128505045937204      0.20252576206455747  0.002673672081814904   -0.09302397289286794   -0.09302397289286794   -0.09302397289286794   -0.09302397289286794     
    0.3535533905932848  -0.18259457246238117     0.20444330131542393 -0.00009386949436945406 -0.20415317121194954   -0.20415317121194954   -0.20415317121194954   -0.20415317121194954     
    0.3535533905932848  -0.18259457246238117     0.20444330131542393 -0.00009386949436945406 -0.20415317121194954   -0.20415317121194954   -0.20415317121194954    0.9080920678356449     
    0.3535533905932848  -0.18259457246238117     0.20444330131542393 -0.00009386949436945406 -0.20415317121194954    0.9080920678356449    -0.20415317121194954   -0.20415317121194954     
    0.3535533905932848  -0.18259457246238117     0.20444330131542393 -0.00009386949436945406  0.9080920678356449    -0.20415317121194954   -0.20415317121194954   -0.20415317121194954     
    0.35355339059324065 -0.00011103276380548543 -0.6116010247648269   0.7060012169461334      0.0005790869815273477  0.0005790869815273477  0.0005790869815273477  0.0005790869815273477     
    0.3535533905932848  -0.18259457246238117     0.20444330131542393 -0.00009386949436945406 -0.20415317121194954   -0.20415317121194954    0.9080920678356449    -0.20415317121194954     
    0.35355339059324054  0.0002333904819895115  -0.6131412438770024  -0.7082055415560993      0.0009655029234935232  0.0009655029234935232  0.0009655029234935232  0.0009655029234935232     
    
The characteristic polynomial of the unperturbed matrix is

    (20 - λ)^7 * (12 - λ)
    
while the characteristic polynomial of the perturbed matrix is

    (20 - λ)^5 * (19999999999999/1000000000000 - λ) * (1000000000000 λ^2 - 32000000000001 λ + 240000000000014)/1000000000000
    
The exact eigenvalues of the perturbed matrix, i.e. the roots of the second polynomial, are:

    λ=12.00000000000024999999999998 (rounded)
    λ=19.999999999999 (exact value)
    λ=20 (exact value)  
    λ=20 (exact value)  
    λ=20 (exact value)  
    λ=20 (exact value)  
    λ=20 (exact value)  
    λ=20.00000000000075000000000002 (rounded)
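
These exact values can be reproduced symbolically. Here is a minimal sketch using sympy (sympy is not used anywhere else here, so this is only meant as an independent cross-check), with the 1e-12 perturbation entered as an exact rational number:

    from sympy import Matrix, Rational, symbols, factor

    n = 8
    delta = Rational(1, 10**12)      # the same perturbation, but exact
    A = Matrix(n, n, lambda i, j: 19 if i == j else -1)
    A[5, 7] += delta
    A[7, 5] += delta

    lam = symbols('lambda')
    # characteristic polynomial; it should factor into the expression given above
    print(factor(A.charpoly(lam).as_expr()))
    # exact eigenvalues with their multiplicities
    print(A.eigenvals())

The factorization shows what the perturbation does: five eigenvalues stay at exactly 20, one eigenvalue becomes exactly 20 - 10^-12, and the remaining two (the roots of the quadratic factor) are moved slightly away from exactly 12 and exactly 20.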
    
For the unperturbed matrix, the eigenvalue 12 has the eigenvector

    v = (1,1,1,1,1,1,1,1)

while the eigenvalue 20 has a seven-dimensional eigenspace, spanned for example by the vectors:

    (-1,1,0,0,0,0,0,0)
    (-1,0,1,0,0,0,0,0)
    (-1,0,0,1,0,0,0,0)
    (-1,0,0,0,1,0,0,0)
    (-1,0,0,0,0,1,0,0)
    (-1,0,0,0,0,0,1,0)
    (-1,0,0,0,0,0,0,1)
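
A small numpy sketch (with the unperturbed matrix, called A0 here) confirms these vectors and shows why a repeated eigenvalue does not pin down individual eigenvectors:

    import numpy as np

    # unperturbed matrix: 19 on the diagonal, -1 everywhere else, i.e. 20*I - J
    A0 = 20 * np.eye(8) - np.ones((8, 8))

    v12 = np.ones(8)                          # eigenvector for eigenvalue 12
    print(np.allclose(A0 @ v12, 12 * v12))    # True

    # the seven vectors (-1,1,0,...), ..., (-1,0,...,1) as columns, eigenvalue 20
    V20 = np.zeros((8, 7))
    V20[0, :] = -1
    V20[1:, :] = np.eye(7)
    print(np.allclose(A0 @ V20, 20 * V20))    # True

    # any linear combination of these columns is again an eigenvector for 20,
    # so an eigensolver is free to return any basis of this 7-dimensional space
    c = np.arange(1.0, 8.0)                   # arbitrary coefficients
    x = V20 @ c
    print(np.allclose(A0 @ x, 20 * x))        # True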
    
And here is another set of eigenvalues with corresponding normalized eigenvectors (one eigenvector per column):

       20
       12
       20
       20
       20
       20
       20
       20
    
     0.9354143466934854     0.35355339059327395     -0.021596710639534     -0.021596710639534     -0.021596710639534     -0.021596710639534     -0.021596710639533997     -0.021596710639533997
    -0.1336306209562122     0.3535533905932738     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797
    -0.1336306209562122     0.3535533905932738     0.9286585574999623     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797
    -0.1336306209562122     0.3535533905932738     -0.15117697447673797     0.9286585574999623     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797
    -0.1336306209562122     0.3535533905932738     -0.15117697447673797     -0.15117697447673797     0.9286585574999623     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797
    -0.1336306209562122     0.3535533905932738     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     0.9286585574999623     -0.15117697447673797     -0.15117697447673797
    -0.1336306209562122     0.3535533905932738     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     0.9286585574999622     -0.15117697447673797
    -0.1336306209562122     0.3535533905932738     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     -0.15117697447673797     0.9286585574999622
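
Finally, one way to convince yourself that none of the four routines is actually buggy is to test the defining property A·v = λ·v directly. One detail matters here: in both numpy and scipy the eigenvectors are the columns of the returned array (eigenVectors[:, i] belongs to eigenValues[i]), not its rows. A minimal verification sketch along those lines:

    import numpy as np
    import scipy.linalg as la

    A = 20 * np.eye(8) - np.ones((8, 8))    # the perturbed matrix from the question
    A[5, 7] += 1e-12
    A[7, 5] += 1e-12

    methods = {
        'np.linalg.eig'  : np.linalg.eig,
        'la.eig'         : la.eig,
        'np.linalg.eigh' : np.linalg.eigh,
        'la.eigh'        : la.eigh,
    }

    for name, method in methods.items():
        w, V = method(A)
        # column i of V should satisfy A @ V[:, i] == w[i] * V[:, i]
        # up to rounding error, even though the columns differ between routines
        residual = np.max(np.abs(A @ V - V * w))
        print('{0:15s}  max |A v - lambda v| = {1:.2e}'.format(name, residual))

If all four residuals come out tiny, each routine has returned a perfectly valid set of eigenpairs; the visible differences then come from the freedom in choosing a basis for the (nearly) sevenfold eigenvalue 20, not from a bug in the libraries.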