Scikit-learn: how do I get the number of components needed for the explained variance in PCA?

I am trying to determine the number of components I need to use for classification. I have already read a similar question and the scikit-learn documentation:

However, this still does not solve my problem. All of my PCA explained variances are very large. Of course I could select all the components, but if I do that, PCA becomes useless.

I have also read the PCA documentation in scikit-learn, which states:

    If n_components == 'mle', Minka's MLE is used to guess the dimension. If 0 < n_components < 1, select the number of components such that the amount of variance that needs to be explained is greater than the percentage specified by n_components.

However, I could not find any more information on choosing n_components for PCA with this technique.

Here is my PCA analysis code:

from sklearn.decomposition import PCA

pca = PCA()
pca.fit(x_array_train)
print(pca.explained_variance_)
Result:

   [  6.58902714e+50   6.23266555e+49   2.93568652e+49   2.25418736e+49
       1.10063872e+49   3.25107359e+40   4.72113817e+39   1.40411862e+39
       4.03270198e+38   1.60662882e+38   3.20028861e+28   2.35570241e+27
       1.54944915e+27   8.05181151e+24   1.42231553e+24   5.05155955e+23
       2.90909468e+23   2.60339206e+23   1.95672973e+23   1.22987336e+23
       9.67133111e+22   7.07208772e+22   4.49067983e+22   3.57882593e+22
       3.03546737e+22   2.38077950e+22   2.18424235e+22   1.79048845e+22
       1.50871735e+22   1.35571453e+22   1.26605081e+22   1.04851395e+22
       8.88191944e+21   6.91581346e+21   5.43786989e+21   5.05544020e+21
       4.33110823e+21   3.18309135e+21   3.06169368e+21   2.66513522e+21
       2.57173046e+21   2.36482212e+21   2.32203521e+21   2.06033130e+21
       1.89039408e+21   1.51882514e+21   1.29284842e+21   1.26103770e+21
       1.22012185e+21   1.07857244e+21   8.55143095e+20   4.82321416e+20
       2.98301261e+20   2.31336276e+20   1.31712446e+20   1.05253795e+20
       9.84992112e+19   8.27574150e+19   4.66007620e+19   4.09687463e+19
       2.89855823e+19   2.79035170e+19   1.57015298e+19   1.39218538e+19
       1.00594159e+19   7.31960049e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.29043685e+18   5.29043685e+18   5.29043685e+18
       5.29043685e+18   5.24952686e+18   2.09685699e+18   4.16588190e+17]
I tried PCA(n_components='mle'), but I got these errors:

    Traceback (most recent call last):
  File "xx", line 166, in <module>
    pca.fit(x_array_train)
  File "xx", line 225, in fit
    self._fit(X)
  File "/Users/lib/python2.7/site-packages/sklearn/decomposition/pca.py", line 294, in _fit
    n_samples, n_features)
  File "/Users/lib/python2.7/site-packages/sklearn/decomposition/pca.py", line 98, in _infer_dimension_
    ll[rank] = _assess_dimension_(spectrum, rank, n_samples, n_features)
  File "/Users/lib/python2.7/site-packages/sklearn/decomposition/pca.py", line 83, in _assess_dimension_
    (1. / spectrum_[j] - 1. / spectrum_[i])) + log(n_samples)
ValueError: math domain error

Any help is much appreciated.

I don't use Python, but I did something like what you need in C++ with OpenCV. Hopefully you can translate it into whatever language you use.

// choose how many eigenvectors you want:
// (mEiVal holds the eigenvalues, sumOfEigens their total)
int nEigensOfInterest = 0;
float sum = 0.0f;
for (int i = 0; i < mEiVal.rows; ++i)
{
    sum += mEiVal.at<float>(i, 0);
    if (((sum * 100) / sumOfEigens) > 80)
    {
        nEigensOfInterest = i + 1;  // components 0..i are kept
        break;
    }
}
logfile << "No of Eigens of interest: " << nEigensOfInterest << std::endl << std::endl;
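For reference, the same cumulative-variance cutoff can be sketched in Python with NumPy (a minimal sketch; the eigenvalues array below is made-up stand-in data for pca.explained_variance_):

```python
import numpy as np

# Made-up stand-in for pca.explained_variance_
eigenvalues = np.array([6.0, 2.5, 1.0, 0.3, 0.2])

# Smallest number of leading components whose cumulative share
# of the total variance exceeds 80 %
cumulative_share = np.cumsum(eigenvalues) / eigenvalues.sum()
n_components = int(np.argmax(cumulative_share > 0.80)) + 1
print(n_components)  # 2
```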
There are several surveys on the topic of finding the number of relevant eigenvalues in PCA. I like the broken-stick method and parallel analysis. Google them or have a look.
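As a rough sketch of the broken-stick idea (my own illustration, not part of the answer above): component k is kept only while its share of the total variance exceeds the broken-stick expectation b_k = (1/p) * sum over i = k..p of 1/i.

```python
import numpy as np

def broken_stick_components(eigenvalues):
    """Count components whose variance share beats the broken-stick
    expectation, stopping at the first one that does not."""
    eigenvalues = np.asarray(eigenvalues, dtype=float)
    p = len(eigenvalues)
    shares = eigenvalues / eigenvalues.sum()
    # b_k = (1/p) * sum_{i=k}^{p} 1/i, computed for k = 1..p
    b = np.cumsum(1.0 / np.arange(p, 0, -1))[::-1] / p
    n = 0
    for share, expected in zip(shares, b):
        if share <= expected:
            break
        n += 1
    return n

print(broken_stick_components([4.0, 1.0, 0.5, 0.3, 0.2]))  # 1
print(broken_stick_components([10.0, 5.0, 1.0, 0.5, 0.5]))  # 2
```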

I am still learning this myself, but to me the quoted reference to 0 < n_components < 1 suggests that you can set n_components to, say, 0.85 and get exactly the number of components you need to explain 85% of the variance. You can also verify that the right number of components was selected by printing sum(pca.explained_variance_ratio_): for your data, the sum should come out just above 0.85 (or whatever value you chose).

Of course there are more sophisticated ways of choosing the number of components, but the 70%-90% rule of thumb is a reasonable start.
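A minimal sketch of the fractional n_components approach (the random data and its shape are just for illustration, standing in for x_array_train):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.RandomState(0)
x = rng.normal(size=(200, 10))  # stand-in for x_array_train
x = StandardScaler().fit_transform(x)

# Keep just enough components to explain 85 % of the variance
pca = PCA(n_components=0.85, svd_solver="full")
pca.fit(x)
print(pca.n_components_)                    # chosen automatically
print(pca.explained_variance_ratio_.sum())  # just above 0.85
```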

I assume your training array (x_array_train) is standardized like this:

from sklearn.preprocessing import StandardScaler
x_array_train = StandardScaler().fit_transform(x_array)
My solution, however, is the following:

from sklearn.decomposition import PCA

your_pca = PCA(n_components="mle", svd_solver="full")
your_pca.fit_transform(x_array_train)
print(your_pca.explained_variance_)

This way you should get as few principal components as the mle algorithm allows.

Have you tried PCA(n_components='mle')?
Hi, yes, I have tried it and updated the question about this :) PCA(n_components='mle') seems to run into log(0) errors from time to time :-(
ROC plot! Classic.
I found it: explained_variance_ratio_, they do exactly the same thing. Thanks!