Does Python LDA ignore n_components?

Tags: python, scikit-learn, dimensionality-reduction

When I try to use scikit-learn's LDA, it only ever gives me one component, even though I ask for more:

>>> import numpy as np
>>> from sklearn.lda import LDA
>>> x = np.random.randn(5,5)
>>> y = [True, False, True, False, True]
>>> for i in range(1,6):
...     lda = LDA(n_components=i)
...     model = lda.fit(x,y)
...     model.transform(x)
which gives (output omitted), in every iteration, an array with a single column.

As you can see, it only ever prints one dimension. Why is that? Does it have something to do with collinear variables?

Also, when I do the same thing with scikit-learn's PCA, it gives me what I want:

>>> from sklearn.decomposition import PCA
>>> for i in range(1,6):
...     pca = PCA(n_components=i)
...     model = pca.fit(x)
...     model.transform(x)
... 
array([[ 0.83688322],
       [ 0.79565477],
       [-2.4373344 ],
       [ 0.72500848],
       [ 0.07978792]])
array([[ 0.83688322, -1.56459039],
       [ 0.79565477,  0.84710518],
       [-2.4373344 , -0.35548589],
       [ 0.72500848, -0.49079647],
       [ 0.07978792,  1.56376757]])
array([[ 0.83688322, -1.56459039, -0.3353066 ],
       [ 0.79565477,  0.84710518, -1.21454498],
       [-2.4373344 , -0.35548589, -0.16684946],
       [ 0.72500848, -0.49079647,  1.09006296],
       [ 0.07978792,  1.56376757,  0.62663807]])
array([[ 0.83688322, -1.56459039, -0.3353066 ,  0.22196922],
       [ 0.79565477,  0.84710518, -1.21454498, -0.15961993],
       [-2.4373344 , -0.35548589, -0.16684946, -0.04114339],
       [ 0.72500848, -0.49079647,  1.09006296, -0.2438673 ],
       [ 0.07978792,  1.56376757,  0.62663807,  0.2226614 ]])
array([[  8.36883220e-01,  -1.56459039e+00,  -3.35306597e-01,
          2.21969223e-01,  -1.66533454e-16],
       [  7.95654771e-01,   8.47105182e-01,  -1.21454498e+00,
         -1.59619933e-01,   3.33066907e-16],
       [ -2.43733440e+00,  -3.55485895e-01,  -1.66849458e-01,
         -4.11433949e-02,   0.00000000e+00],
       [  7.25008484e-01,  -4.90796471e-01,   1.09006296e+00,
         -2.43867297e-01,  -1.38777878e-16],
       [  7.97879229e-02,   1.56376757e+00,   6.26638070e-01,
          2.22661402e-01,   2.22044605e-16]])

The relevant dimensionality-reduction line in LDA.transform uses scalings_. As described in the documentation, scalings_ has at most n_classes - 1 columns, and that is also the maximum number of columns transform can return. In your case there are two classes (True, False), so at most one column is produced.
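A minimal sketch of that cap, assuming a recent scikit-learn where the class lives at sklearn.discriminant_analysis.LinearDiscriminantAnalysis (the old sklearn.lda.LDA path has since been removed): with a third class, transform gains a second column.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.RandomState(0)
x = rng.randn(12, 5)

y2 = [0, 1] * 6       # 2 classes -> at most 1 discriminant direction
y3 = [0, 1, 2] * 4    # 3 classes -> at most 2 discriminant directions

for y in (y2, y3):
    # n_components defaults to min(n_classes - 1, n_features)
    lda = LinearDiscriminantAnalysis()
    z = lda.fit(x, y).transform(x)
    print(len(set(y)), "classes ->", z.shape[1], "column(s)")

# expected output:
# 2 classes -> 1 column(s)
# 3 classes -> 2 column(s)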

Can you post how you are printing all of this?

It's just in the Python interpreter, so model.transform(x) prints exactly what you see.

Ah, OK, never mind. I think it's simply that with two classes, n_components gets capped at n_classes - 1, but I could be mistaken.

I don't understand. How can I get LDA to reduce my data from 5 dimensions down to 4?

You can't (at least not with plain LDA). The matrix constructed to capture between-class/within-class variance has rank at most n_classes - 1, so it can produce at most n_classes - 1 directions that capture any variance. With 2 classes, this reduces to exactly one discriminant vector.

Hmm, OK... Where can I read about this in more detail, i.e., find a clearer and more thorough explanation of what you describe?
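To make that rank argument concrete, here is a small hand-rolled sketch (my own illustration, not scikit-learn code): the between-class scatter matrix is built from class-mean offsets whose weighted sum is zero, so its rank, and hence the number of useful discriminant directions, is at most n_classes - 1.

import numpy as np

rng = np.random.RandomState(0)
x = rng.randn(100, 5)                     # 5-dimensional data
y = rng.randint(0, 2, size=100)           # 2 classes

overall_mean = x.mean(axis=0)
S_b = np.zeros((5, 5))                    # between-class scatter matrix
for c in np.unique(y):
    xc = x[y == c]
    d = (xc.mean(axis=0) - overall_mean).reshape(-1, 1)
    S_b += len(xc) * (d @ d.T)            # weighted outer product per class

# The weighted class-mean offsets sum to zero, so rank(S_b) <= n_classes - 1.
print(np.linalg.matrix_rank(S_b))         # prints 1 for 2 classes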