Python gradient boosting classifier training loss increases and does not converge


I'm trying to find a model for my multi-class classification problem. I have a training set of 150k records, with X_train.shape = (150000, 89) and y_train.shape = (150000,) containing 462 integer class labels. I wanted to try sklearn.ensemble.GradientBoostingClassifier to see how it performs. The problem is that the training loss is increasing instead of decreasing:

Starting Learning rate:  0.01
      Iter       Train Loss   Remaining Time 
         1      560305.4652         4495.28m
         2 49997116709991915540048202694656.0000         4821.85m
         3 83239558948150798998862338330957347606091880446602191149465600.0000         4930.27m
         4 83239558948150798998862338330957347606091880446602191149465600.0000         4930.59m
         5 83239558948150798998862338330957347606091880446602191149465600.0000         4894.59m
         6 528425156187558281292347469394171433826548228598829759650220334971581416568393759237556439905294529429284743947837505536.0000         4873.90m
         7 528425156187558281292347469394171433826548228598829759650220334971581416568393759237556439905294529429284743947837505536.0000         4867.15m
         8 528425156187558281292347469394171433826548228598829759650220334971581416568393759237556439905294529429284743947837505536.0000         4860.32m
...
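(For anyone who wants to reproduce this without the original .npy files, here is a minimal synthetic stand-in with the same shapes; the make_classification settings beyond the shapes are arbitrary assumptions.)

import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 150k rows, 89 features, 462 integer class labels.
# make_classification requires n_classes * n_clusters_per_class <= 2**n_informative,
# and 2**10 = 1024 >= 462.
X, y = make_classification(
    n_samples=150_000,
    n_features=89,
    n_informative=10,
    n_classes=462,
    n_clusters_per_class=1,
    random_state=0,
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)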
What am I doing wrong? Here is my code:

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Load the pre-split train/validation arrays and cast the labels to int.
X_train = np.load("X_train_automl.npy")
X_test = np.load("X_val_automl.npy")
y_train = np.load("Y_train_automl.npy").astype(int)
y_test = np.load("Y_val_automl.npy").astype(int)

lr_list = [0.01, 0.05, 0.1, 0.25, 0.5, 0.75, 1]
for learning_rate in lr_list:
    print("Starting Learning rate: ", learning_rate)
    # max_features="sqrt" replaces the deprecated "auto", which meant
    # sqrt(n_features) for classification anyway.
    gb_clf = GradientBoostingClassifier(
        learning_rate=learning_rate,
        max_features="sqrt",
        max_depth=5,
        n_estimators=500,
        verbose=2,
    )
    gb_clf.fit(X_train, y_train)

    print("Learning rate: ", learning_rate)
    print("Accuracy score (training): {0:.3f}".format(gb_clf.score(X_train, y_train)))
    # The original referenced undefined X_val/y_val; the validation arrays
    # loaded above are named X_test/y_test.
    print("Accuracy score (validation): {0:.3f}".format(gb_clf.score(X_test, y_test)))

I found that by lowering the learning rate from 0.01 to 0.001, the loss function started to decrease... so it seems the learning rate was simply too high.
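For completeness, a minimal sketch of that fix; the n_iter_no_change and validation_fraction early-stopping arguments are assumptions of mine, not part of the original post:

# Retrain with the lower learning rate that made the loss decrease.
gb_clf = GradientBoostingClassifier(
    learning_rate=0.001,
    max_features="sqrt",
    max_depth=5,
    n_estimators=500,
    n_iter_no_change=10,      # assumption: stop after 10 non-improving stages
    validation_fraction=0.1,  # assumption: hold out 10% of the training data
    verbose=2,
)
gb_clf.fit(X_train, y_train)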

Comments: Could you please clean up all those numbers, maybe convert them to scientific notation? And could you post the new loss values? That would help complete the answer.