
Python: How do I fix this warning in XGBoost?


I have an imbalanced dataset with 53,987 rows, 32 columns, and 8 classes. I am trying to perform multi-class classification. Here is my code and the corresponding output:

from sklearn.metrics import classification_report, accuracy_score
import xgboost
xgb_model = xgboost.XGBClassifier(num_class=7, learning_rate=0.1, num_iterations=1000, max_depth=10, feature_fraction=0.7, 
                              scale_pos_weight=1.5, boosting='gbdt', metric='multiclass')
hr_pred = xgb_model.fit(x_train, y_train).predict(x_test)
print(classification_report(y_test, hr_pred))


[10:03:13] WARNING: C:/Users/Administrator/workspace/xgboost-win64_release_1.3.0/src/learner.cc:541: 
Parameters: { boosting, feature_fraction, metric, num_iterations, scale_pos_weight } might not be used.

This may not be accurate due to some parameters are only used in language bindings but
passed down to XGBoost core.  Or some parameters are not used but slip through this verification. Please open an issue if you find above cases.

[10:03:13] WARNING: C:/Users/Administrator/workspace/xgboost-win64_release_1.3.0/src/learner.cc:1061: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'multi:softprob' was changed from 'merror' to 'mlogloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
          precision    recall  f1-score   support

     1.0       0.84      0.92      0.88      8783
     2.0       0.78      0.80      0.79      4588
     3.0       0.73      0.59      0.65      2109
     4.0       1.00      0.33      0.50         3
     5.0       0.42      0.06      0.11       205
     6.0       0.60      0.12      0.20       197
     7.0       0.79      0.44      0.57       143
     8.0       0.74      0.30      0.42       169

    accuracy                           0.81     16197
   macro avg       0.74      0.45      0.52     16197
weighted avg       0.80      0.81      0.80     16197


How can I fix these warnings?

If you don't want to change any behavior, just set eval_metric='mlogloss' as follows:

xgb_model = xgboost.XGBClassifier(num_class=7,
                                  learning_rate=0.1,
                                  num_iterations=1000,
                                  max_depth=10,
                                  feature_fraction=0.7,
                                  scale_pos_weight=1.5,
                                  boosting='gbdt',
                                  metric='multiclass',
                                  eval_metric='mlogloss')

From the warning log you will know which eval_metric to set to remove the warning for your algorithm. Mostly it is mlogloss or logloss.

Welcome to StackOverflow. Please first create an MWE (), and don't post code as an image ().

Welcome to Stackoverflow. Please make sure to 1) include the code and error messages as text in the question. Screenshots, and even worse links to screenshots, are not very readable, especially on mobile devices. Also 2) please state what your exact problem is; the warnings (there are two) include instructions on what to do, so it is unclear why you cannot follow them. Additionally, upgrading to the latest XGBoost release may make some of these warnings disappear automatically.

A note on security: please set up a standard user account (using the UAC prompt for tasks that need elevated privileges). Training models as administrator/root is not safe, especially on a network-connected machine.
from sklearn.metrics import f1_score
import pandas as pd

max_depth_list = [3, 5, 7, 9, 10, 15, 20, 25, 30]

f1_scores = []
for max_depth in max_depth_list:
    xgb_model = xgboost.XGBClassifier(max_depth=max_depth, seed=777)
    xgb_pred = xgb_model.fit(x_train, y_train).predict(x_test)
    f1_scores.append(f1_score(y_test, xgb_pred, average='micro'))

# Build the results table once, after the loop, so each depth is
# paired with its own score rather than one repeated scalar.
xgb_df = pd.DataFrame({'tree depth': max_depth_list,
                       'accuracy': f1_scores})
xgb_df

WARNING: C:/Users/Administrator/workspace/xgboost-win64_release_1.3.0/src/learner.cc:1061: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'multi:softprob' was changed from 'merror' to 'mlogloss'. Explicitly set eval_metric if you'd like to restore the old behavior.