Reproducing LightGBM's default loss function for regression as a custom objective in Python
I want to reproduce LightGBM's default regression loss as a custom loss function. This is what I tried:
lgb.train(params=params, train_set=dtrain, num_boost_round=num_round, fobj=default_mse_obj)
where default_mse_obj is defined as:
def default_mse_obj(y_true, y_pred):
    residual = y_true - y_pred.get_label()
    grad = -2.0 * residual
    hess = 2.0 + (residual * 0)
    return grad, hess
However, the evaluation metric with this custom loss function differs from the one produced by the built-in "regression" objective. I would like to know: what is the default function that LightGBM uses for the "regression" objective?

(One thing to note: with a custom fobj, LightGBM passes the predictions as the first argument and the training Dataset as the second. So in the attempt above, residual is actually predictions minus labels, and the resulting gradient has its sign flipped relative to the true MSE gradient.)

As you can see, this is the default loss function for the regression task:
import numpy as np

def default_mse_obj(y_pred, dtrain):
    y_true = dtrain.get_label()
    grad = y_pred - y_true
    hess = np.ones(len(grad))
    return grad, hess
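For intuition: the gradient/Hessian pair above corresponds to a per-sample loss of L = 0.5 * (y_pred - y_true)^2 (the 1/2 factor is why the Hessian is exactly 1). Here is a minimal, LightGBM-free sketch that verifies this with a finite-difference check; the array-based signature is an assumption for the standalone demo, since LightGBM itself passes a Dataset as the second argument:

```python
import numpy as np

def default_mse_obj(y_pred, y_true):
    # Gradient and Hessian of L(y_pred) = 0.5 * sum((y_pred - y_true)**2)
    # (standalone variant: y_true is a plain array, not an lgb.Dataset)
    grad = y_pred - y_true
    hess = np.ones(len(grad))
    return grad, hess

rng = np.random.default_rng(0)
y_true = rng.normal(size=5)
y_pred = rng.normal(size=5)

grad, hess = default_mse_obj(y_pred, y_true)

# Finite-difference check: numerical gradient of the loss w.r.t. each prediction
eps = 1e-6
loss = lambda p: 0.5 * np.sum((p - y_true) ** 2)
num_grad = np.array([
    (loss(y_pred + eps * np.eye(5)[i]) - loss(y_pred - eps * np.eye(5)[i])) / (2 * eps)
    for i in range(5)
])
print(np.allclose(grad, num_grad, atol=1e-5))  # -> True
```

If you instead define the loss as the full (y_pred - y_true)^2 without the 1/2 factor, both grad and hess are scaled by 2; since boosting uses the ratio grad/hess per leaf, the fitted trees come out the same either way.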