Machine learning: How bagging works in LightGBM


In a LightGBM model, there are two parameters related to bagging:

bagging_fraction
bagging_freq (frequency for bagging;
              0 means disable bagging; k means perform bagging at every k iteration.
              Note: to enable bagging, bagging_fraction should be set to a
              value smaller than 1.0 as well)
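For reference, here is a minimal sketch of passing both parameters to the native Python API (lgb.train); the synthetic data and the 0.8/5 values are chosen purely for illustration:

    import numpy as np
    import lightgbm as lgb

    # Synthetic binary-classification data, invented only for this sketch.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10))
    y = (X[:, 0] + 0.1 * rng.normal(size=1000) > 0).astype(int)

    params = {
        "objective": "binary",
        "bagging_fraction": 0.8,  # each bagging round uses 80% of the rows
        "bagging_freq": 5,        # re-draw that subset every 5 iterations
        "verbose": -1,
    }
    booster = lgb.train(params, lgb.Dataset(X, label=y), num_boost_round=50)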

I couldn't find a more detailed explanation of this bagging feature for GBDT. So could anyone give me a more detailed explanation?

The code does what the documentation says: it samples a subset of the training examples of size bagging_fraction * N_train_examples, and the i-th tree is trained on that subset. The sampling can be done for every tree (i.e., at every iteration) or only after every bagging_freq trees have been trained.
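To make the scheduling explicit, here is a rough schematic in plain Python/NumPy. It is not LightGBM's actual implementation; the helper name bagging_indices and its signature are made up for illustration:

    import numpy as np

    def bagging_indices(n_train, bagging_fraction, bagging_freq, num_iterations, seed=0):
        """Yield (iteration, row_indices) pairs, mimicking the bagging schedule."""
        rng = np.random.default_rng(seed)
        subset = np.arange(n_train)  # before the first re-sample: all rows
        for it in range(num_iterations):
            refresh = (bagging_freq > 0 and bagging_fraction < 1.0
                       and it % bagging_freq == 0)
            if refresh:
                size = int(bagging_fraction * n_train)
                # draw rows without replacement for the next bagging_freq trees
                subset = rng.choice(n_train, size=size, replace=False)
            yield it, subset  # tree `it` is trained on the rows in `subset`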

For example, bagging_fraction=0.5 with bagging_freq=10 means that a new sample of 0.5 * N_train_examples rows is drawn every 10 iterations.
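As a usage example with that same setting, the scikit-learn wrapper exposes these two parameters under the aliases subsample and subsample_freq (the dataset below is synthetic, purely for illustration):

    from lightgbm import LGBMClassifier
    from sklearn.datasets import make_classification

    X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)

    clf = LGBMClassifier(
        n_estimators=100,
        subsample=0.5,      # alias of bagging_fraction
        subsample_freq=10,  # alias of bagging_freq
    )
    clf.fit(X, y)  # every 10 trees, a fresh 5,000-row subset is drawn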