Python XGBRegressor.fit() method TypeError: range() integer end argument expected, got float

I ran mod.fit(X, y) and received the error:

TypeError: range() integer end argument expected, got float

(see below). There doesn't seem to be anything wrong with the X and y inputs; the error appears to be inside the xgboost code. It fits other models fine, but I only recently installed xgboost via conda:

conda install -c conda-forge xgboost

I'm running Python 2.7.11 on MacOS 10.10.5.

The model parameters are:

{ 'base_score':         5.0,
  'booster':           'gbtree',
  'colsample_bylevel':  1,
  'colsample_bytree':   1,
  'gamma':              0,
  'learning_rate':      0.07500000000000001,
  'max_delta_step':     0,
  'max_depth':          4,
  'min_child_weight':   1,
  'missing':            None,
  'n_estimators':      75.0,
  'n_jobs':            -1,
  'nthread':            None,
  'objective':         'reg:linear',
  'random_state':       0,
  'reg_alpha':          0,
  'reg_lambda':         1,
  'scale_pos_weight':   1,
  'seed':               0,
  'silent':             True,
  'subsample':          1
   }

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-30-38afa4aff6db> in <module>()
----> 1 mod.fit(X_train.values, y_train.values)

/Users/chriseal/anaconda/lib/python2.7/site-packages/xgboost/sklearn.pyc in fit(self, X, y, eval_set, eval_metric, early_stopping_rounds, verbose)
    249                               early_stopping_rounds=early_stopping_rounds,
    250                               evals_result=evals_result, obj=obj, feval=feval,
--> 251                               verbose_eval=verbose)
    252
    253         if evals_result:

/Users/chriseal/anaconda/lib/python2.7/site-packages/xgboost/training.pyc in train(params, dtrain, num_boost_round, evals, obj, feval, maximize, early_stopping_rounds, evals_result, verbose_eval, learning_rates, xgb_model, callbacks)
    203                            evals=evals,
    204                            obj=obj, feval=feval,
--> 205                            xgb_model=xgb_model, callbacks=callbacks)
    206
    207

/Users/chriseal/anaconda/lib/python2.7/site-packages/xgboost/training.pyc in _train_internal(params, dtrain, num_boost_round, evals, obj, feval, xgb_model, callbacks)
     62         cb for cb in callbacks if not cb.__dict__.get('before_iteration', False)]
     63
---> 64     for i in range(start_iteration, num_boost_round):
     65         for cb in callbacks_before_iter:
     66             cb(CallbackEnv(model=bst,

TypeError: range() integer end argument expected, got float.
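For reference, the final frame in the traceback is an ordinary Python 2 range() call, and passing a float as its end argument reproduces the exact message above. A minimal illustration (not from the original post), assuming Python 2.7:

# Python 2.7: range() rejects a float end argument with the same message.
num_boost_round = 75.0          # e.g. an n_estimators value that ended up as a float
try:
    for i in range(0, num_boost_round):
        pass
except TypeError as e:
    print(e)                    # range() integer end argument expected, got float.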

I just figured it out. I was iterating over hyperparameters, and n_estimators was a float (75.0) instead of an integer (75). Easy fix.
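
A minimal sketch of the fix, assuming the hyperparameter grid was built with something like np.linspace (which returns floats, so 75.0 can sneak in); the grid values and dummy data below are illustrative, not from the original post. Casting n_estimators to int before constructing the regressor avoids the error:

import numpy as np
from xgboost import XGBRegressor

# Dummy data standing in for the question's X_train / y_train.
X_train = np.random.rand(100, 5)
y_train = np.random.rand(100)

# Illustrative grid; np.linspace returns floats.
for n_est in np.linspace(50, 100, 3):        # array([ 50.,  75., 100.])
    for lr in [0.05, 0.075, 0.1]:
        mod = XGBRegressor(
            n_estimators=int(n_est),         # cast to int so xgboost's internal range() gets an integer
            learning_rate=lr,
            max_depth=4,
        )
        mod.fit(X_train, y_train)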