
host_device_vector.cu:100: Check failed: Size() == other->Size() (0 vs. 115) #6688

Closed
allen20200111 opened this issue Feb 7, 2021 · 5 comments · Fixed by #6689
Comments

Thanks for participating in the XGBoost community! We use https://discuss.xgboost.ai for any general usage questions and discussions. The issue tracker is used for actionable items such as feature proposals discussion, roadmaps, and bug tracking. You are always welcomed to post on the forum first :)

Issues that are inactive for a period of time may get closed. We adopt this policy so that we won't lose track of actionable issues that may fall at the bottom of the pile. Feel free to reopen a new one if you feel there is an additional problem that needs attention when an old one gets closed.

For bug reports, to help the developer act on the issues, please include a description of your environment, preferably a minimum script to reproduce the problem.

For feature proposals, list clear, small actionable items so we can track the progress of the change.

allen20200111 (Author) commented Feb 7, 2021:

---------------------------------------------------------------------------
XGBoostError                              Traceback (most recent call last)
<ipython-input-18-a0b9f07acd28> in <module>
    604 
    605     # grid_search_xgb()
--> 606     train_xgb(dataset12, dataset3)
    607 
    608     #gbm,dataset12_xx, dataset3_xx = train_lgb(dataset12, dataset3)

<ipython-input-18-a0b9f07acd28> in train_xgb(dataset12, dataset3)
    530     cvresult = xgb.cv(params, train_dmatrix, num_boost_round=10000, nfold=3, metrics='auc', seed=0, callbacks=[
    531         xgb.callback.print_evaluation(show_stdv=False),
--> 532         xgb.callback.early_stop(50)
    533     ])
    534 

/usr/local/lib/python3.6/dist-packages/xgboost/training.py in cv(params, dtrain, num_boost_round, nfold, stratified, folds, metrics, obj, feval, maximize, early_stopping_rounds, fpreproc, as_pandas, verbose_eval, show_stdv, seed, callbacks, shuffle)
    470     results = {}
    471     cvfolds = mknfold(dtrain, nfold, params, seed, metrics, fpreproc,
--> 472                       stratified, folds, shuffle)
    473 
    474     # setup callbacks

/usr/local/lib/python3.6/dist-packages/xgboost/training.py in mknfold(dall, nfold, param, seed, evals, fpreproc, stratified, folds, shuffle)
    361     for k in range(nfold):
    362         # perform the slicing using the indexes determined by the above methods
--> 363         dtrain = dall.slice(in_idset[k])
    364         dtest = dall.slice(out_idset[k])
    365         # run preprocessing on the data set if needed

/usr/local/lib/python3.6/dist-packages/xgboost/core.py in slice(self, rindex, allow_groups)
    821                 c_bst_ulong(len(rindex)),
    822                 ctypes.byref(res.handle),
--> 823                 ctypes.c_int(1 if allow_groups else 0),
    824             )
    825         )

/usr/local/lib/python3.6/dist-packages/xgboost/core.py in _check_call(ret)
    188     """
    189     if ret != 0:
--> 190         raise XGBoostError(py_str(_LIB.XGBGetLastError()))
    191 
    192 

XGBoostError: [09:40:10] /root/xgboost/src/common/host_device_vector.cu:100: Check failed: Size() == other->Size() (0 vs. 115) : 
Stack trace:
  [bt] (0) /usr/local/lib/python3.6/dist-packages/xgboost/lib/libxgboost.so(+0x391da2) [0x7fd4a6acbda2]
  [bt] (1) /usr/local/lib/python3.6/dist-packages/xgboost/lib/libxgboost.so(xgboost::HostDeviceVector<xgboost::FeatureType>::Copy(xgboost::HostDeviceVector<xgboost::FeatureType> const&)+0x203) [0x7fd4a6af1fe3]
  [bt] (2) /usr/local/lib/python3.6/dist-packages/xgboost/lib/libxgboost.so(xgboost::MetaInfo::Slice(xgboost::common::Span<int const, 18446744073709551615ul>) const+0x841) [0x7fd4a690d131]
  [bt] (3) /usr/local/lib/python3.6/dist-packages/xgboost/lib/libxgboost.so(xgboost::data::SimpleDMatrix::Slice(xgboost::common::Span<int const, 18446744073709551615ul>)+0xa5d) [0x7fd4a6944a7d]
  [bt] (4) /usr/local/lib/python3.6/dist-packages/xgboost/lib/libxgboost.so(XGDMatrixSliceDMatrixEx+0x81) [0x7fd4a68adf11]
  [bt] (5) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call_unix64+0x4c) [0x7fd56a425dae]
  [bt] (6) /usr/lib/x86_64-linux-gnu/libffi.so.6(ffi_call+0x22f) [0x7fd56a42571f]
  [bt] (7) /usr/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so(_ctypes_callproc+0x2b4) [0x7fd56a6395c4]
  [bt] (8) /usr/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so(+0x11c33) [0x7fd56a639c33]

trivialfis (Member) commented:

Could you please share the used parameters and type of input data?

allen20200111 (Author) commented:

OS: Ubuntu 18.04
Code:

#train_dmatrix = xgb.DMatrix(X_train, label=y_train)
train_dmatrix = xgb.DMatrix(dataset12_x, label=dataset12.label)
predict_dmatrix = xgb.DMatrix(dataset3_x)

# XGBoost model training
params = {
    'booster': 'gbtree',
    'objective': 'binary:logistic',
    'eval_metric': 'auc',
    'gamma': 0.1,
    'min_child_weight': 1.1,
    'max_depth': 5,
    'lambda': 10,
    'subsample': 0.7,
    'colsample_bytree': 0.7,
    'colsample_bylevel': 0.7,
    'eta': 0.01,
    #'tree_method': 'hist',
    'tree_method': 'gpu_hist',
    #'n_gpus': '-1',
    'seed': 0,
    'nthread': cpu_jobs,
    'predictor': 'gpu_predictor'
}

# Use xgb.cv to tune num_boost_round
cvresult = xgb.cv(params, train_dmatrix, num_boost_round=10000, nfold=3, metrics='auc', seed=0, callbacks=[
    xgb.callback.print_evaluation(show_stdv=False),
    xgb.callback.early_stop(50)
])
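The crash originates in the fold-slicing step of xgb.cv (the `dall.slice(in_idset[k])` frame in the traceback above). As a rough orientation, here is a pure-Python sketch of how the per-fold in/out index sets are built before each slice; this is an illustrative simplification, not XGBoost's actual `mknfold` implementation, and all names are made up:

```python
import random

def make_nfold_indices(num_rows, nfold, seed=0, shuffle=True):
    """Split row indices into nfold train/test index sets,
    loosely mimicking what xgb.cv does before calling
    DMatrix.slice on each fold."""
    idx = list(range(num_rows))
    if shuffle:
        random.Random(seed).shuffle(idx)
    # out_idset[k]: rows held out for fold k; in_idset[k]: the rest
    out_idset = [idx[k::nfold] for k in range(nfold)]
    in_idset = [sorted(set(idx) - set(out)) for out in out_idset]
    return in_idset, out_idset

in_idset, out_idset = make_nfold_indices(10, nfold=3)
# Every row appears in exactly one held-out fold.
assert sorted(i for out in out_idset for i in out) == list(range(10))
```

The index sets themselves are fine here; the error message (`Size() == other->Size() (0 vs. 115)`) suggests the failure happens later, while `MetaInfo::Slice` copies per-column metadata (a `HostDeviceVector<FeatureType>` of size 0 vs. 115 columns), not while computing the indices.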


allen20200111 (Author) commented:

The error happens when using xgb.cv; training the same data with xgb.train works fine.
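Since only the cv path is affected, one possible workaround until the fix in #6689 lands is to run cross-validation manually, slicing the raw arrays per fold and building each fold's DMatrix fresh, which avoids the `DMatrix.slice` path that trips the size check. A hedged sketch; the `train_fold` callable below is a placeholder (a dummy here; in real use it would build `xgb.DMatrix` objects from the slices and call `xgb.train` with early stopping):

```python
import random

def manual_cv(X, y, nfold, train_fold, seed=0):
    """Cross-validate by slicing the raw data per fold instead of
    slicing a single DMatrix. train_fold(X_tr, y_tr, X_te, y_te)
    returns a score for the fold."""
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    scores = []
    for k in range(nfold):
        test_idx = set(idx[k::nfold])
        tr = [i for i in idx if i not in test_idx]
        te = [i for i in idx if i in test_idx]
        scores.append(train_fold([X[i] for i in tr], [y[i] for i in tr],
                                 [X[i] for i in te], [y[i] for i in te]))
    return scores

# Dummy trainer just reports the held-out fold size.
scores = manual_cv([[v] for v in range(9)], [v % 2 for v in range(9)],
                   nfold=3, train_fold=lambda Xtr, ytr, Xte, yte: len(Xte))
```

With 9 rows and 3 folds, each fold holds out 3 rows, so `scores` is `[3, 3, 3]`.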

allen20200111 (Author) commented:

dataset12_x is the input data; its type is shown below:
<class 'pandas.core.frame.DataFrame'>
Int64Index: 383386 entries, 0 to 252585
Columns: 115 entries, Distance to on_u13
dtypes: float64(94), int64(21)
memory usage: 339.3 MB
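For anyone trying to reproduce without the original data, a much smaller synthetic frame with the same column mix (94 float64 plus 21 int64 columns, per the `info()` dump above) can stand in for dataset12_x. Column names and row count here are made up:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_rows, n_float, n_int = 1000, 94, 21  # same dtype mix, far fewer rows

# 94 float64 feature columns followed by 21 int64 columns
df = pd.DataFrame(
    {**{f"f{i}": rng.normal(size=n_rows) for i in range(n_float)},
     **{f"i{i}": rng.integers(0, 10, size=n_rows) for i in range(n_int)}}
)
label = rng.integers(0, 2, size=n_rows)

assert df.shape[1] == 115
# train_dmatrix = xgb.DMatrix(df, label=label)  # as in the report
```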
