
json.h:73: Invalid cast, from Object to Array #7926

Closed
luigif2000 opened this issue May 20, 2022 · 31 comments · Fixed by #7928

Comments

@luigif2000

Dear all,
with xgboost 1.5.2 there is no error, but with 1.6.0 or 1.6.1 I get the error in the subject: json.h:73: Invalid cast, from Object to Array

Does anyone have an idea how to fix this?
Thanks a lot in advance.

luigi

@trivialfis
Member

Could you please share how to reproduce it?

@luigif2000
Author

Sure, let me think about how to reproduce it:

I have a large dataframe with categorical and numerical features, on Ubuntu with Python 3.9. Everything is fine with xgboost 1.5.2, but if I upgrade to 1.6.1 I get that error. I tried to investigate a lot but found nothing; I sincerely don't know how to reproduce it in isolation. The important point is that after pip install --upgrade xgboost moves the system to 1.6.1 I get the error, and if I go back with pip install xgboost==1.5.2 the error disappears!
I hope that is enough to understand; if not, kindly let me know.

bye

@trivialfis
Member

Could you please share a reproducible example that we can run to see the error? I can't guess from the description alone. :-)

@luigif2000
Author

You're right, of course. Let me put together an example. It's not easy, but I can. Stay tuned, and thanks a lot.

luigi

@luigif2000
Author

luigif2000 commented May 21, 2022

I did! The steps follow:

- INSTALL THE PACKAGE

pip install xgboost==1.5.2
Collecting xgboost==1.5.2
Using cached xgboost-1.5.2-py3-none-manylinux2014_x86_64.whl (173.6 MB)
Requirement already satisfied: numpy in /home/luigi/miniconda3/lib/python3.9/site-packages (from xgboost==1.5.2) (1.21.6)
Requirement already satisfied: scipy in /home/luigi/miniconda3/lib/python3.9/site-packages (from xgboost==1.5.2) (1.8.0)
Installing collected packages: xgboost
Attempting uninstall: xgboost
Found existing installation: xgboost 1.6.1
Uninstalling xgboost-1.6.1:
Successfully uninstalled xgboost-1.6.1
Successfully installed xgboost-1.5.2

- RUN MY PROGRAM
./xgboost_debug_v2.py 
/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/sklearn.py:1224: UserWarning: The use of label encoder in XGBClassifier is deprecated and will be removed in a future release. To remove this warning, do the following: 1) Pass option use_label_encoder=False when constructing XGBClassifier object; and 2) Encode your labels (y) as integers starting with 0, i.e. 0, 1, 2, ..., [num_class - 1].
  warnings.warn(label_encoder_deprecation_msg, UserWarning)
/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/data.py:290: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead.  To get a de-fragmented frame, use `newframe = frame.copy()`
  transformed[data.columns[i]] = data[data.columns[i]]
[06:47:04] WARNING: /home/conda/feedstock_root/build_artifacts/xgboost-split_1645117836726/work/src/learner.cc:1115: Starting in XGBoost 1.3.0, the default evaluation metric used with the objective 'multi:softprob' was changed from 'merror' to 'mlogloss'. Explicitly set eval_metric if you'd like to restore the old behavior.
[0]	validation_0-mlogloss:1.19024
[1]	validation_0-mlogloss:1.08064
[2]	validation_0-mlogloss:1.01365
[3]	validation_0-mlogloss:0.95256
[4]	validation_0-mlogloss:0.91857
[5]	validation_0-mlogloss:0.89723
[6]	validation_0-mlogloss:0.87563
[7]	validation_0-mlogloss:0.86596
[8]	validation_0-mlogloss:0.84995
[9]	validation_0-mlogloss:0.84009
[10]	validation_0-mlogloss:0.82646
[11]	validation_0-mlogloss:0.82888
[12]	validation_0-mlogloss:0.82017
[13]	validation_0-mlogloss:0.82596
[14]	validation_0-mlogloss:0.82856
[15]	validation_0-mlogloss:0.82910
[16]	validation_0-mlogloss:0.84222
[17]	validation_0-mlogloss:0.84589
[18]	validation_0-mlogloss:0.85449
[19]	validation_0-mlogloss:0.85547
[20]	validation_0-mlogloss:0.85730
[21]	validation_0-mlogloss:0.86160
[22]	validation_0-mlogloss:0.85594
[23]	validation_0-mlogloss:0.86170
[24]	validation_0-mlogloss:0.86263
[25]	validation_0-mlogloss:0.86344
[26]	validation_0-mlogloss:0.86367
[27]	validation_0-mlogloss:0.86610
[28]	validation_0-mlogloss:0.86771
[29]	validation_0-mlogloss:0.86682
[30]	validation_0-mlogloss:0.86891
[31]	validation_0-mlogloss:0.87142
[32]	validation_0-mlogloss:0.87111
[33]	validation_0-mlogloss:0.87522
[34]	validation_0-mlogloss:0.87387
[35]	validation_0-mlogloss:0.87505
[36]	validation_0-mlogloss:0.87663
[37]	validation_0-mlogloss:0.87639
[38]	validation_0-mlogloss:0.87981
[39]	validation_0-mlogloss:0.87960
[40]	validation_0-mlogloss:0.88043
[41]	validation_0-mlogloss:0.88294
[42]	validation_0-mlogloss:0.87967
[43]	validation_0-mlogloss:0.88094
[44]	validation_0-mlogloss:0.88225
[45]	validation_0-mlogloss:0.88492
[46]	validation_0-mlogloss:0.88807
[47]	validation_0-mlogloss:0.88823
[48]	validation_0-mlogloss:0.89067
[49]	validation_0-mlogloss:0.89345
[50]	validation_0-mlogloss:0.89319
[51]	validation_0-mlogloss:0.89473
[52]	validation_0-mlogloss:0.89564
[53]	validation_0-mlogloss:0.89658
[54]	validation_0-mlogloss:0.89715
[55]	validation_0-mlogloss:0.89784
[56]	validation_0-mlogloss:0.89829
[57]	validation_0-mlogloss:0.89762
[58]	validation_0-mlogloss:0.89766
[59]	validation_0-mlogloss:0.89769
[60]	validation_0-mlogloss:0.89772
[61]	validation_0-mlogloss:0.89774
[62]	validation_0-mlogloss:0.89776
[63]	validation_0-mlogloss:0.89777
[64]	validation_0-mlogloss:0.89779
[65]	validation_0-mlogloss:0.89780
[66]	validation_0-mlogloss:0.89781
[67]	validation_0-mlogloss:0.89782
[68]	validation_0-mlogloss:0.89782
[69]	validation_0-mlogloss:0.89783
[70]	validation_0-mlogloss:0.89783
[71]	validation_0-mlogloss:0.89784
[72]	validation_0-mlogloss:0.89784
[73]	validation_0-mlogloss:0.89784
[74]	validation_0-mlogloss:0.89785
[75]	validation_0-mlogloss:0.89785
[76]	validation_0-mlogloss:0.89785
[77]	validation_0-mlogloss:0.89785
[78]	validation_0-mlogloss:0.89785
[79]	validation_0-mlogloss:0.89785
[80]	validation_0-mlogloss:0.89785
[81]	validation_0-mlogloss:0.89785
[82]	validation_0-mlogloss:0.89785
[83]	validation_0-mlogloss:0.89785
[84]	validation_0-mlogloss:0.89785
[85]	validation_0-mlogloss:0.89786
[86]	validation_0-mlogloss:0.89786
[87]	validation_0-mlogloss:0.89786
[88]	validation_0-mlogloss:0.89786
[89]	validation_0-mlogloss:0.89786
[90]	validation_0-mlogloss:0.89786
[91]	validation_0-mlogloss:0.89786
[92]	validation_0-mlogloss:0.89786
[93]	validation_0-mlogloss:0.89786
[94]	validation_0-mlogloss:0.89786
[95]	validation_0-mlogloss:0.89786
[96]	validation_0-mlogloss:0.89786
[97]	validation_0-mlogloss:0.89786
[98]	validation_0-mlogloss:0.89786
[99]	validation_0-mlogloss:0.89786

- EVERYTHING OK
- UPDATE XGBOOST

pip install --upgrade xgboost
Requirement already satisfied: xgboost in /home/luigi/miniconda3/lib/python3.9/site-packages (1.5.2)
Collecting xgboost
Using cached xgboost-1.6.1-py3-none-manylinux2014_x86_64.whl (192.9 MB)
Requirement already satisfied: numpy in /home/luigi/miniconda3/lib/python3.9/site-packages (from xgboost) (1.21.6)
Requirement already satisfied: scipy in /home/luigi/miniconda3/lib/python3.9/site-packages (from xgboost) (1.8.0)
Installing collected packages: xgboost
Attempting uninstall: xgboost
Found existing installation: xgboost 1.5.2
Uninstalling xgboost-1.5.2:
Successfully uninstalled xgboost-1.5.2
Successfully installed xgboost-1.6.1

- RUN THE PROGRAM AND THE ERROR COMES UP!

./xgboost_debug_v2.py 
/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/data.py:323: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead.  To get a de-fragmented frame, use `newframe = frame.copy()`
  transformed[data.columns[i]] = data[data.columns[i]]
Traceback (most recent call last):
  File "/home/luigi/SYNC_PCLOUD_PROGETTI/PROG_T/TRADING/SORGENTI/./xgboost_debug_v2.py", line 65, in <module>
    model_XGB.fit(X_train_0_100_debug,label_encoded_y_train_0_100_debug,eval_set=eval_set)
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/core.py", line 532, in inner_f
    return f(**kwargs)
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/sklearn.py", line 1382, in fit
    train_dmatrix, evals = _wrap_evaluation_matrices(
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/sklearn.py", line 401, in _wrap_evaluation_matrices
    train_dmatrix = create_dmatrix(
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/sklearn.py", line 1396, in <lambda>
    create_dmatrix=lambda **kwargs: DMatrix(nthread=self.n_jobs, **kwargs),
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/core.py", line 532, in inner_f
    return f(**kwargs)
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/core.py", line 654, in __init__
    self.set_info(
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/core.py", line 532, in inner_f
    return f(**kwargs)
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/core.py", line 719, in set_info
    self.set_label(label)
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/core.py", line 850, in set_label
    dispatch_meta_backend(self, label, 'label', 'float')
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/data.py", line 1043, in dispatch_meta_backend
    _meta_from_pandas_series(data, name, dtype, handle)
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/data.py", line 368, in _meta_from_pandas_series
    _meta_from_numpy(data, name, dtype, handle)
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/data.py", line 976, in _meta_from_numpy
    _check_call(_LIB.XGDMatrixSetInfoFromInterface(handle, c_str(field), interface_str))
  File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/core.py", line 203, in _check_call
    raise XGBoostError(py_str(_LIB.XGBGetLastError()))
xgboost.core.XGBoostError: [06:48:10] /home/conda/feedstock_root/build_artifacts/xgboost-split_1645117836726/work/include/xgboost/json.h:73: Invalid cast, from Object to Array
Stack trace:
  [bt] (0) /home/luigi/miniconda3/lib/libxgboost.so(+0x28f68b) [0x7ff602a7668b]
  [bt] (1) /home/luigi/miniconda3/lib/libxgboost.so(xgboost::JsonArray* xgboost::Cast<xgboost::JsonArray, xgboost::Value>(xgboost::Value*)+0x424) [0x7ff602a80264]
  [bt] (2) /home/luigi/miniconda3/lib/libxgboost.so(xgboost::MetaInfo::SetInfo(char const*, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)+0x3c) [0x7ff602c9181c]
  [bt] (3) /home/luigi/miniconda3/lib/libxgboost.so(XGDMatrixSetInfoFromInterface+0xd0) [0x7ff602992c40]
  [bt] (4) /home/luigi/miniconda3/lib/python3.9/lib-dynload/../../libffi.so.8(+0x6a4a) [0x7ff64d11fa4a]
  [bt] (5) /home/luigi/miniconda3/lib/python3.9/lib-dynload/../../libffi.so.8(+0x5fea) [0x7ff64d11efea]
  [bt] (6) /home/luigi/miniconda3/lib/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so(+0x13dc7) [0x7ff64d138dc7]
  [bt] (7) /home/luigi/miniconda3/lib/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so(+0x14454) [0x7ff64d139454]
  [bt] (8) python3(_PyObject_MakeTpCall+0x2c0) [0x55ad7b6ff260]

- Please let me know if this is enough. Kindly awaiting your reply.
  LUIGI

@luigif2000
Author

luigif2000 commented May 21, 2022

my program above (xgboost_debug_v2.py):

#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Thu Aug 12 17:25:34 2021

@author: luigi
"""

import os
#import sys
#from os import path
import pandas as pd

import numpy as np
import sys

#import shutil

import xgboost as xgb

import pyarrow


#################################################


X_train_0_100_debug = pd.read_parquet('/home/luigi/debug_xgboost/X_train_0_100.zip')
#X_train_debug = pd.read_parquet('/home/luigi/debug_xgboost/X_train.zip')

X_validation_0_100_debug = pd.read_parquet('/home/luigi/debug_xgboost/X_validation_0_100.zip')
#X_validation_debug = pd.read_parquet('/home/luigi/debug_xgboost/X_validation.zip')



#label_encoded_y_train_debug=pd.read_csv('/home/luigi/debug_xgboost/label_encoded_y_train.csv')
#label_encoded_y_train_debug=label_encoded_y_train_debug['target']

label_encoded_y_train_0_100_debug=pd.read_csv('/home/luigi/debug_xgboost/label_encoded_y_train_0_100.csv')
label_encoded_y_train_0_100_debug=label_encoded_y_train_0_100_debug['target']


#label_encoded_y_validation_debug=pd.read_csv('/home/luigi/debug_xgboost/label_encoded_y_validation.csv')
#label_encoded_y_validation_debug=label_encoded_y_validation_debug['target']

label_encoded_y_validation_0_100_debug=pd.read_csv('/home/luigi/debug_xgboost/label_encoded_y_validation_0_100.csv')
label_encoded_y_validation_0_100_debug=label_encoded_y_validation_0_100_debug['target']



from xgboost import XGBClassifier

#model_XGB = XGBClassifier(tree_method="gpu_hist",enable_categorical=True,n_estimators=100,early_stopping_rounds=100,max_depth=12)
model_XGB = XGBClassifier(tree_method="gpu_hist",enable_categorical=True)

eval_set= [(X_validation_0_100_debug, label_encoded_y_validation_0_100_debug)]

model_XGB.fit(X_train_0_100_debug,label_encoded_y_train_0_100_debug,eval_set=eval_set)

@trivialfis
Member

Couldn't reproduce it with the master branch; trying with 1.6.1 now.

@trivialfis
Member

I can't reproduce it with 1.6.1 either, but I have found the cause: you have a 1.5.1 libxgboost.so somewhere, installed by conda, and you then installed XGBoost with pip without first removing the old version completely. Please remove the libxgboost.so in your conda environment: /home/luigi/miniconda3/lib/
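The conflicting layout described above can be sketched with a small stdlib-only check. This is an illustrative sketch, not XGBoost API: the helper name and path patterns are assumptions based on where conda (directly under the prefix's lib/) and pip wheels (bundled under site-packages/xgboost/lib/) typically place the shared library.

```python
import glob
import os
import tempfile

# Hypothetical helper: list every libxgboost.so visible under a conda prefix.
# A conda package installs one under <prefix>/lib/, while a pip wheel bundles
# its own under site-packages/xgboost/lib/ -- two copies can conflict.
def find_libxgboost(prefix):
    patterns = [
        os.path.join(prefix, "lib", "libxgboost.so"),
        os.path.join(prefix, "lib", "python*",
                     "site-packages", "xgboost", "lib", "libxgboost.so"),
    ]
    hits = []
    for pattern in patterns:
        hits.extend(glob.glob(pattern))
    return sorted(hits)

# Demo against a throwaway prefix that mimics the conflicting layout
with tempfile.TemporaryDirectory() as prefix:
    for rel in ("lib", "lib/python3.9/site-packages/xgboost/lib"):
        os.makedirs(os.path.join(prefix, rel), exist_ok=True)
        open(os.path.join(prefix, rel, "libxgboost.so"), "w").close()
    hits = find_libxgboost(prefix)
    print(len(hits))  # 2 -> two copies found, i.e. the conflicting state
```

Finding more than one hit in a real environment would match the diagnosis above: the stale conda copy should be removed before reinstalling with pip.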

@luigif2000
Author

Wow, great, trivialfis! I'm going to try what you kindly suggested. Wait... I hope...

@luigif2000
Author

(base) luigi@DM2:/home/luigi$ conda remove libxgboost
Collecting package metadata (repodata.json): done
Solving environment: /

still waiting...

@luigif2000
Author

conda remove libxgboost

takes a long time; it is still solving. (I remember I tried some days ago, and after hours, nothing!)
Do you think I could force the removal?

@trivialfis
Member

Yes, just delete the file... or you can use mamba instead of conda.

@luigif2000
Author

conda remove --force libxgboost
conda remove --force _py-xgboost-mutex
conda remove --force py-xgboost
conda remove --force py-xgboost-gpu
pip uninstall xgboost
Found existing installation: xgboost 1.5.2
Uninstalling xgboost-1.5.2:
Would remove:
/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost-1.5.2.dist-info/*
/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost.libs/libgomp-a34b3233.so.1.0.0
/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so
Proceed (Y/n)?
Successfully uninstalled xgboost-1.5.2

conda list | grep xgboost -> nothing
pip list | grep xgboost -> nothing
conda update conda

Collecting package metadata (current_repodata.json): done
Solving environment: done

Package Plan

environment location: /home/luigi/miniconda3

added / updated specs:
- conda

The following packages will be downloaded:

package                    |            build
---------------------------|-----------------
importlib-metadata-4.11.4  |   py39hf3d152e_0          33 KB  conda-forge
importlib_metadata-4.11.4  |       hd8ed1ab_0           4 KB  conda-forge
llvmlite-0.38.1            |   py39h7d9a04d_0         2.3 MB  conda-forge
psutil-5.9.1               |   py39hb9d737c_0         348 KB  conda-forge
pytools-2022.1.9           |     pyhd8ed1ab_0          61 KB  conda-forge
scipy-1.8.1                |   py39he49c0e8_0        24.9 MB  conda-forge
typed-ast-1.5.4            |   py39hb9d737c_0         219 KB  conda-forge
------------------------------------------------------------
                                       Total:        27.9 MB

The following packages will be REMOVED:

argh-0.26.2-pyh9f0ad1d_1002
cudatoolkit-11.7.0-hd8887f6_10
nccl-2.12.12.1-h0800d71_0

The following packages will be UPDATED:

importlib-metadata 4.11.3-py39hf3d152e_1 --> 4.11.4-py39hf3d152e_0
importlib_metadata 4.11.3-hd8ed1ab_1 --> 4.11.4-hd8ed1ab_0
llvmlite 0.38.0-py39h7d9a04d_1 --> 0.38.1-py39h7d9a04d_0
psutil 5.9.0-py39hb9d737c_1 --> 5.9.1-py39hb9d737c_0
pytools 2022.1.7-pyh8a188c0_0 --> 2022.1.9-pyhd8ed1ab_0
scipy 1.8.0-py39hee8e79c_1 --> 1.8.1-py39he49c0e8_0
typed-ast 1.5.3-py39hb9d737c_0 --> 1.5.4-py39hb9d737c_0

Proceed ([y]/n)?

Downloading and Extracting Packages
llvmlite-0.38.1 | 2.3 MB | ########################################## | 100%
typed-ast-1.5.4 | 219 KB | ########################################## | 100%
importlib-metadata-4 | 33 KB | ########################################## | 100%
psutil-5.9.1 | 348 KB | ########################################## | 100%
importlib_metadata-4 | 4 KB | ########################################## | 100%
pytools-2022.1.9 | 61 KB | ########################################## | 100%
scipy-1.8.1 | 24.9 MB | ########################################## | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

(base) luigi@DM2:/home/luigi$ pip install xgboost
Collecting xgboost
Using cached xgboost-1.6.1-py3-none-manylinux2014_x86_64.whl (192.9 MB)
Requirement already satisfied: numpy in ./miniconda3/lib/python3.9/site-packages (from xgboost) (1.21.6)
Requirement already satisfied: scipy in ./miniconda3/lib/python3.9/site-packages (from xgboost) (1.8.1)
Installing collected packages: xgboost
Successfully installed xgboost-1.6.1

./xgboost_debug_v2.py
/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/data.py:323: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling frame.insert many times, which has poor performance. Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use newframe = frame.copy()
transformed[data.columns[i]] = data[data.columns[i]]
[0] validation_0-mlogloss:1.19024
[1] validation_0-mlogloss:1.08012
[2] validation_0-mlogloss:0.99451
[3] validation_0-mlogloss:0.93255
[4] validation_0-mlogloss:0.88973
[5] validation_0-mlogloss:0.86237
[6] validation_0-mlogloss:0.84204
[7] validation_0-mlogloss:0.84029
[8] validation_0-mlogloss:0.83645
[9] validation_0-mlogloss:0.83282
[10] validation_0-mlogloss:0.82359
[11] validation_0-mlogloss:0.82722
[12] validation_0-mlogloss:0.83352
[13] validation_0-mlogloss:0.83637
[14] validation_0-mlogloss:0.83113
[15] validation_0-mlogloss:0.83600
[16] validation_0-mlogloss:0.83641
[17] validation_0-mlogloss:0.83568
[18] validation_0-mlogloss:0.83701
[19] validation_0-mlogloss:0.83683
[20] validation_0-mlogloss:0.84495
[21] validation_0-mlogloss:0.84584
[22] validation_0-mlogloss:0.84878
[23] validation_0-mlogloss:0.84774
[24] validation_0-mlogloss:0.85839
[25] validation_0-mlogloss:0.86133
[26] validation_0-mlogloss:0.86145
[27] validation_0-mlogloss:0.86253
[28] validation_0-mlogloss:0.86274
[29] validation_0-mlogloss:0.86461
[30] validation_0-mlogloss:0.87034
[31] validation_0-mlogloss:0.87595
[32] validation_0-mlogloss:0.87646
[33] validation_0-mlogloss:0.87730
[34] validation_0-mlogloss:0.87684
[35] validation_0-mlogloss:0.88112
[36] validation_0-mlogloss:0.88233
[37] validation_0-mlogloss:0.88355
[38] validation_0-mlogloss:0.87927
[39] validation_0-mlogloss:0.87951
[40] validation_0-mlogloss:0.88164
[41] validation_0-mlogloss:0.88282
[42] validation_0-mlogloss:0.88432
[43] validation_0-mlogloss:0.88824
[44] validation_0-mlogloss:0.88852
[45] validation_0-mlogloss:0.88694
[46] validation_0-mlogloss:0.88987
[47] validation_0-mlogloss:0.89241
[48] validation_0-mlogloss:0.89404
[49] validation_0-mlogloss:0.89602
[50] validation_0-mlogloss:0.89526
[51] validation_0-mlogloss:0.89628
[52] validation_0-mlogloss:0.89597
[53] validation_0-mlogloss:0.89784
[54] validation_0-mlogloss:0.89902
[55] validation_0-mlogloss:0.89872
[56] validation_0-mlogloss:0.89900
[57] validation_0-mlogloss:0.89923
[58] validation_0-mlogloss:0.89924
[59] validation_0-mlogloss:0.89925
[60] validation_0-mlogloss:0.89925
[61] validation_0-mlogloss:0.89926
[62] validation_0-mlogloss:0.89926
[63] validation_0-mlogloss:0.89927
[64] validation_0-mlogloss:0.89927
[65] validation_0-mlogloss:0.89927
[66] validation_0-mlogloss:0.89927
[67] validation_0-mlogloss:0.89927
[68] validation_0-mlogloss:0.89928
[69] validation_0-mlogloss:0.89928
[70] validation_0-mlogloss:0.89928
[71] validation_0-mlogloss:0.89928
[72] validation_0-mlogloss:0.89928
[73] validation_0-mlogloss:0.89928
[74] validation_0-mlogloss:0.89928
[75] validation_0-mlogloss:0.89928
[76] validation_0-mlogloss:0.89928
[77] validation_0-mlogloss:0.89928
[78] validation_0-mlogloss:0.89928
[79] validation_0-mlogloss:0.89928
[80] validation_0-mlogloss:0.89928
[81] validation_0-mlogloss:0.89928
[82] validation_0-mlogloss:0.89928
[83] validation_0-mlogloss:0.89928
[84] validation_0-mlogloss:0.89928
[85] validation_0-mlogloss:0.89928
[86] validation_0-mlogloss:0.89928
[87] validation_0-mlogloss:0.89928
[88] validation_0-mlogloss:0.89928
[89] validation_0-mlogloss:0.89928
[90] validation_0-mlogloss:0.89928
[91] validation_0-mlogloss:0.89928
[92] validation_0-mlogloss:0.89928
[93] validation_0-mlogloss:0.89928
[94] validation_0-mlogloss:0.89928
[95] validation_0-mlogloss:0.89928
[96] validation_0-mlogloss:0.89928
[97] validation_0-mlogloss:0.89928
[98] validation_0-mlogloss:0.89928
[99] validation_0-mlogloss:0.89928
(base) luigi@DM2:/home/luigi/SYNC_PCLOUD_PROGETTI/PROG_T/TRADING/SORGENTI$

YES! IT WORKS!

trivialfis, you are wonderful!
Thanks a lot for the time and the patience.

best regards

luigi

@luigif2000
Author

No... a new error:
File "/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/core.py", line 203, in _check_call
raise XGBoostError(py_str(_LIB.XGBGetLastError()))
xgboost.core.XGBoostError: [07:39:14] ../src/tree/updater_gpu_hist.cu:712: Exception in gpu_hist: [07:39:14] ../src/common/categorical.h:82: Check failed: max_cat + 1 >= n_categories (1.5335 vs. 2) : Maximum cateogry should not be lesser than the total number of categories.
Stack trace:

@luigif2000
Author

let me investigate.....

@luigif2000
Author

pip install xgboost==1.5.2
Collecting xgboost==1.5.2
Using cached xgboost-1.5.2-py3-none-manylinux2014_x86_64.whl (173.6 MB)
Requirement already satisfied: scipy in ./miniconda3/lib/python3.9/site-packages (from xgboost==1.5.2) (1.8.1)
Requirement already satisfied: numpy in ./miniconda3/lib/python3.9/site-packages (from xgboost==1.5.2) (1.21.6)
Installing collected packages: xgboost
Attempting uninstall: xgboost
Found existing installation: xgboost 1.6.1
Uninstalling xgboost-1.6.1:
Successfully uninstalled xgboost-1.6.1
Successfully installed xgboost-1.5.2

Going back to 1.5.2: NO ERROR!

@luigif2000
Author

pip install --upgrade xgboost
Requirement already satisfied: xgboost in ./miniconda3/lib/python3.9/site-packages (1.5.2)
Collecting xgboost
Using cached xgboost-1.6.1-py3-none-manylinux2014_x86_64.whl (192.9 MB)
Requirement already satisfied: scipy in ./miniconda3/lib/python3.9/site-packages (from xgboost) (1.8.1)
Requirement already satisfied: numpy in ./miniconda3/lib/python3.9/site-packages (from xgboost) (1.21.6)
Installing collected packages: xgboost
Attempting uninstall: xgboost
Found existing installation: xgboost 1.5.2
Uninstalling xgboost-1.5.2:
Successfully uninstalled xgboost-1.5.2
Successfully installed xgboost-1.6.1

(base) luigi@DM2:/home/luigi$ pip list | grep xgboost
xgboost 1.6.1

(base) luigi@DM2:/home/luigi$ conda list | grep xgboost
xgboost 1.6.1 pypi_0 pypi
(base) luigi@DM2:/home/luigi$

/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/data.py:323: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling frame.insert many times, which has poor performance. Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use newframe = frame.copy()
transformed[data.columns[i]] = data[data.columns[i]]
/home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/data.py:314: PerformanceWarning: DataFrame is highly fragmented. This is usually the result of calling frame.insert many times, which has poor performance. Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use newframe = frame.copy()
transformed[data.columns[i]] = (
xgboost.core.XGBoostError: [07:50:08] ../src/tree/updater_gpu_hist.cu:712: Exception in gpu_hist: [07:50:08] ../src/common/categorical.h:82: Check failed: max_cat + 1 >= n_categories (1.5335 vs. 2) : Maximum cateogry should not be lesser than the total number of categories.
Stack trace:
[bt] (0) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x3f2e69) [0x7f67314f2e69]
[bt] (1) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x3f8724) [0x7f67314f8724]
[bt] (2) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x3a5956) [0x7f67314a5956]
[bt] (3) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x416809) [0x7f6731516809]
[bt] (4) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x416ece) [0x7f6731516ece]
[bt] (5) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x19cacb) [0x7f673129cacb]
[bt] (6) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x64ca77) [0x7f673174ca77]
[bt] (7) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x6589e6) [0x7f67317589e6]
[bt] (8) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x1d2e43) [0x7f67312d2e43]

Stack trace:
[bt] (0) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x637999) [0x7f6731737999]
[bt] (1) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x658cc5) [0x7f6731758cc5]
[bt] (2) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x1d2e43) [0x7f67312d2e43]
[bt] (3) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x1d3c80) [0x7f67312d3c80]
[bt] (4) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(+0x20fd12) [0x7f673130fd12]
[bt] (5) /home/luigi/miniconda3/lib/python3.9/site-packages/xgboost/lib/libxgboost.so(XGBoosterUpdateOneIter+0x68) [0x7f67311a9688]
[bt] (6) /home/luigi/miniconda3/lib/python3.9/lib-dynload/../../libffi.so.8(+0x6a4a) [0x7f6926122a4a]
[bt] (7) /home/luigi/miniconda3/lib/python3.9/lib-dynload/../../libffi.so.8(+0x5fea) [0x7f6926121fea]
[bt] (8) /home/luigi/miniconda3/lib/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so(+0x13dc7) [0x7f692613bdc7]

@luigif2000
Author

Ahhh, why?

@luigif2000
Author

Some dataframes, with only xgboost installed: 1.5.2 gives no error, 1.6.1 gives the error: categorical.h:82: Check failed: max_cat + 1 >= n_categories (1.5335 vs. 2) : Maximum cateogry should not be lesser than the total number of categories.

@luigif2000
Author

Same program, same dataframes: 1.5.2 NO ERROR, 1.6.1 ERROR.

@luigif2000
Author

Sorry, I know that's to be expected, but with the other dataframes (only rows 0-100) 1.6.1 was fine!?

@luigif2000
Author

I need time to rest... computer science is not deterministic, even when it is... I know.

@luigif2000
Author

:-)

@luigif2000
Author

But what does this mean? "Check failed: max_cat + 1 >= n_categories (1.5335 vs. 2) : Maximum cateogry should not be lesser than the total number of categories."
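For context, the invariant behind that check can be illustrated with plain pandas. This is a hypothetical sketch, not the actual failing data (the non-integer 1.5335 in the log suggests floating-point values reached the categorical path, but that is only a guess from the message). The check expects categorical values to be the integer codes 0..n_categories-1, so the maximum observed value plus one must be at least the declared category count:

```python
import pandas as pd

# Hypothetical illustration of the max_cat + 1 >= n_categories invariant:
s = pd.Series(["a", "b", "c", "d"], dtype="category")
subset = s.iloc[:2]                     # slicing keeps all 4 declared categories...
max_cat = subset.cat.codes.max()        # ...but the largest observed code is 1
n_categories = len(subset.cat.categories)
print(max_cat + 1 >= n_categories)      # False: the same shape of mismatch
```

Slicing a frame so that some declared categories are never observed, or feeding raw non-integer values where integer codes are expected, produces this shape of mismatch.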

@luigif2000
Author

Ahhh, it is not 15335 but 1.5335!?

@luigif2000
Author

Could this be a bug in 1.6.1?

@luigif2000
Author

n_categories = 2? I have thousands of categories...

@luigif2000
Author

Going back to 1.5.2... waiting and resting. Thanks a lot anyway, trivialfis; that's very kind of you.

luigi

@trivialfis
Member

Hi, could you please provide something that we can run to see the error? I'm confused by your comments. ;-)

@trivialfis
Member

Closing as stale.
