- Fix float32 serialisation, #97
- Fixes to make package work with xgboost 2.0
- Various internal changes for xgboost sklearn API consistency
- Fix overflow issues for normal distribution, #64
- Removed verbosity hack in model training
- Better support for pickle/joblib, #82
- Added support for sample weights, #45
- Added example script for hyperparameter tuning
- Require Python >= 3.8; ensure compatibility with xgboost >= 1.7.0
- Added a more precise loss description (negative log-likelihood vs. error)
- Various updates to conform with xgboost==1.6.0 release
- Added type hints to XGBDistribution model class
- Hotfix to raise an error if sample weights are used (not yet implemented)
- Hotfix to enable compatibility with xgboost v1.5.0 (enable_categorical kwarg)
- Fixed the objective parameter of the trained model to reflect the chosen distribution
- Support for model saving and loading with pickle (please don't use pickle)
- Added count data example with distribution heatmap, #45
- Updated docs to include estimators parameter, #43
- Implemented cleaner model saving, tests against binary and json formats
- Performed experiments on various datasets to assess XGBDistribution performance
- Added exponential distribution
- Added Laplace distribution
- Added Poisson distribution
- Added negative-binomial distribution
- Changed naming conventions of distributions
- Safety checks on distribution parameters
- Added lognormal distribution
- Cleaned up and tested distribution code
- Silenced warnings during fit and predict steps
- Explicit link to RTD, showing available distributions
- CI tests running on Python 3.6, 3.7, and 3.8
- First release of xgboost-distribution package
- Contains the XGBDistribution estimator, an xgboost wrapper with natural gradients (see the usage sketch after this list)
- Normal distribution implemented
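
A minimal usage sketch of the estimator described in the entries above. This is a hedged illustration rather than the package's documented example: the `distribution` keyword, the parameter tuple returned by `predict`, and the sklearn-style `save_model`/`load_model` round trip are assumptions based on the xgboost sklearn API that XGBDistribution wraps, and the data is synthetic.

```python
# Hedged sketch: assumes xgboost-distribution's public API
# (XGBDistribution with a `distribution` kwarg, predict() returning
# per-sample distribution parameters, and xgboost's sklearn-style
# save_model/load_model). Data is synthetic, for illustration only.
import numpy as np
from sklearn.model_selection import train_test_split

from xgboost_distribution import XGBDistribution

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X[:, 0] + 0.5 * rng.normal(size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Other distributions listed in the entries above (Poisson, Laplace,
# exponential, lognormal, negative binomial) are selected the same way,
# via the corresponding lowercase name.
model = XGBDistribution(distribution="normal", n_estimators=100)
model.fit(X_train, y_train, eval_set=[(X_test, y_test)])

# predict() returns the fitted distribution parameters for each sample,
# here the mean (loc) and standard deviation (scale) of a normal.
preds = model.predict(X_test)
mean, std = preds.loc, preds.scale

# Round-trip the model via the JSON format mentioned in the entries above
model.save_model("xgb_distribution.json")
reloaded = XGBDistribution()
reloaded.load_model("xgb_distribution.json")
assert np.allclose(reloaded.predict(X_test).loc, mean)
```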