News for 2.0. [skip ci] #9484

Merged
merged 7 commits into from Sep 18, 2023
Conversation

trivialfis
Member

No description provided.

@trivialfis trivialfis added this to 2.0 In Progress in 2.0 Roadmap via automation Aug 15, 2023
@trivialfis trivialfis mentioned this pull request Aug 17, 2023
5 tasks
NEWS.md Outdated
In the previous version, `base_score` was a constant that could be set as a training parameter. In the new version, XGBoost can automatically estimate this parameter based on input labels for optimal accuracy. (#8539, #8498, #8272, #8793, #8607)

### Quantile regression
The XGBoost algorithm now supports quantile regression, which involves minimizing the quantile loss (also called "pinball loss"). Furthermore, XGBoost allows for training with multiple target quantiles simultaneously. (#8775, #8761, #8760, #8758, #8750)
Member Author
Suggested change
The XGBoost algorithm now supports quantile regression, which involves minimizing the quantile loss (also called "pinball loss"). Furthermore, XGBoost allows for training with multiple target quantiles simultaneously. (#8775, #8761, #8760, #8758, #8750)
The XGBoost algorithm now supports quantile regression, which involves minimizing the quantile loss (also called "pinball loss"). Furthermore, XGBoost allows for training with multiple target quantiles simultaneously using one tree per quantile. (#8775, #8761, #8760, #8758, #8750)

Collaborator

When we fit a tree per quantile, is the resulting model considered a multi-target model?

Member Author

yes

NEWS.md Outdated

* doc
- Add introduction and notes for the sklearn interface. (#8948)
- Demo for using dask with HPO. (#8891)
Member Author

Suggested change
- Demo for using dask with HPO. (#8891)
- Demo for using dask for HPO. (#8891)

Collaborator

Suggested change
- Demo for using dask with HPO. (#8891)
- Demo for using dask for hyperparameter optimization (HPO). (#8891)

NEWS.md Outdated
* CI bot PRs
We employed GitHub's Dependabot to help us keep the dependencies up to date for the JVM packages. With the bot's help, we have cleared up all the dependencies that were lagging behind (#8501, #8507).

Here's a list of dependency update PRs including those made by dependent bots (#8456, #8560, #8571, #8561, #8562, #8600, #8594, #8524, #8509, #8548, #8549, #8533, #8521, #8534, #8532, #8516, #8503, #8531, #8530, #8518, #8512, #8515, #8517, #8506, #8504, #8502, #8629, #8815, #8813, #8814, #8877, #8876, #8875, #8874, #8873, #9049, #9070, #9073, #9039, #9083, #8917, #8952, #8980, #8973, #8962, #9252, #9208, #9131, #9136, #9219, #9160, #9158, #9163, #9184, #9192, #9265, #9268, #8882, #8837, #8662, #8661, #8390, #9056, #8508, #8925, #8920, #9149, #9230, #9097, #8648, #9203, 8593).
Member Author

Suggested change
Here's a list of dependency update PRs including those made by dependent bots (#8456, #8560, #8571, #8561, #8562, #8600, #8594, #8524, #8509, #8548, #8549, #8533, #8521, #8534, #8532, #8516, #8503, #8531, #8530, #8518, #8512, #8515, #8517, #8506, #8504, #8502, #8629, #8815, #8813, #8814, #8877, #8876, #8875, #8874, #8873, #9049, #9070, #9073, #9039, #9083, #8917, #8952, #8980, #8973, #8962, #9252, #9208, #9131, #9136, #9219, #9160, #9158, #9163, #9184, #9192, #9265, #9268, #8882, #8837, #8662, #8661, #8390, #9056, #8508, #8925, #8920, #9149, #9230, #9097, #8648, #9203, 8593).
Here's a list of dependency update PRs including those made by dependent bots (#8456, #8560, #8571, #8561, #8562, #8600, #8594, #8524, #8509, #8548, #8549, #8533, #8521, #8534, #8532, #8516, #8503, #8531, #8530, #8518, #8512, #8515, #8517, #8506, #8504, #8502, #8629, #8815, #8813, #8814, #8877, #8876, #8875, #8874, #8873, #9049, #9070, #9073, #9039, #9083, #8917, #8952, #8980, #8973, #8962, #9252, #9208, #9131, #9136, #9219, #9160, #9158, #9163, #9184, #9192, #9265, #9268, #8882, #8837, #8662, #8661, #8390, #9056, #8508, #8925, #8920, #9149, #9230, #9097, #8648, #9203, #8593).

NEWS.md Outdated
Comment on lines 10 to 11
### Initial work on multi-target tree
We have been working on vector-leaf tree models for multi-target regression, multi-label classification, and multi-class classification in version 2.0. Previously, XGBoost would build a separate model for each target. However, with this new feature that's still being developed, XGBoost can create one tree for all targets. This method has various benefits and trade-offs compared to the current approach. It can help prevent overfitting, result in smaller models, and build trees that consider the correlation between targets. Additionally, users can combine vector leaf and scalar leaf trees during a training session using a callback. Please note that this feature is still a work in progress, and many parts are not yet available. See #9043 for the current status. Related PRs: (#8538, #8697, #8902, #8884, #8895, #8898, #8612, #8652, #8698, #8908, #8928, #8968, #8616, #8922, #8890, #8872, #8889) Please note that only the `hist` (default) tree method on CPU can be used for building vector leaf trees at the moment.
Collaborator

Suggested change
### Initial work on multi-target tree
We have been working on vector-leaf tree models for multi-target regression, multi-label classification, and multi-class classification in version 2.0. Previously, XGBoost would build a separate model for each target. However, with this new feature that's still being developed, XGBoost can create one tree for all targets. This method has various benefits and trade-offs compared to the current approach. It can help prevent overfitting, result in smaller models, and build trees that consider the correlation between targets. Additionally, users can combine vector leaf and scalar leaf trees during a training session using a callback. Please note that this feature is still a work in progress, and many parts are not yet available. See #9043 for the current status. Related PRs: (#8538, #8697, #8902, #8884, #8895, #8898, #8612, #8652, #8698, #8908, #8928, #8968, #8616, #8922, #8890, #8872, #8889) Please note that only the `hist` (default) tree method on CPU can be used for building vector leaf trees at the moment.
### Initial work on multi-target trees with vector-leaf outputs
We have been working on vector-leaf tree models for multi-target regression, multi-label classification, and multi-class classification in version 2.0. Previously, XGBoost would build a separate model for each target. In 2.0, users can optionally fit trees with vector-leaf outputs, so that one tree yields prediction for all targets. The vector-leaf tree method has various benefits and trade-offs compared to the current approach. It can help prevent overfitting, result in smaller models, and build trees that consider correlations between targets. Additionally, users can combine vector-leaf and scalar-leaf trees during a training session using a callback. Please note that this feature is still a work in progress, and many parts are not yet available. See #9043 for the current status. Related PRs: (#8538, #8697, #8902, #8884, #8895, #8898, #8612, #8652, #8698, #8908, #8928, #8968, #8616, #8922, #8890, #8872, #8889) Please note that only the `hist` (default) tree method on CPU can be used for building vector-leaf trees at the moment.

NEWS.md Outdated

* doc
- Add introduction and notes for the sklearn interface. (#8948)
- Demo for using dask with HPO. (#8891)
Collaborator

Suggested change
- Demo for using dask with HPO. (#8891)
- Demo for using dask for hyperparameter optimization (HPO). (#8891)

@trivialfis
Member Author

cc @hcho3 .

@trivialfis trivialfis changed the title [WIP] news for 2.0. [skip ci] News for 2.0. [skip ci] Sep 18, 2023
@trivialfis trivialfis merged commit 259d80c into dmlc:master Sep 18, 2023
22 checks passed
2.0 Roadmap automation moved this from 2.0 In Progress to 2.0 Done Sep 18, 2023
@trivialfis trivialfis deleted the 2.0-news branch September 18, 2023 21:46