
Conversation


@renovate renovate bot commented Oct 27, 2025

This PR contains the following updates:

| Package | Change |
| --- | --- |
| opentsne | `==1.0.2` -> `==1.0.4` |
| optuna | `==4.5.0` -> `==4.6.0` |
| shap (changelog) | `==0.48.0` -> `==0.49.1` |

Release Notes

pavlin-policar/openTSNE (opentsne)

v1.0.4

Compare Source

v1.0.3

Compare Source

optuna/optuna (optuna)

v4.6.0

Compare Source

These are the release notes for v4.6.0.

Highlights
Optuna Dashboard LLM Integration

Optuna Dashboard is a web-based tool that helps you easily explore and visualize your Optuna optimization history. The latest release, v0.20.0, introduces LLM integration, enabling natural-language-based trial filtering and automatic Plotly chart generation. Please refer to the release blog for more details.

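As a minimal sketch of how a study ends up in the dashboard: persist it to a storage URL and point `optuna-dashboard` at the same URL. The objective below is illustrative, and the LLM features themselves are configured in the dashboard (see the v0.20.0 release blog), not in this script.

```python
import optuna

# Illustrative toy objective; any study written to the storage will do.
def objective(trial):
    x = trial.suggest_float("x", -5.0, 5.0)
    return x ** 2

# Persist the study so the dashboard can read it, then browse it with:
#   pip install optuna-dashboard
#   optuna-dashboard sqlite:///dashboard-demo.db
study = optuna.create_study(
    study_name="dashboard-demo",
    storage="sqlite:///dashboard-demo.db",
)
study.optimize(objective, n_trials=20)
```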
Further Speed Enhancements for GPSampler

GPSampler is now significantly faster, owing to parallelized multi-start acquisition function optimization via PyTorch batching and to optimized NumPy operations.

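The speedup requires no changes on the user side; any study that already uses `GPSampler` benefits automatically. A minimal sketch with a toy objective:

```python
import optuna

# Toy quadratic objective, purely illustrative.
def objective(trial):
    x = trial.suggest_float("x", -10.0, 10.0)
    y = trial.suggest_float("y", -10.0, 10.0)
    return (x - 2.0) ** 2 + (y + 3.0) ** 2

# The v4.6.0 batching and NumPy optimizations apply inside the sampler itself.
study = optuna.create_study(sampler=optuna.samplers.GPSampler(seed=42))
study.optimize(objective, n_trials=50)
print(study.best_params)
```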
Full Support for Multi-objective and Constrained Optimization in AutoSampler

We have fully implemented sampler selection rules for multi-objective and constrained optimization in AutoSampler. For more details, please see our blog post, "AutoSampler: Full Support for Multi-Objective & Constrained Optimization."

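A hedged sketch of what this enables, assuming AutoSampler is loaded from OptunaHub in the usual way and accepts a `constraints_func` per Optuna's standard sampler convention (the objectives and constraint below are made up for illustration):

```python
import optuna
import optunahub

# Two illustrative objectives; the constraint x + y - 5 <= 0 marks feasibility.
def objective(trial):
    x = trial.suggest_float("x", 0.0, 5.0)
    y = trial.suggest_float("y", 0.0, 5.0)
    trial.set_user_attr("constraint", x + y - 5.0)  # feasible when <= 0
    return x ** 2 + y, (x - 2.0) ** 2 + (y - 1.0) ** 2

def constraints_func(trial):
    return (trial.user_attrs["constraint"],)

module = optunahub.load_module("samplers/auto_sampler")
sampler = module.AutoSampler(constraints_func=constraints_func)
study = optuna.create_study(directions=["minimize", "minimize"], sampler=sampler)
study.optimize(objective, n_trials=30)
```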
Additions of Robust Bayesian Optimization Packages

Robust Bayesian optimization methods have been added to OptunaHub. Robust Bayesian optimization suggests parameters that remain good under input perturbations, which is especially helpful for Sim2Real transfer scenarios.

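The release notes don't name the package, so the registry path and class below are placeholders rather than real identifiers; the sketch only shows the usual OptunaHub loading pattern such a sampler would follow:

```python
import optunahub

# Placeholder path and class name: the actual robust-BO package name is not
# given in these release notes, so look it up on OptunaHub before using this.
module = optunahub.load_module("samplers/robust_bo")  # hypothetical path
sampler = module.RobustSampler()                      # hypothetical class
```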
Breaking Changes
Enhancements
  • Use iterator for lazy evaluation in journal storage’s read_logs (#6144)
  • Cache pair-wise distances to speed up GPSampler (#6244)
  • Speed up LogEI implementation (#6248)
  • Speed up EHVI by optimizing tensor operation order (#6257)
  • Use the decremental approach in the hypervolume contribution calculation (#6264)
  • Use cached trials in TPESampler's sample_relative (#6265)
  • Remove find_or_raise_by_id in _set_trial_value_without_commit (#6266)
  • Speed up GPSampler by Batching Acquisition Function Evaluations (#6268, thanks @Kaichi-Irie!)
  • Use cached study direction and trial for _CachedStorage's get_best_trial (#6270)
  • Add upsert in _set_trial_attr_without_commit for PostgreSQL (#6282, thanks @jaikumarm!)
  • Add states argument to _read_trials_from_remote_storage (#6288)
  • Use cached trials for intersection search space calculation (#6291)
  • Replace np.linalg.inv with np.linalg.cholesky to speed up GPSampler for numpy>=2.0.0 (#6296; see the sketch after this list)
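Several of these wins are classic numerical-linear-algebra moves. For instance, the idea behind #6296, sketched here with a made-up SPD matrix (not Optuna's actual code): for a symmetric positive-definite kernel matrix, factoring once with Cholesky and doing two triangular solves is cheaper and numerically safer than forming an explicit inverse.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((500, 500))
K = A @ A.T + 500 * np.eye(500)  # symmetric positive-definite, like a GP kernel matrix
b = rng.standard_normal(500)

# Explicit inverse: an extra O(n^3) step, and numerically less stable.
x_inv = np.linalg.inv(K) @ b

# Cholesky: factor once, then solve two triangular systems.
# (np.linalg.solve does not exploit triangularity; scipy.linalg.solve_triangular would.)
L = np.linalg.cholesky(K)         # K = L @ L.T
y = np.linalg.solve(L, b)         # forward solve:  L y = b
x_chol = np.linalg.solve(L.T, y)  # backward solve: L.T x = y

assert np.allclose(x_inv, x_chol)
```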
Bug Fixes
  • Skip trial validation on copy_study (#6249)
  • Fix incremental update algorithm in _CachedStorage's _read_trials_from_remote_storage (#6310)
  • Add safety guard for exhaustive search (#6321)
Documentation
  • Add AutoSampler to the sampler comparison table in the API reference (#6260, thanks @Kaichi-Irie!)
  • Update the GPSampler document to reflect support for constrained multi-objective optimization (#6262)
  • Add a link to the metric TPE paper in the TPESampler document (#6263)
  • Update announcement (#6285)
  • Update the table of Samplers in docs (#6287, thanks @fusawa-yugo!)
  • Fix the table of samplers in the docs (#6290)
Examples
Tests
Code Fixes
Continuous Integration
Other
Thanks to All the Contributors!

This release was made possible by the authors and the people who participated in the reviews and discussions.

@AddyM, @GabrielRomaoG, @Jongwan93, @Kaichi-Irie, @ParagEkbote, @Zrahay, @c-bata, @contramundum53, @dross20, @euangoodbrand, @fusawa-yugo, @gen740, @jaikumarm, @kAIto47802, @ktns, @nabenabe0928, @nihalsiddiqui7, @not522, @satyarth7srivastava, @sawa3030, @toshihikoyanase, @unKnownNG, @y0z

shap/shap (shap)

v0.49.1

Compare Source

What's Changed

Fix broken v0.49.0 release.

The previous release wasn't properly published due to HTTP errors on macOS.


Configuration

📅 Schedule: Branch creation - Between 12:00 AM and 03:59 AM, only on Monday ( * 0-3 * * 1 ) (UTC), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

👻 Immortal: This PR will be recreated if closed unmerged. Get config help if that's undesired.


  • If you want to rebase/retry this PR, check this box

This PR was generated by Mend Renovate. View the repository job log.

@renovate renovate bot force-pushed the renovate/python-machine-learning-libs branch from 335a17c to f1dd38b on October 27, 2025 12:08
@renovate renovate bot changed the title from "Update dependency shap to v0.49.1" to "Update python-machine-learning-libs" on Oct 27, 2025
@renovate renovate bot force-pushed the renovate/python-machine-learning-libs branch from f1dd38b to 678ff79 on October 27, 2025 16:40
@renovate renovate bot force-pushed the renovate/python-machine-learning-libs branch from 678ff79 to e7b0bfd on November 10, 2025 05:46
@JohT JohT force-pushed the renovate/python-machine-learning-libs branch from e7b0bfd to b9a3541 on November 10, 2025 08:27
@JohT JohT force-pushed the renovate/python-machine-learning-libs branch from b9a3541 to d580e2a on November 10, 2025 18:18
@JohT JohT merged commit a6dd06e into main on Nov 10, 2025
5 checks passed
@JohT JohT deleted the renovate/python-machine-learning-libs branch on November 10, 2025 18:39