Hyperparameters-tuning

Series of notes/code-snippets about various hyperparameter optimisation techniques.

Allocation-based optimisations

SuccessiveHalving

Code is based on the paper by K. Jamieson and A. Talwalkar, Non-stochastic Best Arm Identification and Hyperparameter Optimization. Example code is provided for:

  • sklearn: RandomForest, GradientBoosting, etc.
  • xgboost

More information can be found in this blog post
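For illustration, here is a minimal, self-contained sketch of successive halving with a sklearn RandomForest. It is not the code in this repository: the dataset, the search space, the use of n_estimators as the budget and eta = 3 are all assumptions made for the example.

```python
# Minimal successive-halving sketch (illustrative, not the repo's implementation).
# Budget = number of trees of a RandomForest; dataset and search space are assumptions.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)
X, y = load_digits(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

def sample_config():
    """Draw one random configuration from a small illustrative search space."""
    return {
        "max_depth": rng.randint(2, 16),
        "max_features": rng.uniform(0.1, 1.0),
        "min_samples_split": rng.randint(2, 11),
    }

def evaluate(config, budget):
    """Fit a forest with `budget` trees and return validation accuracy."""
    model = RandomForestClassifier(n_estimators=int(budget), random_state=0, **config)
    model.fit(X_train, y_train)
    return model.score(X_val, y_val)

def successive_halving(n=27, min_budget=4, eta=3):
    configs = [sample_config() for _ in range(n)]
    budget = min_budget
    while len(configs) > 1:
        scores = [evaluate(c, budget) for c in configs]
        # Keep the best 1/eta fraction and give the survivors eta times more budget.
        keep = max(1, len(configs) // eta)
        order = np.argsort(scores)[::-1][:keep]
        configs = [configs[i] for i in order]
        budget *= eta
    return configs[0]

print("best configuration:", successive_halving())
```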

Hyperband

Code is based on the paper by L. Li, K. Jamieson, G. DeSalvo, A. Rostamizadeh and A. Talwalkar, Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization.
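For illustration, a minimal sketch of the Hyperband outer loop as described in the paper, reusing sample_config and evaluate from the successive-halving sketch above; R = 81 and eta = 3 are example choices, not the repository's defaults.

```python
# Minimal Hyperband sketch (illustrative): several successive-halving brackets,
# each trading off the number of configurations against the starting budget.
# Reuses sample_config() and evaluate() from the successive-halving example.
import math
import numpy as np

def hyperband(R=81, eta=3):
    s_max = math.floor(math.log(R, eta) + 1e-9)  # largest s with eta**s <= R
    B = (s_max + 1) * R                          # total budget per bracket
    best_config, best_score = None, -np.inf
    for s in range(s_max, -1, -1):
        # Number of configurations and initial budget for bracket s.
        n = int(math.ceil(B / R * eta ** s / (s + 1)))
        r = R * eta ** (-s)
        configs = [sample_config() for _ in range(n)]
        # Successive halving inside the bracket.
        for i in range(s + 1):
            n_i = int(n * eta ** (-i))
            r_i = r * eta ** i
            scores = [evaluate(c, r_i) for c in configs]
            order = np.argsort(scores)[::-1]
            if scores[order[0]] > best_score:
                best_score, best_config = scores[order[0]], configs[order[0]]
            configs = [configs[j] for j in order[: max(1, n_i // eta)]]
    return best_config, best_score

print(hyperband())
```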

TODO:

  • add setup.py
  • add an example for extremely randomized trees (ExtraTrees)
  • add proper hyperparameter spaces to the examples
  • possibly: a comparison on a benchmark dataset

References:

  • Jamieson, K. and Talwalkar, A. Non-stochastic Best Arm Identification and Hyperparameter Optimization. AISTATS 2016.
  • Li, L., Jamieson, K., DeSalvo, G., Rostamizadeh, A. and Talwalkar, A. Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization. JMLR 2018.

Random Search

A Spark implementation of random search, sampling hyperparameters from breeze.stats.distributions. More information can be found in this blog post
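To keep all examples in one language, here is a plain Python analogue of that idea rather than the Scala/Spark code itself: each hyperparameter is drawn from a scipy.stats distribution (playing the role of breeze.stats.distributions); the model, dataset and distributions are assumptions made for the example.

```python
# Minimal random-search sketch (illustrative): hyperparameters are sampled from
# probability distributions, and the best-scoring configuration is kept.
import numpy as np
from scipy import stats
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Each hyperparameter is described by the distribution it is sampled from.
space = {
    "learning_rate": stats.loguniform(1e-3, 1e0),
    "max_depth": stats.randint(2, 8),
    "subsample": stats.uniform(0.5, 0.5),  # uniform on [0.5, 1.0]
}

def random_search(n_iter=20, seed=0):
    rng = np.random.RandomState(seed)
    best_config, best_score = None, -float("inf")
    for _ in range(n_iter):
        # Draw one configuration and score it with 3-fold cross-validation.
        config = {name: dist.rvs(random_state=rng) for name, dist in space.items()}
        score = cross_val_score(GradientBoostingClassifier(**config), X, y, cv=3).mean()
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

print(random_search())
```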
