- First, we'll begin with the slides and cover the theory: what hyperparameter optimization is, how it is done, and the advanced techniques that can be used for it.
- With the theory covered, we'll move to the IPython notebook "Hyperparameter_optimization_good_and_bad_hps.ipynb" to see how hyperparameters affect an algorithm, and do some exercises on the same. By the end of this notebook you'll understand how important it is to select good hyperparameters.
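To give a flavour of what that notebook covers, here is a minimal sketch (the dataset and model are illustrative assumptions, not the notebook's exact code) of how one bad hyperparameter choice can wreck generalization: a decision tree with unlimited depth memorizes noisy training labels, while a depth-limited tree is forced to generalize.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Noisy synthetic data: 20% of the labels are flipped, so a model that
# memorizes the training set is guaranteed to overfit.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=5,
                           flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# "Bad" hyperparameter: unlimited depth lets the tree memorize the noise.
deep = DecisionTreeClassifier(max_depth=None, random_state=0).fit(X_tr, y_tr)
# "Good" hyperparameter: a shallow tree cannot fit the noise.
shallow = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

print(f"deep    train={deep.score(X_tr, y_tr):.2f} test={deep.score(X_te, y_te):.2f}")
print(f"shallow train={shallow.score(X_tr, y_tr):.2f} test={shallow.score(X_te, y_te):.2f}")
```

The deep tree scores near-perfectly on training data but drops sharply on held-out data; the shallow tree trades a little training accuracy for better test accuracy.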
- Next we'll look at another notebook, "hyperparameter_tuning.ipynb", which shows how hyperparameters are tuned using simple loops in Python; then we'll use the scikit-learn library in "using_scikitlearn.ipynb" to do the same, with exercises on both topics. Once you finish these two notebooks, you'll be able to code and tune hyperparameters on your own.
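The two approaches above can be sketched side by side (the model and grid here are assumed for illustration, not taken from the notebooks): a plain Python loop over candidate values, and scikit-learn's `GridSearchCV`, which performs the same cross-validated search with refitting built in.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# 1) Manual loop: evaluate each candidate k with 5-fold CV, keep the best.
best_k, best_score = None, -1.0
for k in range(1, 16):
    score = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
    if score > best_score:
        best_k, best_score = k, score

# 2) The same exhaustive search expressed with GridSearchCV.
grid = GridSearchCV(KNeighborsClassifier(),
                    param_grid={"n_neighbors": list(range(1, 16))}, cv=5)
grid.fit(X, y)

print(best_k, grid.best_params_["n_neighbors"])
```

Both searches evaluate the same candidates with the same cross-validation splits, so they agree on the best `n_neighbors`; `GridSearchCV` simply saves you the bookkeeping and refits the best model for you.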
- Finally, we'll move on to the last part and look at a more advanced way of tuning hyperparameters in the notebook "Introduction to Hyperopt.ipynb". We'll see the TPE algorithm in action and work through more examples based on the hyperopt library, comparing the simple exhaustive method of hyperparameter tuning against TPE. By the end of this notebook you'll know how to use the hyperopt library efficiently: defining its search spaces and minimizing a loss with it.
A tutorial on Hyper-Parameter Optimization
tanayag/tutorial_hyperparameter_optimization