tanayag/tutorial_hyperparameter_optimization

Workflow

  • First we'll begin with the slides and cover the theory: what hyper-parameter optimization is, how it is done, and the advanced techniques that can be used for it.

  • With the theory covered, we'll move to the IPython notebook "Hyperparameter_optimization_good_and_bad_hps.ipynb" to see how hyperparameters affect an algorithm, and do some exercises on the same. By the end of this notebook you'll understand how important it is to select good hyperparameters.
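The idea in that notebook can be sketched in a few lines. This is a minimal illustration (not code from the notebook itself, and the choice of model, dataset, and hyperparameter values are assumptions): a decision tree on the iris dataset, where a "bad" `max_depth` cripples the model and a "good" one lets it fit properly.

```python
# Sketch of how one hyperparameter choice can make or break a model:
# compare an over-shallow decision tree with a reasonably deep one.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for max_depth in (1, 3):  # a "bad" and a "good" value for the max_depth hyperparameter
    clf = DecisionTreeClassifier(max_depth=max_depth, random_state=0).fit(X_tr, y_tr)
    print(f"max_depth={max_depth}: test accuracy = {clf.score(X_te, y_te):.3f}")
```

A depth-1 tree can only make a single split, so it cannot separate all three iris classes; the deeper tree scores noticeably higher on the held-out data.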

  • Next we'll look at another notebook, "hyperparameter_tuning.ipynb", which shows how hyperparameters are tuned using simple loops in Python. We'll then use the scikit-learn library in "using_scikitlearn.ipynb" to do the same, with exercises on both topics. Once you finish these two notebooks, you'll be able to code and tune hyperparameters on your own.
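The two approaches above can be sketched side by side. This is an illustrative example, not code taken from the notebooks; the k-nearest-neighbors model and the `n_neighbors` grid are assumptions chosen for brevity.

```python
# Tuning a hyperparameter two ways: a plain Python loop, then GridSearchCV.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# 1) Simple loop: evaluate each candidate value and keep the best cross-validated score.
best_k, best_score = None, -1.0
for k in range(1, 11):
    score = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()
    if score > best_score:
        best_k, best_score = k, score
print("loop search:", best_k, round(best_score, 3))

# 2) The same exhaustive search, delegated to scikit-learn's GridSearchCV.
grid = GridSearchCV(KNeighborsClassifier(),
                    {"n_neighbors": list(range(1, 11))}, cv=5)
grid.fit(X, y)
print("GridSearchCV:", grid.best_params_, round(grid.best_score_, 3))
```

Both searches evaluate the same grid with the same cross-validation, so they agree on the best value; `GridSearchCV` simply packages the loop, the scoring, and the refit into one object.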

  • Finally, we'll move on to the last part and see a more advanced way of tuning hyperparameters in the notebook "Introduction to Hyperopt.ipynb". We'll see the use of the TPE algorithm and work through more examples based on the hyperopt library, comparing the simple exhaustive method of hyper-parameter tuning with TPE. By the end of this notebook you'll know how to use the hyperopt library efficiently: creating its search spaces and minimizing a loss with it.
