Grammatical Evolution driven Algorithm for Efficient and Automatic Hyperparameter Optimisation of Neural Networks
This work presents a Grammatical Evolution driven approach to efficient hyperparameter optimisation, with the objective of reducing computational resources. We propose two methods: Disjoint Dataset Sampling (DDS) and Search Space Pruning (SSP). When rigorously tested against a Bayesian Optimisation benchmark, the DDS approach speeds up optimisation by 2x, while SSP reduces the search space by 50%.
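The abstract names DDS without detailing its mechanics, so the following is only a minimal sketch of what disjoint dataset sampling could look like, assuming it means scoring each generation of candidates on a different disjoint slice of the training data rather than the full set; all function names (`make_disjoint_folds`, `evaluate`, `evolve`) and the toy fitness are hypothetical illustrations, not the authors' implementation.

```python
# Sketch (assumption): each generation of an evolutionary hyperparameter
# search is scored on a different disjoint slice of the data, so no
# candidate ever pays for a full-dataset evaluation.
import random


def make_disjoint_folds(n_samples, n_folds, seed=0):
    """Partition sample indices into disjoint, equally sized folds."""
    indices = list(range(n_samples))
    random.Random(seed).shuffle(indices)
    fold_size = n_samples // n_folds
    return [indices[i * fold_size:(i + 1) * fold_size] for i in range(n_folds)]


def evaluate(candidate, fold):
    """Placeholder fitness: in practice, train a small network with
    `candidate` on the subset indexed by `fold` and return its score."""
    random.seed(hash((tuple(sorted(candidate.items())), len(fold))))
    return random.random()


def evolve(population, folds, n_generations=5):
    """Toy evolutionary loop: each generation uses its own disjoint fold."""
    for gen in range(n_generations):
        fold = folds[gen % len(folds)]          # disjoint slice for this generation
        scored = sorted(population, key=lambda c: evaluate(c, fold), reverse=True)
        survivors = scored[: len(scored) // 2]  # keep the better half
        # Refill the population by mutating survivors (here: resample the learning rate).
        population = survivors + [
            {**random.choice(survivors), "lr": random.choice([1e-3, 1e-2, 1e-1])}
            for _ in range(len(scored) - len(survivors))
        ]
    return population


if __name__ == "__main__":
    folds = make_disjoint_folds(n_samples=10_000, n_folds=5)
    pop = [{"lr": random.choice([1e-3, 1e-2, 1e-1]),
            "hidden": random.choice([32, 64, 128])} for _ in range(8)]
    print(evolve(pop, folds)[0])
```

Because every candidate is evaluated on only `1/n_folds` of the data, per-generation cost drops roughly in proportion, which is consistent with (though not a derivation of) the reported ~2x speed-up over full-data evaluation.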