The bayesmark package is another wrapper library for hyperparameter tuning. We could add it to our benchmarking suite. Per its documentation:
The builtin optimizers are wrappers on the following projects:
HyperOpt
Nevergrad
OpenTuner
PySOT
Scikit-optimize
https://github.com/uber/bayesmark/
And we already benchmark against HyperOpt. Note that OpenTuner is an older package, developed at MIT in 2014.
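Since bayesmark drives each wrapped optimizer through a common suggest/observe loop, a minimal sketch of that pattern may help when evaluating whether to integrate it. The class and argument names below are illustrative, not bayesmark's exact API; the toy random-search optimizer stands in for any wrapped backend.

```python
import random

class RandomSearchOptimizer:
    """Toy optimizer following the suggest/observe pattern that
    benchmarking harnesses like bayesmark use to drive each wrapped
    optimizer (names here are illustrative assumptions)."""

    def __init__(self, api_config):
        # api_config maps each parameter name to its search range,
        # e.g. {"lr": (1e-4, 1e-1)} -- a simplified stand-in for a
        # richer search-space description.
        self.api_config = api_config

    def suggest(self, n_suggestions=1):
        # Propose candidate configurations by uniform sampling.
        return [
            {name: random.uniform(lo, hi)
             for name, (lo, hi) in self.api_config.items()}
            for _ in range(n_suggestions)
        ]

    def observe(self, X, y):
        # A model-based optimizer would update its surrogate here;
        # random search ignores the feedback.
        pass

# Drive the loop on a toy objective: minimize (x - 0.3)**2 over [0, 1].
opt = RandomSearchOptimizer({"x": (0.0, 1.0)})
best = float("inf")
for _ in range(20):
    suggestions = opt.suggest(n_suggestions=2)
    losses = [(s["x"] - 0.3) ** 2 for s in suggestions]
    opt.observe(suggestions, losses)
    best = min(best, min(losses))
```

This is the shape a benchmark harness exploits: because every backend exposes the same two methods, the harness can swap optimizers without changing the evaluation loop.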