Hyperparameter tuning using Particle Swarm Optimization and parallel computation, aiming to outperform current approaches. v0.1 beta.
Repository contents:

  • examples
  • hypertune
  • test
  • .gitignore
  • LICENSE
  • README.md
  • future_improvements.md
  • requirements.txt
  • setup.py


hypertune

A package to tune ML hyperparameters efficiently using Particle Swarm Optimization.

Please see ./examples for examples of how to use this package with your existing implementation.
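The package's exact API is not shown in this README, so the sketch below illustrates only the core technique it is built on: a basic Particle Swarm Optimization loop minimizing a toy hyperparameter objective with NumPy. All names here (`pso_minimize`, the toy loss) are illustrative and are not the package's actual API.

```python
import numpy as np

def pso_minimize(loss, bounds, n_particles=20, n_iters=50,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `loss` over a box `bounds` with basic Particle Swarm Optimization.

    `bounds` is a list of (low, high) pairs, one per hyperparameter.
    """
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T          # per-dimension lower and upper limits
    dim = len(bounds)
    # Initialize particle positions uniformly inside the box; velocities at zero.
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()                               # each particle's best position so far
    pbest_val = np.array([loss(p) for p in pos])     # ...and the loss achieved there
    g = pbest[pbest_val.argmin()].copy()             # swarm-wide best position
    g_val = pbest_val.min()
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull (own best) + social pull (swarm best).
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)             # keep particles inside the box
        vals = np.array([loss(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < g_val:                  # refresh the swarm-wide best
            g_val = pbest_val.min()
            g = pbest[pbest_val.argmin()].copy()
    return g, g_val

# Toy objective standing in for a real validation loss: the optimum sits at
# learning_rate = 0.1, regularization = 0.01.
best, best_loss = pso_minimize(
    lambda p: (p[0] - 0.1) ** 2 + (p[1] - 0.01) ** 2,
    bounds=[(0.0, 1.0), (0.0, 0.1)],
)
```

In a real run, the lambda would be replaced by a function that trains a model with the candidate hyperparameters and returns its validation loss; since each particle's evaluation is independent within an iteration, those calls are a natural target for the parallel computation the package advertises.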

Requires:

  • Python>=3 (built using v3.7.4)
  • numpy (built using v1.17.3)

Installation:

  • from PyPI via pip:
    • TBD
  • from source via pip:
    • pip install git+https://github.com/brodderickrodriguez/hypertune.git

Acknowledgements:

  • Travis E. Oliphant. A Guide to NumPy. USA: Trelgol Publishing, 2006.

Contributors:

  • Brodderick Rodriguez (web)