Random search your hyperparameters.
- feature: access Config variables with '.' notation (see the sketch below)
- local save/load
- initial pip release
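A minimal sketch of the '.' notation feature. This assumes the config returned by `hc.random_config()` supports attribute access in addition to dictionary access, which is what the feature note above describes:

```python
import hyperchamber as hc

hc.set('learning_rate', [0.1, 0.2, 0.5])
config = hc.random_config()

# Assumption: both forms read the same sampled value.
print(config['learning_rate'])
print(config.learning_rate)
```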
Set a list of options that define your hyperparameters:
```python
import hyperchamber as hc

hc.set('learning_rate', [0.1, 0.2, 0.5])
config = hc.random_config() # => { 'learning_rate' : 0.2 }
```
- logistic regression classifier on MNIST (code): based on a simple TensorFlow example. We find the best learning rate from a small set of options.
- Finding a better network architecture for MNIST (code): uses hyperparameter tuning to find the best-performing fully connected deep network configuration for MNIST. The search space here contains 720 configurations from only two hyperparameters; each new option multiplies the number of configurations, so the space grows rapidly as variables are added (see the sketch below).
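To make the combinatorics concrete, here is a sketch of how two variables can already yield 720 configurations. The hyperparameter names and value lists are illustrative, not the ones used in the example:

```python
import hyperchamber as hc

# Hypothetical search space: 36 learning rates x 20 layer sizes = 720 configs.
hc.set('learning_rate', [0.001 * i for i in range(1, 37)])  # 36 options
hc.set('hidden_size', [32 * i for i in range(1, 21)])       # 20 options

# Each additional hc.set(name, values) with k values multiplies the space by k.
print(36 * 20)  # => 720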
To install from a checkout of the repository:

```bash
python setup.py develop
```
```python
import hyperchamber as hc
```

`hc.set(name, values)`
Sets the hyperparameter `name` to `values`.
- If `values` is an array, `config[name]` will be set to one element of that array.
- If `values` is a scalar, `config[name]` will always be set to that scalar (both behaviors are sketched below).
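A small sketch of both behaviors; the hyperparameter names are illustrative:

```python
import hyperchamber as hc

hc.set('learning_rate', [0.1, 0.2, 0.5])  # array: one element is chosen per config
hc.set('batch_size', 128)                 # scalar: identical in every config

config = hc.random_config()
print(config['learning_rate'])  # one of 0.1, 0.2, 0.5
print(config['batch_size'])     # always 128
```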
`hc.configs(n)`

Returns up to n configs of the form {name: value}, with one entry for each hyperparameter.
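For example, a sketch of iterating over sampled configs (the option values are illustrative):

```python
import hyperchamber as hc

hc.set('learning_rate', [0.1, 0.2, 0.5])
hc.set('batch_size', [64, 128])

# Each config maps every hyperparameter name to one chosen value,
# e.g. {'learning_rate': 0.2, 'batch_size': 64}
for config in hc.configs(3):
    print(config)
```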
`hc.save(config, filename)`

Saves the config to a file.

`hc.load(filename)`

Loads a configuration from a file.
`hc.load_or_create_config(filename, config)`

Loads a configuration from `filename` if that file exists; otherwise saves `config` to that file. `config` is assumed to be a dictionary.
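A sketch of the local save/load roundtrip; the filename is illustrative:

```python
import hyperchamber as hc

hc.set('learning_rate', [0.1, 0.2, 0.5])
config = hc.random_config()

hc.save(config, 'config.json')     # persist the sampled config
restored = hc.load('config.json')  # read it back later

# On repeated runs: load the saved config if present, otherwise create it.
config = hc.load_or_create_config('config.json', config)
```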
`hc.record(filename, config)`

Stores the cost of a config's training results.
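The call above names only a file and a config, so exactly how the measured cost is attached is not spelled out here. The sketch below assumes the cost is stored on the config before recording; the `train` stub and the filename are illustrative:

```python
import hyperchamber as hc

hc.set('learning_rate', [0.1, 0.2, 0.5])

def train(config):
    # Stand-in for a real training loop; returns a fake cost.
    return config['learning_rate']

for config in hc.configs(3):
    config['cost'] = train(config)  # assumption: the cost travels with the config
    hc.record('results.json', config)
```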
`hc.top(sort_by)`

Returns the top results across all recorded results, ordered by the `sort_by` key function.

Example:

```python
def by_cost(x):
    config, result = x
    return result['cost']

for config, result in hc.top(by_cost):
    print(config, result)
```