Add tuning by genetic algorithms #38

Open
ablaom opened this issue Jan 13, 2019 · 2 comments
Labels
design discussion Discussing design issues help wanted Extra attention is needed

Comments

@ablaom
Member

ablaom commented Jan 13, 2019

We have a Grid tuning strategy, but we should add genetic-algorithm-style tuning: a Genetic <: TuningStrategy with corresponding fit, best and predict methods for TunedModel{Genetic,<:Model}. See the related issue #37.

@ablaom ablaom added help wanted Extra attention is needed design discussion Discussing design issues labels Jan 13, 2019
@jpsamaroo
Collaborator

In order to use a genetic algorithm, one first needs to define the structure of the genetic material and how it maps to the final individual.

Going by the existing tuning implementation, we could imagine having a minimum of one "gene" per supplied tunable parameter, where each gene is represented as a bitstring together with a mapping from parameter values to bits in the bitstring. Genes would be strung together to construct the full genetic material of an individual, and would be subject to the usual rules of mutation and recombination. Parameters with more than two possible states would be expanded into longer genes, whose combined bit values encode the final value of that parameter. Of course, if the number of feasible values is not a power of two, then normalization or rejection of infeasible encodings will need to be implemented. If the set of feasible values is unbounded (such as a range of floating-point values), then we also need a strategy to deal with this.
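To make the encoding idea concrete, here is a minimal language-neutral sketch (in Python, though MLJ itself is Julia). All names here are hypothetical, not part of any MLJ API: one gene per parameter, genes concatenated into a chromosome, bit-flip mutation, and rejection of infeasible encodings when the value count is not a power of two.

```python
import random

def encode(index, n_bits):
    # Encode a parameter-value index as a fixed-width bitstring (LSB first).
    return [(index >> i) & 1 for i in range(n_bits)]

def decode(bits):
    # Decode a bitstring back to an integer index.
    return sum(b << i for i, b in enumerate(bits))

def make_individual(param_values):
    # One gene per tunable parameter; param_values is a list of lists of
    # feasible values per parameter. Genes are strung together to form
    # the individual's full genetic material.
    genes = []
    for values in param_values:
        n_bits = max(1, (len(values) - 1).bit_length())
        genes.append(encode(random.randrange(len(values)), n_bits))
    return genes

def mutate(genes, rate=0.05):
    # Flip each bit independently with probability `rate`.
    return [[b ^ (random.random() < rate) for b in g] for g in genes]

def express(genes, param_values):
    # Map a chromosome back to concrete parameter values, rejecting
    # infeasible encodings (possible when a count is not a power of two).
    out = []
    for g, values in zip(genes, param_values):
        idx = decode(g)
        if idx >= len(values):
            return None  # infeasible: caller rejects or re-mutates
        out.append(values[idx])
    return out
```

The alternative to rejection would be normalization, e.g. `idx % len(values)`, at the cost of biasing some values.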

As a comparison, the Borg MOEA algorithm (implemented in BlackBoxOptim.jl) takes the approach of first randomly sampling from (continuous) parameter values to establish an initial population, and then utilizes genetic methods to exchange genes. Importantly, the genes may only take on a value present in one of the initial individuals; arbitrary floating point values are not generated at "runtime". One important caveat is that Borg cannot naturally handle parameters with a countable number of states (such as integer ranges or countable sets). (Note that this is my understanding of the algorithm, and I could be mistaken).
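A sketch of that sampling-then-exchange behaviour as I understand it (not Borg itself, and with the same caveat as above that I could be mistaken): continuous values are sampled once when the population is initialized, and crossover only exchanges existing values between parents, never generating new floating-point values at "runtime".

```python
import random

def init_population(bounds, size):
    # Sample continuous parameter vectors once, up front.
    # bounds: list of (lo, hi) pairs, one per parameter.
    return [[random.uniform(lo, hi) for lo, hi in bounds]
            for _ in range(size)]

def uniform_crossover(parent_a, parent_b):
    # Each gene of the child is copied from one parent or the other;
    # no new parameter values are created during search.
    return [a if random.random() < 0.5 else b
            for a, b in zip(parent_a, parent_b)]
```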

I think finding a method that can handle both continuous and discrete parameter ranges would be worthwhile, if such a beast exists, since it would be straightforward to integrate into the current MLJ tuning configuration.

@jpsamaroo
Collaborator

Relevant discussion: robertfeldt/BlackBoxOptim.jl#136


3 participants
@jpsamaroo @ablaom and others