Enh/differential cgp #57

Closed · wants to merge 3 commits

Conversation

@jakobj (Member) commented Mar 30, 2020

This PR adds a new node, CGPParameter, that, in contrast to CGPConstantFloat, can store arbitrary values. This is achieved by keeping (and passing on) a dictionary in each individual that stores, for each position in the graph, a corresponding CGPParameter value; this value can be changed, e.g., by copying values from an optimized torch class (a minimal sketch follows below).
The implementation is fairly clean, but it is untested with regard to performance, and it is questionable whether it supports efficient evolution. We should hence discuss it in detail, @mschmidt87.

In particular: does it make sense to store the values in an individual, where values are only passed on from one individual to the next? This means that (a) good values can get lost if the individual doesn't manage to survive, and (b) bad values can lead evolution to disregard parameter nodes if, as soon as they are switched on, they produce terrible fitness values. Maybe it would make more sense to store the values at the population level?
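To make this concrete, here is a minimal sketch of the idea. All names (`CGPParameter`, `parameter_values`, `update_parameters_from_torch`, the "p<idx>" parameter-naming scheme) are hypothetical stand-ins, not necessarily what this PR implements:

```python
class CGPParameter:
    """A node whose output is a tunable parameter rather than a fixed constant."""

    def __init__(self, idx):
        self.idx = idx  # position of this node in the graph

    def output(self, individual):
        # Look up this node's current value in the individual's dictionary,
        # falling back to a default initial value if none is stored yet.
        return individual.parameter_values.get(self.idx, 1.0)


class Individual:
    def __init__(self, genome):
        self.genome = genome
        # Maps graph position -> current parameter value; copied to offspring
        # so that tuned values are passed on from one individual to the next.
        self.parameter_values = {}

    def clone(self):
        offspring = Individual(self.genome)
        offspring.parameter_values = dict(self.parameter_values)
        return offspring

    def update_parameters_from_torch(self, torch_module):
        # Copy optimized values back from a torch module whose parameter
        # names encode graph positions, e.g. "p3" for node 3 (an assumption).
        for name, p in torch_module.named_parameters():
            self.parameter_values[int(name.lstrip("p"))] = p.item()
```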

@jakobj jakobj added this to the 0.1 milestone Mar 30, 2020
@jakobj jakobj requested a review from mschmidt87 March 30, 2020 21:29
@jakobj jakobj marked this pull request as draft April 13, 2020 14:58
@jakobj jakobj linked an issue Apr 20, 2020 that may be closed by this pull request
@mschmidt87 (Member) commented

Sorry for the long delay in replying. It is indeed a good question how differentiation would be most useful in the CGP framework. I think the approach you're suggesting makes the most sense; in fact, I am not sure how storing the values at the population level would work in practice. The problem seems to be that the optimized parameter values are only optimal within a certain graph, i.e., an individual, don't you think?

Regarding the use cases, I think the usefulness of gradient-based optimization lies in judging the "potential" of an individual: a parametrized individual can yield very different fitness values depending on its parametrization. Thus, to judge the fitness of an individual, we should optimize these parameters. Whether this should be done exhaustively in each generation (i.e., a full optimization with many iterations) or with just a few iterations to get a rough estimate would be left to the user (a toy illustration follows below).
As far as I understand it, your implementation would support this use case, wouldn't it?
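A self-contained toy illustration of this point (not code from this PR): the measured fitness of a parametrized individual depends strongly on how many optimization steps are spent on its parameter before evaluation.

```python
def fitness(p):
    # Hypothetical fitness of a single individual as a function of its
    # parameter p; the optimum is at p = 2 with fitness 0.
    return -((p - 2.0) ** 2)

def optimize(p, n_steps, lr=0.1):
    # Plain gradient ascent on the fitness; n_steps controls how much of
    # the individual's "potential" is revealed before evaluation.
    for _ in range(n_steps):
        p += lr * (-2.0 * (p - 2.0))
    return p

print(fitness(optimize(0.0, n_steps=3)))    # cheap, rough estimate (~-1.05)
print(fitness(optimize(0.0, n_steps=100)))  # near the true optimum, 0.0
```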

@jakobj (Member, Author) commented Apr 24, 2020

I agree, the tuned parameters only make sense within a particular computational graph. When tuned parameters are passed on from one individual to the next, mutations can optimize that graph given those parameters; this seems intuitive.

I also fully agree that the value of introducing local search, i.e., gradient descent, lies in judging the potential of an individual. How much this potential is explored is a hyperparameter that can be chosen by the user (e.g., https://github.com/jakobj/python-gp/pull/57/files#diff-5fc60d716681a75e11aabee3debd4e12R60 ff. could proceed for more than 10 steps). Should we provide a "template" local search function that performs gradient descent using a particular torch optimizer (see the sketch below)?
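One possible shape for such a template function, as a sketch only: it assumes (hypothetically) that an individual can be exported to a `torch.nn.Module` via `to_torch()` and that tuned values can be copied back via `update_parameters_from_torch()`; neither name is taken from this PR.

```python
import torch

def gradient_based_local_search(individual, loss_fn, n_steps=10, lr=1e-2,
                                optimizer_cls=torch.optim.SGD):
    # Export the individual's CGPParameter values as torch parameters
    # (assumed interface, see above).
    module = individual.to_torch()
    optimizer = optimizer_cls(module.parameters(), lr=lr)
    for _ in range(n_steps):
        optimizer.zero_grad()
        loss = loss_fn(module)  # user-supplied differentiable loss
        loss.backward()
        optimizer.step()
    # Write the tuned values back into the individual's dictionary.
    individual.update_parameters_from_torch(module)
```

Both the optimizer class and `n_steps` are exposed so the user controls the method as well as how much of an individual's potential is explored per generation.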

@mschmidt87 could you confirm that you agree with the general approach? In that case, I would prepare this PR for detailed review.

@jakobj jakobj changed the title [WIP] Enh/differential cgp Enh/differential cgp Apr 24, 2020
@mschmidt87 (Member) commented

> @mschmidt87 could you confirm that you agree with the general approach? In that case, I would prepare this PR for detailed review.

Yes, your suggested approach makes sense to me.

@jakobj (Member, Author) commented Apr 28, 2020

Closing in favor of #90. Discussion should be continued there.

@jakobj jakobj closed this Apr 28, 2020
@jakobj jakobj deleted the enh/differential-cgp branch May 4, 2020 19:28
Development

Successfully merging this pull request may close these issues:

Provide mechanisms to combine evolution with local search