Trim out data transformation operators that are downstream of the last classification step #70
Comments
Could we solve this by having a multi-objective fitness function? Combined from:
As far as I understand, DEAP supports multi-objective optimization out of the box.
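As a hand-rolled illustration of what a multi-objective fitness buys here (plain Python, not TPOT or DEAP code; the objective names and numbers are invented): under Pareto dominance, a pipeline that matches another on accuracy but uses fewer operators and less runtime dominates it, so selection pressure alone would tend to discard a useless trailing transformer.

```python
# Illustrative sketch: a multi-objective fitness combining classification
# accuracy (maximize) with pipeline length and runtime (both minimize),
# compared by Pareto dominance. Names and values are hypothetical.
from typing import NamedTuple


class Fitness(NamedTuple):
    accuracy: float      # higher is better
    num_operators: int   # lower is better
    runtime_s: float     # lower is better


def dominates(a: Fitness, b: Fitness) -> bool:
    """True if a is at least as good as b on every objective
    and strictly better on at least one."""
    at_least_as_good = (
        a.accuracy >= b.accuracy
        and a.num_operators <= b.num_operators
        and a.runtime_s <= b.runtime_s
    )
    strictly_better = (
        a.accuracy > b.accuracy
        or a.num_operators < b.num_operators
        or a.runtime_s < b.runtime_s
    )
    return at_least_as_good and strictly_better


# Two pipelines with identical accuracy: the shorter, faster one dominates,
# so the trailing no-op transformer would be selected away.
full = Fitness(accuracy=0.92, num_operators=4, runtime_s=3.1)
trimmed = Fitness(accuracy=0.92, num_operators=3, runtime_s=2.4)
assert dominates(trimmed, full)
assert not dominates(full, trimmed)
```

This is the same comparison rule NSGA-II-style selection (which DEAP provides) applies during evolution.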
I'm currently developing and testing a multi-objective version of TPOT.
What kind of measures are you using for complexity?
The two you mentioned -- number of pipeline operators and runtime.
This is now encapsulated in #206, so I'm going to close this issue.
Sometimes the optimized pipeline will look something like this:
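(The original example was not preserved in this copy of the issue; a pipeline with the shape being described looks roughly like the following, with illustrative operator names.)

```
StandardScaler(RandomForestClassifier(PCA(input_matrix)))
```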
The last transformation step adds nothing. We should clean up the pipeline by adding a post-processing step to `tpot.fit` that trims unnecessary operators from the optimized pipeline. This will be trivial after incorporating the refactor in #63, as we could just add an attribute to the base classes to identify whether or not an operator can be the pipeline terminus.

I felt it'd probably be better to create a new issue for this topic rather than unilaterally adding a commit downstream of the #63 HEAD.
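The attribute idea above could be sketched as follows. This is a hypothetical illustration, not actual TPOT code: the `Operator` class hierarchy and the `root_node` attribute name are assumptions standing in for whatever the #63 refactor introduces.

```python
# Hypothetical sketch of the proposed post-processing trim. The base
# classes and the root_node attribute name are assumptions, not TPOT API.
class Operator:
    root_node = False  # can this operator be the pipeline terminus?


class Classifier(Operator):
    root_node = True   # a classifier may legitimately end the pipeline


class Transformer(Operator):
    root_node = False  # a transformer after the last classifier is dead weight


def trim_pipeline(operators):
    """Drop trailing operators that cannot serve as the pipeline terminus.

    `operators` is the pipeline in execution order; everything after the
    last operator with root_node=True contributes nothing to the output.
    """
    for i in range(len(operators) - 1, -1, -1):
        if operators[i].root_node:
            return operators[: i + 1]
    return operators  # no valid terminus found; leave the pipeline as-is


pipeline = [Transformer(), Classifier(), Transformer()]
result = trim_pipeline(pipeline)
assert len(result) == 2 and result[-1].root_node
```

Running this once at the end of `tpot.fit` would guarantee the exported pipeline never carries operators downstream of the last classification step, regardless of what the GP search produced.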