
[MRG] BUG Fixes partial dependence bug by setting axes to be invisible #15572

Conversation

@thomasjpfan (Member) commented Nov 8, 2019

Reference Issues/PRs

Fixes a bug where the title does not display in examples/plot_partial_dependence_visualization_api.py.

Instead of making the axes not visible, this checks whether the axes is off to detect that the same axes was used twice.

What does this implement/fix? Explain your changes.

To see why this check was needed, please look at test_plot_partial_dependence_with_same_axes.
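
A rough sketch of the kind of check described above (the helper name and error message below are illustrative, not the actual implementation): a bounding axes that was already used has been turned off, so a second call with the same axes can be detected via ax.axison.

import matplotlib.pyplot as plt

def plot_on_bounding_ax(ax):
    # Hypothetical helper: detect that `ax` was already used as the bounding
    # axes of a previous plot (that call turned it off).
    if not ax.axison:
        raise ValueError(
            "The ax was already used in another plot; pass the axes created "
            "by the first call (display.axes_) instead."
        )
    # Turn the bounding axes off; the per-feature axes are drawn on top of it.
    ax.set_axis_off()
    return ax.figure

fig, ax = plt.subplots()
plot_on_bounding_ax(ax)  # first use: turns `ax` off
plot_on_bounding_ax(ax)  # second use: raises ValueError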

Any other comments?

CC @NicolasHug @amueller @glemaitre

@NicolasHug (Member) left a comment


I'm fine with the changes but I don't understand what test_plot_partial_dependence_with_same_axes is supposed to explain

Why aren't we allowed to use the same ax object?

@thomasjpfan (Member, Author)

The expected behavior of using the same axes is to draw on the axes created by the first call. The following example will fail:

tree_disp = plot_partial_dependence(mlp, X, ["LSTAT", "RM"], ax=ax)

# will error
mlp_disp = plot_partial_dependence(mlp, X, ["LSTAT", "RM"], ax=ax)

This is because tree_disp will create two new axes to place on top of the ax passed in, resulting in 3 total axes in the figure. If we wanted the second call to plot on the axes created by the first call, it would need to look at the axes passed in, find the axes that occupy the same space as the original axes, go into the figure object to grab those two axes, and then plot on them.

Currently, our API allows:

tree_disp = plot_partial_dependence(mlp, X, ["LSTAT", "RM"], ax=ax)

# will not raise
mlp_disp = plot_partial_dependence(mlp, X, ["LSTAT", "RM"], ax=tree_disp.axes_)

Because the second call gets the axes directly and uses them to plot the new curves.
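
For reference, here is a minimal end-to-end sketch of that supported pattern; the dataset and estimator below are placeholders, not the ones used in the example script:

import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.inspection import plot_partial_dependence
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=200, n_features=2, random_state=0)
mlp = MLPRegressor(max_iter=500, random_state=0).fit(X, y)

fig, ax = plt.subplots()

# First call: creates two new axes on top of `ax` (three axes in the figure)
mlp_disp = plot_partial_dependence(mlp, X, [0, 1], ax=ax)

# Second call: plot on the axes created by the first call, not on `ax`
plot_partial_dependence(mlp, X, [0, 1], ax=mlp_disp.axes_)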

@NicolasHug (Member)

Thanks. It would be great if test_plot_partial_dependence_with_same_axes explained this (more succinctly).

@thomasjpfan (Member, Author)

Added more details explaining test_plot_partial_dependence_with_same_axes to this PR, since it is mostly related.

@glemaitre (Member)

> Added more details explaining test_plot_partial_dependence_with_same_axes to this PR, since it is mostly related.

Are you sure that you added something? I don't see any push since your comment?

@thomasjpfan (Member, Author)

> Are you sure that you added something? I don't see any push since your comment?

Oops, forgot to push.

@glemaitre glemaitre merged commit 2885a06 into scikit-learn:master Nov 14, 2019
@glemaitre (Member)

LGTM now :) Thanks @thomasjpfan

@glemaitre glemaitre removed their request for review November 14, 2019 17:47
@amueller (Member)

I just talked with @tacaswell and he suggested removing the axes from the figure instead of hiding it.

@thomasjpfan (Member, Author)

Initially, I resized the original axes to the size of one of the plots, but it was really janky with constraints. I am happy with removing the original axes.

@amueller (Member)

@thomasjpfan the other solution that was suggested was reshaping the axes to be the first axes of the stolen space. But maybe doing the removal for now would be OK?
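
For context, here is a quick plain-matplotlib sketch of the three options being discussed (this is not the scikit-learn code; the positions are arbitrary):

import matplotlib.pyplot as plt

fig, ax = plt.subplots()

# Two axes "stealing" the space of `ax`
left = fig.add_axes([0.10, 0.10, 0.38, 0.80])
right = fig.add_axes([0.55, 0.10, 0.38, 0.80])

# Option 1 (current fix): keep `ax` in the figure but turn it off
ax.set_axis_off()

# Option 2 (removal): drop `ax` from the figure entirely
# ax.remove()

# Option 3 (reshape): instead of creating `left`, resize `ax` into that slot
# ax.set_position(left.get_position())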
