
PropensityFeatureStandardization deepcopy fix #35

Merged: 2 commits into master from propensitystandardization-deepcopy-fix on May 24, 2022

Conversation

@ehudkr (Collaborator) commented May 24, 2022

@mmdanziger:
It seems that because the _feature_functions were computed from self in __init__ and then stored in an attribute, the object became effectively un-deepcopy-able. The deepcopied version would run fit on its fresh learner, but when it came to calculating the feature functions it would fall back to the old, unfitted learner captured inside the stored _feature_functions attribute.

The root of the problem is having _feature_functions as an attribute at all. It is a lightweight computation performed only once per predict, so the added value of caching is dubious and it creates cache-invalidation problems. The fix removes the attribute and replaces it with a lookup function.
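A minimal sketch of the failure mode, using hypothetical toy classes (Learner, Standardizer) rather than causallib's actual implementation: copy.deepcopy treats function objects as atomic, so a lambda created in __init__ keeps closing over the original instance even inside the deep copy.

```python
# Hypothetical toy reproduction of the bug; names are illustrative, not causallib's.
from copy import deepcopy


class Learner:
    def __init__(self):
        self.is_fit = False

    def fit(self):
        self.is_fit = True


class Standardizer:
    def __init__(self, learner):
        self.learner = learner
        # Caching the feature function here bakes a reference to `self` into the closure
        self._feature_functions = {"propensity": lambda X: self.learner}

    def fit(self):
        self.learner.fit()

    def predict(self):
        # Resolves through the cached closure, not through the current instance
        return self._feature_functions["propensity"](None).is_fit


original = Standardizer(Learner())
clone = deepcopy(original)
clone.fit()
print(clone.learner.is_fit)  # True: the copy's own learner was fit
print(clone.predict())       # False: the cached lambda still sees the original, unfitted learner
```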

mmdanziger and others added 2 commits May 24, 2022 10:23
Remove calculation of feature functions on init
Calculating the feature functions in __init__ made the object un-deepcopyable: the stored feature functions keep a reference to self that does not get updated on deepcopy. Since the computation happens only once and is lightweight, the attribute is removed entirely and the dict is created when it is needed, so self is never misreferenced.
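A hedged sketch of that fix, reusing the toy Learner and Standardizer from the sketch above (the method name _get_feature_functions is hypothetical): the dict of feature functions is rebuilt on demand, so it always closes over the current, possibly deep-copied, instance.

```python
# Sketch of the fix on the toy class above; _get_feature_functions is a hypothetical name.
class Standardizer:
    def __init__(self, learner):
        self.learner = learner  # no cached feature functions stored on the instance

    def fit(self):
        self.learner.fit()

    def _get_feature_functions(self):
        # Rebuilt on every call, so the closure always refers to the current `self`
        return {"propensity": lambda X: self.learner}

    def predict(self):
        return self._get_feature_functions()["propensity"](None).is_fit
```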

Add deepcopy tests
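One possible shape for such a test, again using the toy classes above rather than the actual test added in this PR:

```python
from copy import deepcopy


def test_deepcopy_uses_the_copied_learner():
    original = Standardizer(Learner())
    clone = deepcopy(original)
    clone.fit()
    # With the lazy lookup, the copy's prediction path sees the copy's fitted learner
    assert clone.predict() is True
    # ...and the original instance is left untouched
    assert original.learner.is_fit is False
```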
@ehudkr ehudkr merged commit 40a3a47 into master May 24, 2022
@ehudkr ehudkr deleted the propensitystandardization-deepcopy-fix branch May 24, 2022 11:14