
fix: lru-cache usage #171

Merged: 3 commits into deepcharles:master on Jun 17, 2021

Conversation

julia-shenshina
Contributor

Hi!

I tried to use copy.deepcopy to copy a Binseg object and ran into a problem: binseg.single_bkp and deepcopy(binseg).single_bkp refer to the same object, so I cannot use binseg_copy.single_bkp because it uses the data from the original object.

It seems to be caused by the way lru_cache is used. Is there any reason not to use it with the @ decorator syntax?
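For context, here is a minimal, self-contained sketch of the failure mode using a hypothetical class (not the actual ruptures code): when the lru_cache wrapper is created in __init__ around an already-bound method, deepcopy shares that wrapper, which stays bound to the original instance.

from copy import deepcopy
from functools import lru_cache

class ConstructorCached:
    # Hypothetical stand-in for the pre-fix pattern: the cached callable is
    # created in __init__ and wraps a method bound to this instance.
    def __init__(self, signal):
        self.signal = signal
        self.single_bkp = lru_cache(maxsize=None)(self._single_bkp)

    def _single_bkp(self, start, end):
        # Placeholder for the real change-point computation.
        return sum(self.signal[start:end])

original = ConstructorCached([1, 2, 3])
copied = deepcopy(original)
copied.signal = [10, 20, 30]

# The lru_cache wrapper is deep-copied by reference and remains bound to
# `original`, so the copy silently computes on the original's data.
assert copied.single_bkp is original.single_bkp
assert copied.single_bkp(0, 3) == 6  # uses [1, 2, 3], not [10, 20, 30]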

@julia-shenshina changed the title from "Fix lru-cache for binseg" to "fix: lru-cache for binseg usage" on Jun 12, 2021
@oboulant
Collaborator

oboulant commented Jun 14, 2021

Hi,

Thanks for your interest in ruptures!

Indeed, it seems that with the current use of lru_cache, the resulting deepcopy(binseg).single_bkp points to the original binseg.single_bkp.

What would be the use case for a deepcopy in which deepcopy(binseg).single_bkp needs to differ from the original binseg.single_bkp? I ask because, if you perform a deepcopy and keep the same signal and parameters (whether min_size and jump at object-creation time, or n_bkps, pen and epsilon at prediction time), it makes sense from an optimisation standpoint to point to the same cached function, no? On the other hand, the current use of lru_cache is not very explicit in the case of deepcopy. I will look into it!

In the meantime, a workaround is to create a brand-new object instead of creating one with deepcopy:

from ruptures.detection import Binseg

# Two independently constructed instances each get their own cached single_bkp.
binseg = Binseg()
binseg_bis = Binseg()
assert id(binseg.single_bkp) != id(binseg_bis.single_bkp)

@codecov

codecov bot commented Jun 14, 2021

Codecov Report

Merging #171 (5a7ee04) into master (eeb31fb) will not change coverage.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master     #171   +/-   ##
=======================================
  Coverage   96.31%   96.31%           
=======================================
  Files          40       40           
  Lines         978      978           
=======================================
  Hits          942      942           
  Misses         36       36           
Impacted Files                       Coverage Δ
src/ruptures/detection/binseg.py     100.00% <100.00%> (ø)
src/ruptures/detection/bottomup.py   100.00% <100.00%> (ø)
src/ruptures/detection/dynp.py       100.00% <100.00%> (ø)


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@oboulant
Collaborator

After giving it some thought, I think what @julia-shenshina is proposing is the way to go if we want to address this.
WDYT @deepcharles? Is there any particular reason to call lru_cache in the constructor rather than use the decorator?
If we decide to proceed, we might also want to change Dynp and BottomUp in the same way.
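For illustration, the decorator-style usage under discussion would look roughly like this (a hypothetical class, not the actual ruptures code); with lru_cache applied directly to the method, self becomes part of the cache key, so a deep copy computes on its own data:

from copy import deepcopy
from functools import lru_cache

class DecoratorCached:
    # Hypothetical stand-in for the decorator-based pattern.
    def __init__(self, signal):
        self.signal = signal

    @lru_cache(maxsize=None)
    def single_bkp(self, start, end):
        # Placeholder for the real change-point computation; `self` is part
        # of the cache key, so each instance gets its own cache entries.
        return sum(self.signal[start:end])

original = DecoratorCached([1, 2, 3])
copied = deepcopy(original)
copied.signal = [10, 20, 30]
assert copied.single_bkp(0, 3) == 60  # the copy now uses its own data

One known trade-off of caching methods this way is that the cache holds strong references to the instances it has seen, so they are not garbage-collected until the cache is cleared.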

@deepcharles
Owner

Indeed, I do not remember why I did not use the decorator. We can also change it in Dynp and BottomUp.

@deepcharles added the Type: Fix (Bug or Bug fixes) label on Jun 16, 2021
@oboulant self-requested a review on June 16, 2021 at 08:54
Collaborator

@oboulant left a comment


@julia-shenshina
Can you proceed with the following changes:

  • Change the use of lru_cache to the decorator syntax for BottomUp and Dynp as well
  • Rename the PR to fix: lru-cache usage

Thanks a lot!

@julia-shenshina changed the title from "fix: lru-cache for binseg usage" to "fix: lru-cache usage" on Jun 16, 2021
@julia-shenshina
Contributor Author

Thank you for your answers!

I changed the lru_cache usage for Binseg, Dynp, and BottomUp.

@oboulant merged commit ae85f9d into deepcharles:master on Jun 17, 2021
@oboulant
Collaborator

Merged, thx @julia-shenshina !
