Disable caching for data #3807

Merged: 5 commits into aiidateam:develop on Feb 29, 2020

Conversation

@greschd (Member) commented Feb 25, 2020

Fixes #3802.

Change the default of the _cachable class attribute to False in the Node class. This means caching will be enabled only in the CalculationNode sub-classes.
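
A rough sketch of what this change amounts to (the class hierarchy is simplified here, and the exact code is not taken from the PR diff):

```python
class Node:
    # Caching is now opt-in: the base class, and therefore all Data
    # sub-classes, will never be stored from an existing cached node.
    _cachable = False


class CalculationNode(Node):
    # Only calculation nodes (and their sub-classes) participate in caching.
    _cachable = True
```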

Dominik Gresch added 2 commits on February 25, 2020 14:33:

- Change the default for the _cachable class attribute to False in the Node class. This means it will be enabled only in the CalculationNode sub-classes.
- Because the _cachable attribute is now set to False for Data nodes, the test for the '_store_from_cache' method failed because caching was not used. Instead of creating a valid cachable node (which is slightly complicated), we just call _store_from_cache directly to circumvent the 'is_valid_cache' and '_cachable' checks.
@greschd (Member, Author) commented Feb 25, 2020

I had to adapt the test for _store_from_cache, because it uses a Data node. By explicitly calling _store_from_cache, the _cachable and is_valid_cache checks are circumvented.

Another option (which I also tried) would be using a CalculationNode instead, but then we have to manually set process_state and process_type to ensure is_valid_cache is True. Overall, it's the uglier solution IMO, with dependencies on different parts of the codebase.
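
As an illustration, a minimal, hypothetical version of the adapted test; the Int class, the test function, and the get_cache_source() assertion are assumptions about the surrounding test code, and only the clone()/_store_from_cache() calls appear in the actual diff shown further down:

```python
from aiida.orm import Int


def test_store_from_cache():
    data = Int(1).store()

    clone = data.clone()
    # Calling _store_from_cache directly bypasses the _cachable and
    # is_valid_cache checks, which would otherwise prevent caching for Data.
    clone._store_from_cache(data, with_transaction=True)  # pylint: disable=protected-access

    assert clone.is_stored
    assert clone.get_cache_source() == data.uuid
```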

@greschd greschd requested review from ltalirz and sphuber and removed request for ltalirz February 25, 2020 14:36
@greschd (Member, Author) commented Feb 25, 2020

On the SQLA backend, _store_from_cache only works with with_transaction=True - is this expected?

@greschd greschd added this to the v1.1.1 milestone Feb 28, 2020
Review thread on the adapted test:

```diff
-clone = data.clone().store()
+clone = data.clone()
+clone._store_from_cache(data, with_transaction=True)  # pylint: disable=protected-access
```
Contributor:

It is a bit weird that we have a caching test for a Data node even though they are now explicitly not cacheable. Can we change this to use a CalculationNode? That should be a straightforward change, and then you can revert to using with enable_caching().
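
A rough sketch of what that suggestion might look like; whether the second store actually hits the cache depends on the node hashes and on is_valid_cache, and the process-state handling shown here is exactly the part described below as fragile:

```python
from aiida.manage.caching import enable_caching
from aiida.orm import CalculationNode
from plumpy import ProcessState


def make_finished_calculation():
    # Assumption: is_valid_cache requires (at least) a finished process
    # state; this setup is not code from the PR.
    node = CalculationNode()
    node.set_process_state(ProcessState.FINISHED)
    return node.store()


original = make_finished_calculation()

with enable_caching():
    clone = make_finished_calculation()  # stored from the cache only if all checks pass

assert clone.get_cache_source() == original.uuid  # holds only if the cache was hit
```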

Member (Author):

I did try that (see my comment above), but it becomes complicated trying to make sure that the CalculationNode fulfills all the requirements of the is_valid_cache check. I'd rather not rely on that specific behavior (as we may change the is_valid_cache check again).

Member:

What if Nodes remain cacheable by default, but this is simply disabled at the Data level?
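
In code, that alternative would amount to roughly the following (a sketch of the proposal, not the change that was merged):

```python
class Node:
    _cachable = True   # keep the previous default for all nodes


class Data(Node):
    _cachable = False  # opt out of caching only at the Data level
```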

Member (Author):

This test is not possible with Node, because plain Node instances cannot be stored.
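
For context, a minimal illustration of that restriction; the StoringNotAllowed exception name is taken from aiida-core, but treat the snippet as a sketch:

```python
from aiida.orm import Data, Node
from aiida.common.exceptions import StoringNotAllowed

try:
    Node().store()   # base nodes are not storable
except StoringNotAllowed:
    pass

Data().store()       # Data nodes (and their sub-classes) can be stored
```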

@sphuber sphuber self-requested a review February 29, 2020 18:30
@sphuber (Contributor) left a review comment:

👍 thanks

@sphuber sphuber merged commit 358b916 into aiidateam:develop Feb 29, 2020
@sphuber sphuber deleted the disable_caching_for_data branch February 29, 2020 18:31
Linked issue: Caching messes with 'verdi code duplicate' (#3802)