Yeah, using the Result interface will be much more robust than hacking something together to store things in the Dask cluster. #2619, about integrating results with cache_for/cache_validator, may also interest you.
Thanks for the feedback! Looking into the Result interface now; I'll see if I can write a new class that keeps a task's output in Dask's distributed memory.
This issue was closed because it has been stale for 14 days with no activity. If this issue is important or you have more to add feel free to re-open it.
Current behavior
The docs here: https://docs.prefect.io/core/examples/cached_task.html show how to cache within a single flow. I'm wondering whether it's possible to extend this to flows run from multiple processes.
The Output Caching section of the persistence docs (https://docs.prefect.io/core/concepts/persistence.html#output-caching) mentions:
I assume that even with the Dask executor, caching is still handled by Prefect core locally?
Proposed behavior
Store cached outputs in Dask via a distributed Variable (https://distributed.dask.org/en/latest/api.html?highlight=queue#distributed.Variable) so that when the example script below is run twice, the cached value is reused on the second run.
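To make the proposal concrete, here is a minimal, framework-free sketch of the pattern being asked for: a store shared across runs (a plain dict here, standing in for a `distributed.Variable` living on the Dask scheduler) lets a second run reuse the first run's output. The function and key names are illustrative, not Prefect or Dask API.

```python
import random

def run_cached(key, store, compute):
    """Return the value for `key` from `store`, computing it at most once.

    `store` stands in for a cluster-wide store such as a distributed.Variable;
    here it is a local dict, so this only sketches the caching pattern.
    """
    if key not in store:
        store[key] = compute()
    return store[key]

# Two "runs" sharing one store get the same random number, mirroring
# the cross-process behavior this issue asks for.
store = {}
first = run_cached("random_number", store, random.random)
second = run_cached("random_number", store, random.random)
print(first == second)  # True: the second run reused the cached value
```

With an in-process dict the cache dies with the process; the point of backing it with a `distributed.Variable` would be that the store outlives any single flow run as long as the cluster does.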
Example
This modified docs example doesn't return the same random number when run twice in a row.
Maybe this is already possible, but I can't tell how from the docs. I'd be happy to document it if that's the case.