What new feature would you like to see?
From Sankalp at Covalent:
For the first point: the DaskExecutor currently supports task packing, so if you want to give your custom plugin a shot, it would be a good reference 🙂.
Steps you can use to check whether task packing is working as expected:

1. Temporarily add `app_log.warning(f"Submitting {len(task_specs)} tasks to dask")` to this line locally.
2. Run the workflow example you showed.
3. Check the Covalent logs: you'll see that 1 task gets submitted to Dask 2 times.
4. Now enable task packing with `ct.set_config("sdk.task_packing", "true")` in your workflow, then restart your notebook and the server.
5. Check the Covalent logs again: this time 2 tasks are submitted to Dask in one go, i.e. the dictionary that would have been created as a separate task is now packed together with the task that consumes it.
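The packing step above can be sketched as a minimal workflow. The electrons here are illustrative (not the original example), and dispatching requires a running Covalent server:

```python
import covalent as ct

# Enable task packing before building the lattice (step 4 above).
ct.set_config("sdk.task_packing", "true")

@ct.electron
def make_params():
    # Dictionary-producing task: the one that would otherwise be
    # submitted to Dask as a separate task.
    return {"x": 1, "y": 2}

@ct.electron
def consume(params):
    return params["x"] + params["y"]

@ct.lattice
def workflow():
    # With packing enabled, make_params is bundled into the same
    # submission as the task that consumes its output.
    return consume(make_params())

# Requires a running Covalent server:
# dispatch_id = ct.dispatch(workflow)()
```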
That is essentially how it will work in practice, and we plan to enable task packing by default once enough executors support it. It hasn't been thoroughly tested yet, but it should serve as a reference for executors that do want to support it. (The `send`, `poll`, and `receive` methods of the Dask executor, and the `SUPPORTS_MANAGED_EXECUTION` class attribute, should help in understanding.)