Required prerequisites
What version of OmniSafe are you using?
0.5.0
System information
3.8.19 (default, Mar 20 2024, 19:58:24)
[GCC 11.2.0] linux
0.5.0
Problem description
We have found bugs in using experiment_grid for parallel experiments on certain torch versions or CUDA driver versions, specifically:
CUDA ERROR: initialization error
. We believe this is a bug in CUDA during the parallel training process. Through this error message prompt and related community issues, we have tentatively set the solution as:omnisafe/common/experiment_grid.py
pool = Pool(max_workers=num_pool)
topool = Pool(max_workers=num_pool, mp_context=mp.get_context('spawn'))
in line 445.If you encounter similar problems during the use of the OmniSafe experiment grid, you can refer to this solution. If you have a better solution, you are welcome to leave a message under this issue!
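To make the one-line change concrete, here is a minimal standalone sketch of the idea. It assumes Pool refers to concurrent.futures.ProcessPoolExecutor (as the max_workers/mp_context keywords suggest); the train function and num_pool value are placeholders, not the actual OmniSafe source.

```python
# Minimal sketch: use the 'spawn' start method for worker processes.
# 'spawn' starts a fresh interpreter per worker, which avoids the
# "CUDA ERROR: initialization error" that 'fork' can trigger when the
# parent process has already initialized CUDA before forking.
import multiprocessing as mp
from concurrent.futures import ProcessPoolExecutor as Pool


def train(seed):
    # Placeholder for one training run; real code would set up CUDA here.
    return seed * 2


if __name__ == '__main__':
    num_pool = 2  # hypothetical pool size for this sketch
    # The proposed fix: pass an explicit 'spawn' context to the pool.
    pool = Pool(max_workers=num_pool, mp_context=mp.get_context('spawn'))
    results = list(pool.map(train, [1, 2, 3]))
    pool.shutdown()
    print(results)  # [2, 4, 6]
```

The guard around the main block is required with 'spawn', since each worker re-imports the main module when it starts.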
Reference:
Reproducible example code
Command lines:
Traceback
No response
Expected behavior
No response
Additional context
No response