mpi4py issue with large k-eigenvalue simulations in Lassen #101
Comments
@spasmann: was this like any of the issues you had seen?
I've also encountered this issue, typically when running on the ND cluster. But I just ran the Takeda-1 problem with N=2e5 using 2 MPI processes and seem to get the same issue, although the output is somewhat different. This could be a problem with my installation of MPI. I will try again on the ND cluster when it is back online.
I think these issues happen because the lowercase mpi4py functions are used instead of the uppercase ones.
Tested the other MPI modules on Lassen. spectrum-mpi/2019.06.24 didn't run at all, and spectrum-mpi/2020.08.19 and spectrum-mpi/test-rolling-release produced errors similar to the rolling release.
On Lassen, MCDC breaks after the first eigenvalue cycle if the number of histories per cycle is larger than 1e5 and Numba is enabled.