
Revert some COMPASS changes intended to support conda MPI #545

Merged


@xylar (Collaborator) commented May 1, 2020

It turns out not to be necessary to run python scripts with `mpirun` as long as we don't use the MPI version of `netcdf4`. This is possible even if we want to use the MPI version of `esmf` (which we do because `ESMF_RegridWeightGen` is way too slow in serial for certain mesh combinations).

This PR reverts:

  • 8217097, which modified `setup_testcase.py` to add `mpirun` to python scripts in certain circumstances
  • 190338d, which made sure `planar_hex` (which wasn't obviously a python script) also got run in this way.

This merge also updates to `compass` env 0.1.5, which makes sure we always get the `nompi` variant of `netcdf4`, so that the shenanigans with `mpirun` aren't needed.
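The key idea is that conda-forge publishes both MPI-enabled and no-MPI builds of these packages, distinguished by build string, so an environment can mix variants. The actual compass 0.1.5 spec is not shown in this PR; the pins below are an illustrative sketch of how the two variants can be requested together:

```yaml
# Illustrative environment file, NOT the actual compass 0.1.5 spec.
# conda-forge build strings start with "nompi_" or "mpi_<impl>_",
# so build-string match specs can select the variant.
name: compass_sketch
channels:
  - conda-forge
dependencies:
  - netcdf4=*=nompi_*     # force the no-MPI netCDF4 build
  - esmf=*=mpi_mpich_*    # keep an MPI-enabled ESMF for ESMF_RegridWeightGen
```

With this combination, python scripts that only read and write netCDF files run as plain serial processes, while `ESMF_RegridWeightGen` can still be launched under `mpirun`.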

xylar added 3 commits May 1, 2020 07:15
This env will make sure we use the nompi variant of netcdf4
This reverts commit 8217097.

As long as the `netcdf4` package used is the `nompi` variant,
we don't need to detect MPI from the conda environment or call
python scripts with `mpirun`.
This reverts commit 190338d.

We do not need to run `planar_hex` with `mpirun` under any circumstances
as long as we use the `nompi` variant of `netcdf4`.
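The commit messages above all hinge on which `netcdf4` build conda resolved, which is visible in the build-string column of `conda list netcdf4`. As a minimal sketch (the `needs_mpirun` helper is hypothetical, not part of compass; it assumes the conda-forge convention that MPI-enabled builds start with `mpi_` and no-MPI builds with `nompi_`):

```python
def needs_mpirun(build_string: str) -> bool:
    """Heuristic: True if a conda-forge build string names an MPI-enabled
    variant (e.g. 'mpi_mpich_py38h...'), False for the 'nompi_...' builds
    this PR standardizes on for netcdf4."""
    return build_string.startswith("mpi_")

# Example build strings as they appear in `conda list` output:
print(needs_mpirun("nompi_py38h0a1b2c_101"))    # False
print(needs_mpirun("mpi_mpich_py38habc123_0"))  # True
```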
@xylar commented May 1, 2020

I just need to deploy compass 0.1.5 before this can get merged...

@xylar commented May 1, 2020

Okay, compass 0.1.5 is deployed on LANL IC, Cori, Compy and Anvil.

@mark-petersen (Contributor) left a comment

This passed the nightly regression suite using compass 0.1.5 on grizzly. I also created the other rpe_test domains with no problems. Thanks @xylar!

@mark-petersen

I'll merge in after Redi is completely settled on E3SM - probably in a few days.

@mark-petersen

Retested on IC. Passes nightly regression suite using both compass 0.1.4 and 0.1.5. Runs e3sm_coupling successfully with compass 0.1.4. The e3sm_coupling steps with ESMF_RegridWeightGen fail using compass_0.1.5, as described in #564 and MPAS-Dev/pyremap#15.

Merging in, as that specific problem is not caused by this PR.

@mark-petersen mark-petersen merged commit 9b19e52 into MPAS-Dev:ocean/develop May 19, 2020
@xylar xylar deleted the ocean/compass_revert_conda_mpi branch May 19, 2020 13:55

xylar commented May 19, 2020

Thanks for merging these three, @mark-petersen. I'm working on a fix for #564, as you've seen.

@xylar xylar mentioned this pull request Aug 21, 2020
caozd999 pushed a commit to caozd999/MPAS-Model that referenced this pull request Jan 14, 2021
Revert some COMPASS changes intended to support conda MPI MPAS-Dev#545