- The main difference from r20240302 is a switch from OpenMPI to MPICH as the MPI implementation that is provided.
- This change was made because, with the OpenMPI conda build of parallel-netCDF, loading the netCDF4 Python module prevents `mpirun`/`mpiexec` from being run from inside Python; this problem is not present in the MPICH-based build. Switching to MPICH was chosen as the workaround following consultation with Jaspy users.
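The failure mode described above typically shows up when a script imports netCDF4 and then shells out to the MPI launcher. A minimal sketch of that pattern is below; `launch_mpi` is a hypothetical helper (not part of Jaspy or netCDF4), and it assumes the launcher executable is on `PATH`:

```python
# Hypothetical helper illustrating launching an MPI job from inside Python.
# Under the earlier OpenMPI-based build, a step like this could fail after
# `import netCDF4`; the MPICH-based build does not have this problem.
import shutil
import subprocess


def launch_mpi(executable, nprocs=2, launcher="mpirun"):
    """Run `launcher -n <nprocs> <executable>` and return its stdout."""
    if shutil.which(launcher) is None:
        raise FileNotFoundError(f"{launcher} not found on PATH")
    result = subprocess.run(
        [launcher, "-n", str(nprocs), executable],
        capture_output=True,
        text=True,
        check=True,  # raise if the launcher exits with a non-zero status
    )
    return result.stdout
```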
- Usage of MPICH should be similar to OpenMPI for simple examples such as `mpirun -n 2 ./my_executable`. (Note that if the executable is in the current directory, the `./` prefix must be included; OpenMPI does not require this.) The less commonly used command-line options are unlikely to match OpenMPI's, so type `mpirun --help` for more options.
- Note that (as of May 2024) it remains the case for the time being that the MPI provided in Jaspy should only be used for within-node parallelisation on LOTUS (the `par-single` partition), and that separate modules on JASMIN outside of Jaspy are provided for code that uses `par-multi` (see `eb/OpenMPI/*` and `netcdf/intel19.0.0.166/*`).
- This Jaspy release also includes a general update of packages, insofar as available versions have changed in the two months since the previous release.
- We are aware of a bug in some `ipython` versions that affects the display of matplotlib windows after the interpreter has been invoked using the shell command `ipython --pylab`. Unfortunately, one of the affected versions (8.22.2) is included in this Jaspy release, but there is a simple workaround: use the command `ipython --pylab=qt` instead. Apologies that this issue was overlooked when preparing the Jaspy release, but please note that use of the pylab option is discouraged in any case.
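Since the pylab option is discouraged anyway, the recommended alternative is explicit imports. A minimal sketch, assuming matplotlib and numpy are installed (the non-interactive Agg backend is selected here so the sketch runs headless; in an interactive session a Qt backend plays the role of `--pylab=qt`):

```python
# Explicit-import alternative to `ipython --pylab`.
# Assumes matplotlib and numpy are available in the environment.
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch; use a Qt backend interactively
import matplotlib.pyplot as plt
import numpy as np

# Plot one period of a sine wave and write it to a file
# instead of opening a display window.
x = np.linspace(0.0, 2.0 * np.pi, 100)
plt.plot(x, np.sin(x))
plt.savefig("sine.png")
```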
- Note that the previous version (`r20240302`) was never the default version on JASMIN, so for other differences between `r20240508` and the earlier default version (`r20230718`), please also see the release notes for the previous version.