
PETSc package uses mpicc instead of cc as compiler on Cray #37010

Open
wspear opened this issue Apr 18, 2023 · 3 comments

Comments

wspear (Contributor) commented Apr 18, 2023

On Cray systems like Perlmutter and Polaris, PETSc as built by Spack ends up with a petscvariables file whose PCC/CC and other relevant compiler/linker variables point to the path of mpicc (e.g. /opt/cray/pe/mpich/8.1.16/ofi/gnu/9.1/bin/mpicc). This works in some cases, but Cray's mpicc will not enable GPU-accelerated MPI; the cc compiler wrapper must be specified for that functionality. Currently, executables built using PETSc's provided make variables will fail to run unless MPICH GPU support is deactivated.

This configuration comes from the mpi variant case in the package:

"--with-cc=%s" % self.spec["mpi"].mpicc,

I think that when a Cray system is detected, the package should instead use the default compiler:

"--with-cc=%s" % os.environ["CC"],
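
For concreteness, here is a minimal sketch of that proposal as it might sit in the petsc package's configure_args(); the options list, the "+mpi" variant check, and the platform=cray detection are my assumptions about where the change would go, not the actual package code.

```python
import os

def configure_args(self):
    # Minimal sketch of the proposal above; only the compiler selection
    # relevant to this issue is shown, not the real package method.
    options = []
    if "+mpi" in self.spec:
        if self.spec.satisfies("platform=cray"):
            # On Cray PE, GPU-aware MPICH needs the cc wrapper, so fall back
            # to the compiler Spack exports in the build environment.
            options.append("--with-cc=%s" % os.environ["CC"])
        else:
            options.append("--with-cc=%s" % self.spec["mpi"].mpicc)
    return options
```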

@balay @BarrySmith @jedbrown @eugeneswalker

wspear (Contributor, Author) commented Apr 20, 2023

I've learned that using the cc/CC/ftn Cray compiler wrappers is strongly discouraged by Spack. Alternative fixes are being investigated.

balay (Contributor) commented Apr 20, 2023

There are two ways of using GPU-enabled MPI [I might have used the second one with Spack]:

https://gitlab.com/petsc/petsc/-/blob/main/config/examples/arch-olcf-crusher.py

  • link directly with "-lmpi_gtl_hsa" [or the similar gtl library for CUDA] - a configure sketch of this approach follows after this list
  • load "craype-accel-amd-gfx90a" [or the similar module for CUDA] - but this also requires --with-openmp=1 [otherwise linking in Fortran objects fails]
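
A minimal sketch of the first approach, loosely modeled on the linked arch-olcf-crusher.py example, is below. The exact gtl library directory and the "--LIBS=" spelling are assumptions here and vary by machine and Cray PE version; only the -lmpi_gtl_hsa / CUDA-equivalent link requirement comes from the list above.

```python
#!/usr/bin/env python3
# Sketch of option 1: point PETSc configure at the Cray compiler wrappers and
# link the MPICH GPU Transport Layer directly. The gtl directory below is a
# placeholder (assumption); substitute the path your Cray PE installation
# provides, and use the CUDA gtl library instead of -lmpi_gtl_hsa on NVIDIA
# systems.
import os
import sys

configure_options = [
    "--with-cc=cc",
    "--with-cxx=CC",
    "--with-fc=ftn",
    "--LIBS=-L/opt/cray/pe/mpich/default/gtl/lib -lmpi_gtl_hsa",
    # Option 2 alternative: load craype-accel-amd-gfx90a (or the CUDA
    # equivalent) and add "--with-openmp=1" instead of the LIBS entry above.
]

if __name__ == "__main__":
    # Run from a PETSc source tree, as the in-tree example scripts do.
    sys.path.insert(0, os.path.abspath("config"))
    import configure
    configure.petsc_configure(configure_options)
```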

balay (Contributor) commented Apr 20, 2023

Hm - the 'modules' approach, I guess, works with CC - but not with mpicc. So my recollection of my Spack install could be wrong [I might have just tested without GPU-enabled MPI].
