Building parts of Spack environments on different nodes: option1 (include other concretised environments) #58

ccaefch0523 opened this issue Jun 14, 2024 · 4 comments

@ccaefch0523
I created a new spacksite on Myriad using Spack 0.22 on the build01 node:

[ccspapp@build01 hpc-spack]$ spacksites/spacksites create fc-myriad-s0.22-test
/spack/0.22/fc-myriad-s0.22-test/spack/share/spack/setup-env.sh spack compiler find --scope=site
# SPACKSITES: have set SPACK_DISABLE_LOCAL_CONFIG=1
# SPACKSITES: have set HPC_SPACK_ROOT=/lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack
# SPACKSITES: Have set spacks's external compiler and python dependencies - now calling spack compiler find --scope=site
==> Added 3 new compilers to /lustre/shared/ucl/apps/spack/0.22/fc-myriad-s0.22-test/spack/etc/spack/compilers.yaml
    gcc@4.9.2  gcc@4.8.5  gcc@11.2.1
==> Compilers are defined in the following files:
    /lustre/shared/ucl/apps/spack/0.22/fc-myriad-s0.22-test/spack/etc/spack/compilers.yaml
# SPACKSITES: Now calling: #######
 source /lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack/spacksites/process-env-scripts/spack-deps-rhel-7.8.sh

source  /shared/ucl/apps/spack/0.22/fc-myriad-s0.22-test/spack/share/spack/setup-env.sh
spack config --scope=site get config
# SPACKSITES: ####################
# SPACKSITES: site.build_stage set to:  /shared/ucl/apps/spack/0.22/fc-myriad-s0.22-test/spack/../build_stage

GPG key trust is done, so I can use the buildcache:

(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$ spack gpg trust /shared/ucl/apps/spack/0.22/buildcache/build_cache/_pgp/8AD9CBD92CD2A4AEB15F3458969BB097C2225210.pub
gpg: key C2225210: public key "ARCHPCSolutions (GPG created for Spack) <rc-support@ucl.ac.uk>" imported
gpg: Total number processed: 1
gpg:               imported: 1  (RSA: 1)
gpg: inserting ownertrust of 6

but then the buildcache list came back empty:

(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$ spack buildcache list --allarch
==> 0 cached builds.

so I tried updating the index: the -d flag is no longer recognised, but without it the update works!

(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$ spack buildcache update-index -d /shared/ucl/apps/spack/0.22/buildcache
==> Error: unrecognized arguments: -d
(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$ spack buildcache update-index /shared/ucl/apps/spack/0.22/buildcache
(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$ spack buildcache list --allarch
==> 134 cached builds.
-- linux-rhel7-cascadelake / gcc@11.2.1 -------------------------
autoconf@2.69        bzip2@1.0.8         gdbm@1.23       gmp@6.2.1        libxml2@2.10.3  pigz@2.8       xz@5.4.6
automake@1.16.5      diffutils@3.9       gettext@0.22.5  libiconv@1.17    m4@1.4.19       pkgconf@1.9.5  zlib@1.3.1
berkeley-db@18.1.40  findutils@4.9.0     glibc@2.17      libsigsegv@2.14  ncurses@6.4     readline@8.2   zstd@1.5.5
binutils@2.42        gcc-runtime@11.2.1  gmake@4.4.1     libtool@2.4.7    perl@5.38.0     tar@1.30

-- linux-rhel7-cascadelake / gcc@12.3.0 -------------------------
apr@1.7.4                           hdf5@1.14.3          ...
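
(Aside: spack buildcache list only searches mirrors configured for the spacksite. If the buildcache weren't already set up as a mirror, it could be added with something like the following; the mirror name here is hypothetical.)

spack mirror add --scope=site ucl-buildcache /shared/ucl/apps/spack/0.22/buildcache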

Experimenting with the myriad.yaml generated by Heather (see https://github.com/UCL-ARC/hpc-spack/issues/56):

(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$ spack env create -d myproject  $HPC_SPACK_ROOT/spacksites/spack-env-templates/dev1/build/myriad.yaml
==> Created independent environment in: /lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack/myproject
==> Activate with: spack env activate myproject

I activated my environment myproject:

(spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$  spack env activate -p myproject
[myproject] (spacksite: fc-myriad-s0.22-test) [ccspapp@build01 hpc-spack]$ spack find 
==> In environment /lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack/myproject
==> 19 root specs
-- no arch / gcc@12.3.0 -----------------------------------------
 -  bcftools@1.16%gcc@12.3.0
 -  beast2@2.7.4%gcc@12.3.0
 -  bedtools2@2.31.0%gcc@12.3.0
 -  bwa@0.7.17%gcc@12.3.0
 -  cp2k@2023.2%gcc@12.3.0 +cuda+mpi+openmp~sirius build_system=cmake cuda_arch=80
 -  cuda@12.2.0%gcc@12.3.0
 -  dbcsr@2.6.0%gcc@12.3.0
 -  gatk@4.4.0.0%gcc@12.3.0
 -  gzip@1.13%gcc@12.3.0
 -  hdf5@1.14.3%gcc@12.3.0 +cxx+fortran+hl~mpi
 -  hdf5@1.14.3%gcc@12.3.0 +fortran+hl+mpi
 -  htslib@1.17%gcc@12.3.0
 -  netcdf-c@4.9.2%gcc@12.3.0 ~mpi
 -  netcdf-fortran@4.6.1%gcc@12.3.0
 -  openmpi@4.1.6%gcc@12.3.0  fabrics=cma,knem,ofi,psm2,ucx,xpmem schedulers=sge
 -  picard@2.26.2%gcc@12.3.0
 -  python@3.11.6%gcc@12.3.0
 -  samtools@1.17%gcc@12.3.0
 -  vcftools@0.1.16%gcc@12.3.0

==> 0 installed packages

Adding Gromacs to the specs in my env myproject and then concretising:

spack add gromacs@2023.1%gcc@12.3.0+double ^openmpi@4.1.6%gcc@12.3.0 
==> Adding gromacs@2023.1%gcc@12.3.0+double ^openmpi@4.1.6%gcc@12.3.0 to environment /lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack/myproject

Gromacs is added to the root specs:

spack find
==> In environment /lustre/scratch/scratch/ccspapp/spack/0.22/hpc-spack/myproject
==> 20 root specs
-- no arch / gcc@12.3.0 -----------------------------------------
 -  bcftools@1.16%gcc@12.3.0
 -  beast2@2.7.4%gcc@12.3.0
 -  bedtools2@2.31.0%gcc@12.3.0
 -  bwa@0.7.17%gcc@12.3.0
 -  cp2k@2023.2%gcc@12.3.0 +cuda+mpi+openmp~sirius build_system=cmake cuda_arch=80
 -  cuda@12.2.0%gcc@12.3.0
 -  dbcsr@2.6.0%gcc@12.3.0
 -  gatk@4.4.0.0%gcc@12.3.0
 -  gromacs@2023.1%gcc@12.3.0 +double
 -  gzip@1.13%gcc@12.3.0
 -  hdf5@1.14.3%gcc@12.3.0 +cxx+fortran+hl~mpi
 -  hdf5@1.14.3%gcc@12.3.0 +fortran+hl+mpi
 -  htslib@1.17%gcc@12.3.0
 -  netcdf-c@4.9.2%gcc@12.3.0 ~mpi
 -  netcdf-fortran@4.6.1%gcc@12.3.0
 -  openmpi@4.1.6%gcc@12.3.0  fabrics=cma,knem,ofi,psm2,ucx,xpmem schedulers=sge
 -  picard@2.26.2%gcc@12.3.0
 -  python@3.11.6%gcc@12.3.0
 -  samtools@1.17%gcc@12.3.0
 -  vcftools@0.1.16%gcc@12.3.0

==> 0 installed packages
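
The concretise step itself is not shown above; as a minimal sketch, from inside the activated environment it would just be:

spack concretize
spack install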

Do I need to define a rule to exclude installing/concretising cp2k in my env?

@heatherkellyucl
Collaborator

heatherkellyucl commented Jun 17, 2024

You don't need to have a rule excluding cp2k - instead it ought to be in the gpu environment.

Don't use my myriad.yaml directly; instead, modify it to use include_concrete instead of

  include:
  - $HPC_SPACK_ROOT/spacksites/spack-env-templates/dev1/build/base.yaml
  - $HPC_SPACK_ROOT/spacksites/spack-env-templates/dev1/build/gpu.yaml
  - $HPC_SPACK_ROOT/spacksites/spack-env-templates/dev1/build/gpu-on-gpu.yaml

so that you make separate concretised environments for base, gpu-on-gpu, gpu and then include those in myriad.yaml.
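
A minimal sketch of what the replacement section might look like (the paths are illustrative; include_concrete takes paths to directories of already-concretised environments):

  include_concrete:
  - /path/to/envs/base
  - /path/to/envs/gpu
  - /path/to/envs/gpu-on-gpu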

@heatherkellyucl
Collaborator

(There's an open question about which of the environments should include_concrete which others, so it ends up more like environment chaining than what I did.)

heatherkellyucl changed the title from "Spack 0.22 environments option2" to "Building parts of Spack environments on different nodes: option1 (include other concretised environments)" on Jun 17, 2024
@heatherkellyucl
Collaborator

For example, you concretise and build the base environment, then the gpu-on-gpu environment separately (and on a gpu node), then the gpu environment that includes the gpu-on-gpu environment, and finally the myriad environment that includes everything and which has anything extra that is only for myriad.

That's if we keep the split of how the environments are divided up the same as I initially did - if another split makes more sense, do that instead.

CPU Gromacs should go in base if we're keeping the split the same, since CPU Gromacs gets installed everywhere. A sketch of the overall build order follows.
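
That build order as shell commands (environment names and paths are illustrative):

# on a regular build node
spack env activate base
spack concretize && spack install
spack env deactivate
# on a GPU node
spack env activate gpu-on-gpu
spack concretize && spack install
spack env deactivate
# back on the build node: gpu pulls in gpu-on-gpu via include_concrete
spack env activate gpu
spack concretize && spack install
spack env deactivate
# finally, myriad includes everything plus its own extras
spack env activate myriad
spack concretize && spack install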

@heatherkellyucl
Collaborator

I got the most recent gromacs from spack develop: #44 (comment)

so we can have

gromacs@2024.2 +double
gromacs@2024.2 +cuda cuda_arch=80 
gromacs@2023 +double +plumed
gromacs@2023 +cuda cuda_arch=80 +plumed
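
Following the split described above, the +double (CPU) builds would go in base and the +cuda builds in the gpu environment, e.g. in their respective spack.yaml specs sections (a sketch; the file layout is assumed):

  # base spack.yaml
  specs:
  - gromacs@2024.2 +double
  - gromacs@2023 +double +plumed

  # gpu spack.yaml
  specs:
  - gromacs@2024.2 +cuda cuda_arch=80
  - gromacs@2023 +cuda cuda_arch=80 +plumed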
