
The ESMF build system strikes back: failure to build on Narwhal #517

Closed
climbfuji opened this issue Mar 24, 2023 · 25 comments · Fixed by JCSDA/spack#252
Labels: bug (Something is not working), INFRA JEDI Infrastructure

Comments

climbfuji commented Mar 24, 2023

Describe the bug
ESMF no longer builds on Narwhal, with the same versions as in spack-stack-1.2.0: esmf@8.3.0b09, same compiler, same MPI library. The error is below; the link line is apparently missing the -lmpifort linker flag, and I don't know why:

make[5]: Leaving directory '/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/apps/ESMF_PrintInfoC'
/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/spack/lib/spack/env/intel/icpc -c  -std=c++11 -O -DNDEBUG -fPIC -pthread  -qopenmp -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/apps/ESMF_PrintInfoC -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/apps/ESMF_PrintInfoC/../include -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/build_config/Unicos.intel.default -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/Infrastructure -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/Superstructure -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/Infrastructure/stubs/pthread  -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/include  -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/netcdf-c-4.9.2-zmmqdw3/include -I/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/prologue/yaml-cpp/include -DESMF_NO_INTEGER_1_BYTE -DESMF_NO_INTEGER_2_BYTE -DESMF_MOAB=1 -DESMF_LAPACK=1 -DESMF_LAPACK_INTERNAL=1 -DESMF_NO_ACC_SOFTWARE_STACK=1 -DESMF_NETCDF=1 -DESMF_YAMLCPP=1 -DESMF_YAML=1 -DESMF_PIO=1 -DESMF_MPIIO -DESMF_NO_OPENACC -DESMF_BOPT_O -DESMF_TESTCOMPTUNNEL -DS64=1 -DESMF_OS_Unicos=1 
-DESMF_COMM=mpi -DESMF_DIR=/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src -D__SDIR__='"src/apps/ESMF_PrintInfoC"' -DESMF_CXXSTD=11 -DESMF_NO_POSIXIPC ESMF_PrintInfoC.c -o /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/obj/objO/Unicos.intel.64.mpi.default/src/apps/ESMF_PrintInfoC/ESMF_PrintInfoC.o
icpc: command line warning #10121: overriding '-march=core-avx2' with '-march=core-avx2'
make chkdir_apps
make[5]: Entering directory '/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/apps/ESMF_PrintInfoC'
make[5]: Leaving directory '/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/apps/ESMF_PrintInfoC'
/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/spack/lib/spack/env/intel/icpc -dynamic   -pthread -Wl,--no-as-needed  -qopenmp -L/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib -L/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/netcdf-c-4.9.2-zmmqdw3/lib -L/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/netcdf-fortran-4.6.0-5m7rtkd/lib -L/opt/intel/oneapi_2021.4.0.3422/compiler/2021.4.0/linux/bin/intel64/../../compiler/lib/intel64_lin/ -Wl,-rpath,/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib -Wl,-rpath,/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/netcdf-c-4.9.2-zmmqdw3/lib -Wl,-rpath,/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/netcdf-fortran-4.6.0-5m7rtkd/lib -o /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/bin/ESMF_PrintInfoC /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/obj/objO/Unicos.intel.64.mpi.default/src/apps/ESMF_PrintInfoC/ESMF_PrintInfoC.o -lesmf   -lrt -ldl -lnetcdf -lnetcdff -lnetcdf -lnetcdf
icpc: command line warning #10121: overriding '-march=core-avx2' with '-march=core-avx2'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_info_free_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_commit_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_rsend_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_reduce_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_win_dup_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_comm_null_copy_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_file_read_all_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_info_create_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_create_subarray_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_null_copy_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_comm_null_delete_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_file_write_all_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_file_read_at_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_file_set_view_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_allreduce_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_null_copy_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_contiguous_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_irecv_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_null_delete_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_abort_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_win_null_delete_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_get_envelope_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_gather_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_comm_rank_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_win_null_copy_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_file_write_at_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_comm_group_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_irsend_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_info_set_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_wait_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_group_incl_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_comm_size_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_send_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_comm_dup_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_barrier_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_file_close_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_intercomm_merge_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_dup_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_comm_free_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_free_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_conversion_fn_null_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_error_string_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_isend_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_recv_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_bcast_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_waitany_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_intercomm_create_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_comm_create_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_finalize_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_file_open_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_dup_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_group_free_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_null_delete_fn_'
ld: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/lib/libesmf.so: undefined reference to `mpi_type_create_indexed_block_'
make[4]: *** [/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/build/common.mk:2438: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb/bin/ESMF_PrintInfoC] Error 1
make[4]: Leaving directory '/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/apps/ESMF_PrintInfoC'
make[3]: *** [/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/build/common.mk:3661: tree] Error 1
make[3]: Leaving directory '/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/src/apps'
make[2]: *** [/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src/build/common.mk:2425: build_apps] Error 2
make[2]: Leaving directory '/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src'
make[1]: *** [makefile:644: install_apps] Error 2
make[1]: Leaving directory '/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src'
make: *** [makefile:682: install] Error 2
==> Error: ProcessError: Command exited with status 2:
    'make' '-j8' 'install'

To see the difference between the spack recipes, check out our spack fork and run:

git diff origin/release/1.2.0 origin/release/1.3.0 -- var/spack/repos/builtin/packages/esmf/package.py

The main spack build options all seem to be correct (and the same as for the successful build in spack-stack-1.2.0):

--------------------------------------------------------------
 * User set ESMF environment variables *
ESMF_BOPT=O
ESMF_COMM=mpi
ESMF_COMPILER=intel
ESMF_CXX=/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/spack/lib/spack/env/intel/icpc
ESMF_CXXCOMPILEOPTS=
ESMF_DIR=/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src
ESMF_F90=/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/spack/lib/spack/env/intel/ifort
ESMF_F90COMPILEOPTS=
ESMF_INSTALL_BINDIR=bin
ESMF_INSTALL_LIBDIR=lib
ESMF_INSTALL_MODDIR=include
ESMF_INSTALL_PREFIX=/p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/envs/unified-env-intel-2021.4.0/install/intel/2021.4.0/esmf-8.3.0b09-v6l7wwb
ESMF_LAPACK=internal
ESMF_NETCDF=nc-config
ESMF_NFCONFIG=nf-config
ESMF_OS=Unicos
ESMF_PIO=internal

--------------------------------------------------------------
 * ESMF environment variables *
ESMF_DIR: /p/app/projects/NEPTUNE/spack-stack/spack-stack-1.3.0/cache/build_stage/heinzell/spack-stage-esmf-8.3.0b09-v6l7wwbbiwuonp2q7tlib3jfffkh3zs5/spack-src
ESMF_OS:                Unicos
ESMF_MACHINE:           x86_64
ESMF_ABI:               64
ESMF_COMPILER:          intel
ESMF_BOPT:              O
ESMF_COMM:              mpi
ESMF_SITE:              default
ESMF_PTHREADS:          ON
ESMF_OPENMP:            ON
ESMF_OPENACC:           OFF
ESMF_CXXSTD:            11

My guess is that the change to use build_environment introduced the error.

To Reproduce
Try to build ESMF on Narwhal.

Expected behavior
For once, only for once, no ESMF build failures when we roll out a new release.

System:
Narwhal (Cray) with Intel-2021.4.0, cray-mpich, Cray compiler wrappers.


@climbfuji climbfuji added the bug Something is not working label Mar 24, 2023
@climbfuji climbfuji self-assigned this Mar 24, 2023
@climbfuji (Collaborator, Author)

From the linker errors, it looks like it is not seeing the Fortran MPI routines:

> for file in /opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/lib*; do echo $file; nm $file | grep mpi_info_free; done
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libfmpich.so
00000000000477f0 W mpi_info_free
00000000000477f0 W mpi_info_free_
00000000000477f0 W mpi_info_free__
000000000005a690 T mpi_info_free_f08_
00000000000477f0 W pmpi_info_free
00000000000477f0 T pmpi_info_free_
00000000000477f0 W pmpi_info_free__
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpi.a
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpich.a
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpichf90.a
0000000000000000 W mpi_info_free
0000000000000000 W mpi_info_free_
0000000000000000 W mpi_info_free__
0000000000000000 W pmpi_info_free
0000000000000000 T pmpi_info_free_
0000000000000000 W pmpi_info_free__
0000000000000000 T mpi_info_free_f08_
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpichf90.so
00000000000477f0 W mpi_info_free
00000000000477f0 W mpi_info_free_
00000000000477f0 W mpi_info_free__
000000000005a690 T mpi_info_free_f08_
00000000000477f0 W pmpi_info_free
00000000000477f0 T pmpi_info_free_
00000000000477f0 W pmpi_info_free__
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpich.so
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpifort.a
0000000000000000 W mpi_info_free
0000000000000000 W mpi_info_free_
0000000000000000 W mpi_info_free__
0000000000000000 W pmpi_info_free
0000000000000000 T pmpi_info_free_
0000000000000000 W pmpi_info_free__
0000000000000000 T mpi_info_free_f08_
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpifort_intel.a
0000000000000000 W mpi_info_free
0000000000000000 W mpi_info_free_
0000000000000000 W mpi_info_free__
0000000000000000 W pmpi_info_free
0000000000000000 T pmpi_info_free_
0000000000000000 W pmpi_info_free__
0000000000000000 T mpi_info_free_f08_
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpifort_intel.so
00000000000477f0 W mpi_info_free
00000000000477f0 W mpi_info_free_
00000000000477f0 W mpi_info_free__
000000000005a690 T mpi_info_free_f08_
00000000000477f0 W pmpi_info_free
00000000000477f0 T pmpi_info_free_
00000000000477f0 W pmpi_info_free__
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpifort_intel.so.12
00000000000477f0 W mpi_info_free
00000000000477f0 W mpi_info_free_
00000000000477f0 W mpi_info_free__
000000000005a690 T mpi_info_free_f08_
00000000000477f0 W pmpi_info_free
00000000000477f0 T pmpi_info_free_
00000000000477f0 W pmpi_info_free__
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpifort_intel.so.12.0.0
00000000000477f0 W mpi_info_free
00000000000477f0 W mpi_info_free_
00000000000477f0 W mpi_info_free__
000000000005a690 T mpi_info_free_f08_
00000000000477f0 W pmpi_info_free
00000000000477f0 T pmpi_info_free_
00000000000477f0 W pmpi_info_free__
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpifort.so
00000000000477f0 W mpi_info_free
00000000000477f0 W mpi_info_free_
00000000000477f0 W mpi_info_free__
000000000005a690 T mpi_info_free_f08_
00000000000477f0 W pmpi_info_free
00000000000477f0 T pmpi_info_free_
00000000000477f0 W pmpi_info_free__
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpi_intel.a
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpi_intel.so
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpi_intel.so.12
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpi_intel.so.12.0.0
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpi.so
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libmpl.so
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libopa.so
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libtvmpich.a
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libtvmpich.so
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libtvmpich.so.12
/opt/cray/pe/mpich/8.1.14/ofi/intel/19.0/lib/libtvmpich.so.12.0.0
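
The trailing underscore in the unresolved symbols (`mpi_info_free_` etc.) is the usual Fortran external-name mangling: with the ifort/gfortran defaults, an external name is lowercased and gets one trailing underscore, which is why these symbols only show up in the Fortran MPI libraries (libmpifort*, libfmpich) above. A minimal illustrative helper (the function name is ours, not from any tool):

```python
def fortran_mangle(name: str) -> str:
    """Return the common Fortran external-symbol name: lowercase plus a
    single trailing underscore (the ifort/gfortran default convention)."""
    return name.lower() + "_"

# The names ld reports as undefined are the mangled forms of the MPI
# Fortran bindings, e.g.:
for name in ["MPI_Info_free", "MPI_Type_commit", "MPI_Rsend"]:
    print(fortran_mangle(name))
```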

@climbfuji (Collaborator, Author)

@jedwards4b @mathomp4 Here I have another ESMF build error, this time on Narwhal. If you have time, please see the description above. Everything is the same (OS, compiler, MPI, ...), and it built just fine last time round. The difference is probably the move from edit to build_environment that we merged from the authoritative spack repository.

But I have no idea how this would change anything: the esmf.mk files look the same, and the build still reports Unicos and mpi, ...

I got it to build by making the following change:

--- a/var/spack/repos/builtin/packages/esmf/package.py
+++ b/var/spack/repos/builtin/packages/esmf/package.py
@@ -286,6 +286,7 @@ def setup_build_environment(self, env):
         if "+mpi" in spec:
             if "^cray-mpich" in self.spec:
                 env.set("ESMF_COMM", "mpi")
+                env.set("ESMF_CXXLINKLIBS", "-lmpifort -lmpi")
             elif "^mvapich2" in spec:
                 env.set("ESMF_COMM", "mvapich2")
             elif "^mpich" in spec:
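
For reference, the same workaround could presumably be tried outside spack by exporting the variable before running ESMF's make (ESMF_CXXLINKLIBS is the build variable the patch sets; whether these exact flags are right for other Cray installs is an assumption):

```shell
# Sketch of the equivalent manual workaround: set the extra link
# libraries in the environment before building ESMF, as the spack
# patch above does via setup_build_environment.
export ESMF_COMM=mpi
export ESMF_CXXLINKLIBS="-lmpifort -lmpi"
```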

I am hesitant to create a PR for our spack fork (or the authoritative spack repo), because

  • I don't know if it breaks ESMF elsewhere (probably not?)
  • I don't understand why it is needed now and wasn't needed earlier.

Any ideas?

@jedwards4b (Collaborator)

It's not at all clear to me why you would need that, but I'll try it and let you know if I run into any problems.

@climbfuji (Collaborator, Author)

Thanks @jedwards4b. Fortunately, making this change allowed me to finish the build on Narwhal.

@mathomp4 (Collaborator)

I'd give advice but I've never even had access to a Cray! I was going to say "No one uses ESMF_COMM=mpi" but, well, I guess you do!

It might be worth talking with ESMF (pinging @theurich) and having them add a cray-mpich ESMF_COMM?

@climbfuji (Collaborator, Author)

ESMF_COMM=mpi has been there forever, because ESMF has supported Crays for a very long time. And that setting worked just fine on Narwhal until the last updates to the ESMF package config on Narwhal. I just don't see how those would change the build; I can't see it from the generated esmf.mk file either.

@theurich

@climbfuji one thing I am wondering about is the use of icpc directly, instead of some sort of MPI compiler wrapper. Since this is ESMF_OS=Unicos, the typical expectation would be CC, cc, and ftn. Those wrappers supply all of the necessary compile and link flags. E.g., see the summary.dat from the nightly ESMF regression testing on a similar machine. On lines 128-137 you see the few explicitly set ESMF_ build environment variables. But also see on lines 212-217 that the Cray compiler wrappers are being used. Finally, if you search for mpich, you see it is not explicitly used in the linker flags, because it comes in through the compiler wrappers.

When this worked fine before, was that also with direct usage of icpc, or is there a chance it was actually picking up CC to link the C++ executable?

@climbfuji
Copy link
Collaborator Author

Update, somewhat related: I am also getting linker errors for esmf@8.4.1 on macOS now that I am building static libraries (related to nlohmann-json):

     66796    Making directory /Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/spack-src/obj/objg/Darwin.gfortranclang.64.openmpi.default/src/apps
              /ESMF_PrintInfo for apps output
     66797    /Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/openmpi-4.1.4-2w5uhmt/bin/mpif90 -c -fallow-argument-mismatch -g -Wall -Wextra -Wconversion -Wno-unused -Wno-unused
              -dummy-argument -fimplicit-none -fcheck=all,no-pointer -fPIC  -m64 -mcmodel=small -pthread -ffree-line-length-none -fno-backtrace -I/Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-st
              age-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/spack-src/src/apps/ESMF_PrintInfo -I/Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/
              spack-src/build_config/Darwin.gfortranclang.default -I/Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/spack-src/src/Infrastructure -
              I/Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/spack-src/src/Superstructure -I/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-e
              nv/install/apple-clang/13.1.6/esmf-8.4.1-vwqlqzx/include -I/Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/spack-src/src/include -I/
              Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/netcdf-c-4.9.2-lysv35e/include -I/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/n
              etcdf-fortran-4.6.0-u4g24rl/include -I/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/parallelio-2.5.9-wkwc6ly/include  -DESMF_NO_INTEGER_1_BYTE -DESMF_NO_INTEGER_
              2_BYTE -DESMF_MOAB=1 -DESMF_LAPACK=1 -DESMF_LAPACK_INTERNAL=1 -DESMF_NO_ACC_SOFTWARE_STACK=1 -DESMF_NETCDF=1 -DESMF_YAMLCPP=1 -DESMF_YAML=1 -DESMF_PIO=1 -DESMF_NO_OPENMP -DESMF_NO_OPENACC -DESMF_BO
              PT_g -DESMF_TESTCOMPTUNNEL -DSx86_64_small=1 -DESMF_OS_Darwin=1 -DESMF_COMM=openmpi -DESMF_DIR=/Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhe
              pzxnoqltvd/spack-src ESMF_PrintInfo.F90 -o /Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/spack-src/obj/objg/Darwin.gfortranclang.6
              4.openmpi.default/src/apps/ESMF_PrintInfo/ESMF_PrintInfo.o
     66798    /Applications/Xcode.app/Contents/Developer/usr/bin/make chkdir_apps
     66799    /Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/openmpi-4.1.4-2w5uhmt/bin/mpif90     -m64 -mcmodel=small -pthread -L/Users/heinzell/prod/spack-stack-1.3.0/envs/sky
              lab-env/install/apple-clang/13.1.6/esmf-8.4.1-vwqlqzx/lib -L/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/netcdf-c-4.9.2-lysv35e/lib -L/Users/heinzell/prod/spack
              -stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/netcdf-fortran-4.6.0-u4g24rl/lib -L/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/parallelio-2.5.9-wkwc6ly
              /lib -L/usr/local/Cellar/gcc/11.3.0/bin/../lib/gcc/11/gcc/x86_64-apple-darwin21/11/../../../ -Wl,-rpath,/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/esmf-8.4.1-
              vwqlqzx/lib -Wl,-rpath,/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/netcdf-c-4.9.2-lysv35e/lib -Wl,-rpath,/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env
              /install/apple-clang/13.1.6/netcdf-fortran-4.6.0-u4g24rl/lib  -Wl,-rpath,/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/parallelio-2.5.9-wkwc6ly/lib -Wl,-rpath,/u
              sr/local/Cellar/gcc/11.3.0/bin/../lib/gcc/11/gcc/x86_64-apple-darwin21/11/../../../ -o /Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/esmf-8.4.1-vwqlqzx/bin/ESMF_
              PrintInfo /Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/spack-src/obj/objg/Darwin.gfortranclang.64.openmpi.default/src/apps/ESMF_P
              rintInfo/ESMF_PrintInfo.o -lesmf   -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lnetcdf -lnetcdff -lnetcdf -lnetcdf -lpioc
     66800    ld: warning: dylib (/usr/local/Cellar/gcc/11.3.0/lib/gcc/11/libgfortran.dylib) was built for newer macOS version (12.2) than being linked (12.0)
     66801    ld: warning: dylib (/usr/local/Cellar/gcc/11.3.0/lib/gcc/11/libquadmath.dylib) was built for newer macOS version (12.2) than being linked (12.0)
  >> 66802    Undefined symbols for architecture x86_64:
     66803      "__ZNKSt13runtime_error4whatEv", referenced from:
     66804          __ZNK8nlohmann16json_abi_v3_11_26detail9exception4whatEv in libesmf.a(ESMCI_Info.o)
     66805          __ZNK8nlohmann16json_abi_v3_11_26detail9exception4whatEv in libesmf.a(ESMCI_Base.o)
     66806          __ZNK8nlohmann16json_abi_v3_11_26detail9exception4whatEv in libesmf.a(ESMCI_PIO_Handler.o)
     66807          __ZNK8nlohmann16json_abi_v3_11_26detail9exception4whatEv in libesmf.a(ESMC_InfoCDef.o)
     66808          __ZNK8nlohmann16json_abi_v3_11_26detail9exception4whatEv in libesmf.a(ESMCI_Array.o)

     ...

     68114          __ZN5ESMCI7SciComp6createEPKcPi in libesmf.a(ESMCI_Comp.o)
     68115          __ZNSt3__1L10__str_findIcmNS_11char_traitsIcEELm18446744073709551615EEET0_PKT_S3_S6_S3_S3_ in libesmf.a(ESMCI_Comp.o)
     68116          __ZNSt3__1L19__str_find_first_ofIcmNS_11char_traitsIcEELm18446744073709551615EEET0_PKT_S3_S6_S3_S3_ in libesmf.a(ESMCI_Comp.o)
     68117          Dwarf Exception Unwind Info (__eh_frame) in libesmf.a(ESMCI_Comp.o)
     68118          ...
     68119    ld: symbol(s) not found for architecture x86_64
  >> 68120    collect2: error: ld returned 1 exit status
  >> 68121    make[4]: *** [/Users/heinzell/prod/spack-stack-1.3.0/envs/skylab-env/install/apple-clang/13.1.6/esmf-8.4.1-vwqlqzx/bin/ESMF_PrintInfo] Error 1
  >> 68122    make[3]: *** [tree] Error 1
  >> 68123    make[2]: *** [build_apps] Error 2
  >> 68124    make[1]: *** [install_apps] Error 2
  >> 68125    make: *** [install] Error 2

See build log for details:
  /Users/heinzell/prod/spack-stack-1.3.0/cache/build_stage/spack-stage-esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd/spack-build-out.txt

==> Warning: Skipping build of mapl-2.35.2-ey43mp7jryw7r4g55c5yl6hclva3v4o2 since esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd failed
==> Error: mapl-2.35.2-ey43mp7jryw7r4g55c5yl6hclva3v4o2: Package was not installed
==> Error: Installation request failed.  Refer to reported errors for failing package(s).
==> Removing failure mark on esmf-8.4.1-vwqlqzxicbx3ynpqcueqhepzxnoqltvd
==> Removing failure mark on mapl-2.35.2-ey43mp7jryw7r4g55c5yl6hclva3v4o2

@climbfuji climbfuji mentioned this issue Apr 14, 2023
@climbfuji
Copy link
Collaborator Author

See #545 (comment) for the solution for macOS (but not yet for Narwhal).

@theurich
Copy link

@climbfuji one thing I am wondering about is the direct usage of icpc, instead of some sort of MPI compiler wrapper. Since this is ESMF_OS=Unicos the typical expectation would be CC, cc, and ftn. Those wrappers supply all of the necessary compile and link flags. E.g. see the summary.dat from the nightly ESMF regression testing on a similar machine. On lines 128-137 you see the few explicitly set ESMF_ build environment variables. But also note on lines 212-217 that the Cray compiler wrappers are being used. Finally, if you search for mpich, you see it is not explicitly passed as a linker flag, because it comes in through the compiler wrappers.

When this worked fine before, was this also with direct usage of icpc, or is there a chance it was actually picking CC to link the C++ executable?

@climbfuji - in order to move forward with the Narwhal issue... I had a few questions wrt whether the Cray compiler wrappers are being used or not. Having complete ESMF build logs under Spack for the previous successful build vs. the now-failing build would be great.

@climbfuji
Copy link
Collaborator Author

Unfortunately I don't have those, because I made that manual change (add -lmpi...) and installed it that way. I will have to reproduce.

@theurich
Copy link

Don't worry about the failing one. I am actually more interested in the previous build that worked, before you had to add the extra -lmpi flag. Do you still have logs from that old build, or do they get overwritten/deleted after a while?

And again, I am mostly interested in verifying the suspicion that previously, when it was working, the ftn, CC, cc Cray wrappers were used, and now (where things fail without the extra -lmpi) it's with direct usage of the Intel ifort, icpc, icc front-ends. You may know whether this is how it is/was, even without the old logs?

@climbfuji
Copy link
Collaborator Author

log.install.intel-2021.4.0-narwhal-esmf-ok.gz

Please see here for the last successful install w/o adding -lmpi

@theurich
Copy link

@climbfuji - the issue on Narwhal is a bit interesting because really all that changed here is the Spack part. You are still using the same compiler and ESMF version.
It seems like you have a working work-around, which is great. On the other hand, I am actually thinking that v8.4.2b03, the tag you just tested for the macOS issue, might work fine on Narwhal even without the -lmpi work-around under Spack. Is it an option for you to move to ESMF v8.4.2 on Narwhal, or are there other constraints to consider?

@climbfuji climbfuji added the INFRA JEDI Infrastructure label Apr 25, 2023
@climbfuji
Copy link
Collaborator Author

Unfortunately I cannot test on Narwhal this week. Having v8.3.0b09 there is not critical, so let's check next week whether v8.4.2b03 works.

@theurich
Copy link

Okay, that's fine. I will move forward with the v8.4.2 release either way, since EMC (i.e. Jun) is really pushing for it.

@climbfuji
Copy link
Collaborator Author

Yes please. Definitely solves the macOS problem!

@edwardhartnett
Copy link
Collaborator

@theurich my understanding is that your team has taken over the spack build of ESMF. Is that correct?

If not, when will that happen?

@jedwards4b
Copy link
Collaborator

@edwardhartnett : esmf in spack develop represents a collaboration that includes Gerhard as well as Dom, myself, Ufuk, and Alexander Richert.

@edwardhartnett
Copy link
Collaborator

@jedwards4b in a recent meeting the ESMF team volunteered to take on this task and to ensure that spack can always build the latest ESMF releases.

This is intended to reduce the resources that NOAA has to spend supporting the home-rolled ESMF build, which has been a considerable drain. We have only a small team to manage the installs of over 100 packages, and ESMF has, for a long time, been the tall pole in the tent, requiring a great deal of effort from the NOAA team - far more than any other single package.

It is expected that @AlexanderRichert-NOAA will no longer need to devote time to maintaining the spack build, and @Hang-Lei-NOAA will no longer have to spend so much time debugging and fixing ESMF build problems on NOAA platforms.

Is the latest ESMF release supported by spack? If not, when will it be?

@jedwards4b
Copy link
Collaborator

The 8.4.2 release was added to spack develop earlier today.

@theurich
Copy link

@edwardhartnett - The ESMF team is taking an active role in maintaining the ESMF package under Spack. Of course it is a collaborative effort.

@climbfuji
Copy link
Collaborator Author

@edwardhartnett I think this has worked very well over the last weeks or months. To be sure, @AlexanderRichert-NOAA and I still have to spend some time helping to debug and test on systems to which the ESMF developers have no access, but apart from that @theurich and his colleagues have taken over the maintenance tasks for ESMF (and @mathomp4 for MAPL).

@climbfuji
Copy link
Collaborator Author

FWIW, I had to apply the same hack (add env.set("ESMF_CXXLINKLIBS", "-lmpifort -lmpi")) on Gaea C5 for esmf@8.3.0b09.
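For anyone following along, the hack above can be sketched as follows. This is a minimal self-contained illustration, not the actual spack recipe: `EnvMap` is a stand-in for spack's `EnvironmentModifications` object, and in the real esmf `package.py` the `env.set(...)` call lives in the package's `setup_build_environment` hook.

```python
class EnvMap:
    """Minimal stand-in for spack's EnvironmentModifications (illustration only)."""

    def __init__(self):
        self.vars = {}

    def set(self, name, value):
        self.vars[name] = value


def setup_build_environment(needs_fortran_mpi_libs, env):
    """Mirror of the workaround from this thread: force the Fortran MPI
    libraries onto ESMF's C++ link line so that icpc-driven links of the
    ESMF apps resolve the MPI Fortran symbols."""
    if needs_fortran_mpi_libs:
        env.set("ESMF_CXXLINKLIBS", "-lmpifort -lmpi")


env = EnvMap()
setup_build_environment(True, env)
print(env.vars["ESMF_CXXLINKLIBS"])
```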

@climbfuji
Copy link
Collaborator Author

But fortunately we have a way forward with esmf@8.4.2!
