
Update OpenMPI to version 4.1.4 [12.4.x] #7898

Conversation

fwyzard (Contributor) commented May 30, 2022

The Open MPI community is pleased to announce the Open MPI v4.1.4 release.
This release contains a number of bug fixes, as well as the UCC collectives component to accelerate collectives on systems with the UCC library installed.

Open MPI v4.1.4 can be downloaded from the Open MPI website:

https://www.open-mpi.org/software/ompi/v4.1/

Changes to v4.1.4 compared to v4.1.3:

  • fix possible length integer overflow in numerous non-blocking collective operations (see the first sketch below);
  • fix segmentation fault in UCX if the MPI Tool interface is finalized before MPI_Init is called (see the second sketch below);
  • remove /usr/bin/python dependency in configure;
  • fix OMPIO issue with long double etypes;
  • update treematch topology component to fix numerous correctness issues;
  • fix memory leak in UCX MCA parameter registration;
  • fix slow closing of file descriptors on non-Linux systems, which could appear to users as a hang;
  • fix attribute handling on GCC 11 affected by pointer aliasing;
  • fix multithreaded race in UCX PML's datatype handling;
  • fix a correctness issue in the CUDA Reduce algorithm;
  • fix compilation issue with CUDA GPUDirect RDMA support;
  • fix shmem_calloc(..., 0) to conform to the OpenSHMEM specification (see the third sketch below);
  • add UCC collectives component;
  • fix divide-by-zero issue in the OMPI IO component;
  • fix compile issue with libnl when not in standard search locations.
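
A minimal sketch in C of the pattern the first item concerns, with an illustrative count and reduction that are assumptions, not taken from the PR: a non-blocking collective whose element count fits in an int, but whose derived byte length (count times the datatype extent) exceeds INT_MAX.

```c
/* Hedged illustration: 3e8 doubles = 2.4e9 bytes per buffer, so the
 * internal byte-length computation exceeds INT_MAX even though the
 * count itself is a valid int.  Actually running this needs roughly
 * 4.8 GB of memory; error handling is omitted for brevity. */
#include <mpi.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    const int count = 300000000;                   /* fits in an int */
    double *sendbuf = malloc(count * sizeof(double));
    double *recvbuf = malloc(count * sizeof(double));
    for (int i = 0; i < count; ++i)
        sendbuf[i] = 1.0;

    /* Before v4.1.4, length computations inside non-blocking
     * collectives like this one could overflow a 32-bit integer. */
    MPI_Request req;
    MPI_Iallreduce(sendbuf, recvbuf, count, MPI_DOUBLE, MPI_SUM,
                   MPI_COMM_WORLD, &req);
    MPI_Wait(&req, MPI_STATUS_IGNORE);

    free(sendbuf);
    free(recvbuf);
    MPI_Finalize();
    return 0;
}
```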

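A sketch of the call ordering behind the second item; the MPI standard allows the MPI Tool (MPI_T) interface to be used and finalized before MPI_Init, and it is this ordering that previously crashed UCX-enabled builds (the variable queries are elided here; only the ordering matters):

```c
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided;

    /* MPI_T may be initialized without initializing MPI itself. */
    if (MPI_T_init_thread(MPI_THREAD_SINGLE, &provided) != MPI_SUCCESS)
        return 1;

    /* ... query control/performance variables here ... */

    /* Finalizing MPI_T before MPI_Init is legal, and is the step
     * that could previously segfault in UCX builds. */
    MPI_T_finalize();

    MPI_Init(&argc, &argv);
    printf("MPI initialized after MPI_T was finalized\n");
    MPI_Finalize();
    return 0;
}
```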
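
And a sketch of the zero-size case named in the shmem_calloc item; per the OpenSHMEM specification a zero-element request is expected to return a null pointer without allocating (compiling with Open MPI's oshcc wrapper is an assumption about the toolchain):

```c
#include <shmem.h>
#include <stdio.h>

int main(void)
{
    shmem_init();

    /* v4.1.4 makes this zero-count call conform to the spec. */
    long *buf = shmem_calloc(0, sizeof(long));
    printf("shmem_calloc(0, ...) returned %p\n", (void *)buf);

    shmem_free(buf);   /* shmem_free(NULL) is specified as a no-op */
    shmem_finalize();
    return 0;
}
```
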
fwyzard (Contributor, Author) commented May 30, 2022

backport #7897

fwyzard (Contributor, Author) commented May 30, 2022

type bugfix

cmsbuild (Contributor) commented May 30, 2022

A new Pull Request was created by @fwyzard (Andrea Bocci) for branch IB/CMSSW_12_4_X/master.

@cmsbuild, @smuzaffar, @aandvalenzuela, @iarspider can you please review it and eventually sign? Thanks.
@perrotta, @dpiparo, @qliphy you are the release managers for this.
cms-bot commands are listed here

fwyzard (Contributor, Author) commented May 30, 2022

please test

fwyzard changed the title from "Update OpenMPI to version 4.1.4" to "Update OpenMPI to version 4.1.4 [12.4.x]" on May 30, 2022
cmsbuild (Contributor) commented

-1

Failed Tests: UnitTests
Summary: https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-dfd8d1/25067/summary.html
COMMIT: b255c66
CMSSW: CMSSW_12_4_X_2022-05-29-2300/el8_amd64_gcc10
User test area: For local testing, you can use /cvmfs/cms-ci.cern.ch/week1/cms-sw/cmsdist/7898/25067/install.sh to create a dev area with all the needed externals and cmssw changes.

Unit Tests

I found errors in the following unit tests:

---> test SiStripDAQ_O2O_test had ERRORS

Comparison Summary

Summary:

  • No significant changes to the logs found
  • Reco comparison results: 0 differences found in the comparisons
  • DQMHistoTests: Total files compared: 50
  • DQMHistoTests: Total histograms compared: 3664365
  • DQMHistoTests: Total failures: 2
  • DQMHistoTests: Total nulls: 0
  • DQMHistoTests: Total successes: 3664341
  • DQMHistoTests: Total skipped: 22
  • DQMHistoTests: Total Missing objects: 0
  • DQMHistoSizes: Histogram memory added: -1085.156 KiB (49 files compared)
  • DQMHistoSizes: changed (11634.0, ...): -30.252 KiB GEM/Efficiency
  • DQMHistoSizes: changed (23234.0, ...): -57.619 KiB GEM/Efficiency
  • DQMHistoSizes: changed (35034.0, ...): -116.275 KiB GEM/Efficiency
  • Checked 208 log files, 45 edm output root files, 50 DQM output files
  • TriggerResults: no differences found

smuzaffar (Contributor) commented

+externals
@qliphy @perrotta feel free to include this for 12.4.X

cmsbuild (Contributor) commented

This pull request is fully signed and it will be integrated in one of the next IB/CMSSW_12_4_X/master IBs (but tests are reportedly failing). This pull request will now be reviewed by the release team before it's merged. @perrotta, @dpiparo, @qliphy (and backports should be raised in the release meeting by the corresponding L2)

qliphy (Contributor) commented May 31, 2022

merge

cmsbuild merged commit af0ac23 into cms-sw:IB/CMSSW_12_4_X/master on May 31, 2022
fwyzard deleted the IB/CMSSW_12_4_X/master_OpenMPI_4.1.4 branch on July 6, 2022