4.7.3 BUGFIX
In version 4.7.0, routines handling rank-8 arrays were introduced, under specific
preprocessor flags, for various functions including MPI_Bcast.

However, a bug was also introduced in the overloaded functions broadcasting
scalar variables: the input variable "data" was copied into an internal
variable "data_", which was then broadcast instead of "data" itself.
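
To make the mechanism explicit, here is a minimal sketch of the problematic
pattern (a simplified, self-contained illustration, not the exact library code;
the subroutine name and the declared intents are assumptions):

subroutine bcast_int_scalar_buggy(comm,data,root)
  use mpi
  integer,intent(in)          :: comm
  integer,intent(inout)       :: data      ! intent assumed for this sketch
  integer,intent(in),optional :: root
  integer                     :: data_(1),rank,ierr
  rank=0;if(present(root))rank=root
  if(comm==MPI_COMM_NULL)return
  data_(1) = data                                    ! local copy, filled on every rank
  call MPI_BCAST(data_,1,MPI_INTEGER,rank,comm,ierr) ! the broadcast lands in data_ only
  ! BUG: "data" is never updated from data_, so non-root ranks keep their old value
end subroutine bcast_int_scalar_buggy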

As a result, the value held by the master was broadcast into "data_" only:
"data" on the other cores was never updated, so each of them kept its own
local value.

This of course rendered the code unusable in any parallel scenario.

With this commit I do not remove the "data_" variable yet, but I broadcast the
original input variable "data" directly.
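
In practice the change reduces to the two lines below (a sketch of the
corrected body, shown for the integer overload and mirroring the hunks in the
diff; the logical, double precision and double complex overloads change
analogously, and the "data_" assignment is kept for now even though its result
is no longer used):

data_(1) = data                                   ! kept for the time being, no longer used
call MPI_BCAST(data,1,MPI_INTEGER,rank,comm,ierr) ! broadcast the scalar "data" in place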

The problem may in fact not be completely solved: the copy was apparently
introduced to work around an MPI error message warning that a scalar input
variable could not be broadcast. I have not yet encountered such a message
with OpenMPI 3 and 4 across different setups.
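
A quick way to probe this on a given MPI stack is a minimal standalone program
(not part of the commit) that broadcasts a scalar directly:

program check_scalar_bcast
  use mpi
  implicit none
  integer :: ierr,myrank,val
  call MPI_INIT(ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD,myrank,ierr)
  val=0
  if(myrank==0) val=42                                   ! only the root holds the value
  call MPI_BCAST(val,1,MPI_INTEGER,0,MPI_COMM_WORLD,ierr)
  print *,"rank",myrank,"val",val                        ! every rank should print 42
  call MPI_FINALIZE(ierr)
end program check_scalar_bcast

If an MPI implementation objected to scalar buffers, it would show up here;
running it cleanly would be consistent with the observation above for
OpenMPI 3 and 4.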

If such a problem should occur, we will have to think of a definitive
solution.
lcrippa committed Sep 3, 2021
1 parent 09e5af6 · commit fe78179
Showing 2 changed files with 5 additions and 5 deletions.
CMakeLists.txt: 2 changes (1 addition & 1 deletion)
@@ -3,7 +3,7 @@
##################################################
CMAKE_MINIMUM_REQUIRED(VERSION 3.0.0)
PROJECT(scifor Fortran)
-SET(VERSION 4.6.39)
+SET(VERSION 4.7.3)


MESSAGE(STATUS "OS: ${CMAKE_SYSTEM_NAME} ${CMAKE_SYSTEM_VERSION}")
src/SF_MPI/SF_MPI.f90: 8 changes (4 additions & 4 deletions)
@@ -340,7 +340,7 @@ subroutine MPI_Bcast_Bool_0(comm,data,root)
rank=0;if(present(root))rank=root
if(comm==MPI_COMM_NULL)return
data_(1) = data
-call MPI_BCAST(data_,1,MPI_LOGICAL,rank,comm,ierr)
+call MPI_BCAST(data,1,MPI_LOGICAL,rank,comm,ierr)
call Error_MPI(sub='MPI_Bcast_Bool_0')
end subroutine MPI_Bcast_Bool_0
!
@@ -438,7 +438,7 @@ subroutine MPI_Bcast_Int_0(comm,data,root)
rank=0;if(present(root))rank=root
if(comm==MPI_COMM_NULL)return
data_(1) = data
-call MPI_BCAST(data_,1,MPI_INTEGER,rank,comm,ierr)
+call MPI_BCAST(data,1,MPI_INTEGER,rank,comm,ierr)
call Error_MPI(sub='MPI_Bcast_Int_0')
end subroutine MPI_Bcast_Int_0
!
@@ -535,7 +535,7 @@ subroutine MPI_Bcast_Dble_0(comm,data,root)
rank=0;if(present(root))rank=root
if(comm==MPI_COMM_NULL)return
data_(1) = data
-call MPI_BCAST(data_,1,MPI_DOUBLE_PRECISION,rank,comm,ierr)
+call MPI_BCAST(data,1,MPI_DOUBLE_PRECISION,rank,comm,ierr)
call Error_MPI(sub='MPI_Bcast_Dble_0')
end subroutine MPI_Bcast_Dble_0
!
@@ -633,7 +633,7 @@ subroutine MPI_Bcast_Cmplx_0(comm,data,root)
rank=0;if(present(root))rank=root
if(comm==MPI_COMM_NULL)return
data_(1) = data
-call MPI_BCAST(data_,1,MPI_DOUBLE_COMPLEX,rank,comm,ierr)
+call MPI_BCAST(data,1,MPI_DOUBLE_COMPLEX,rank,comm,ierr)
call Error_MPI(sub='MPI_Bcast_Cmplx_0')
end subroutine MPI_Bcast_Cmplx_0
!
