
Pass multisim record to Colvars for mwABF #4

Draft: wants to merge 1 commit into base: main
Conversation

jhenin (Member) commented Mar 6, 2024

This PR is for sharing, testing, and discussion. Changes will be contributed to the main Colvars and Gromacs repositories.

The MPI-based replica communication routines are now shared by Gromacs, LAMMPS, and Tinker-HP. Ultimately they should go into the proxy base class; however, we may need to play with the build system of the various codes to make sure that the base class compiles correctly both with and without MPI present. I haven't yet found a clean way to achieve that, so I've replicated the (short) code.
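
For reference, a minimal sketch of what the duplicated replica-communication code looks like in spirit, assuming the proxy stores an inter-replica MPI communicator; the class and member names here are illustrative, not copied from any of the actual engine proxies:

```cpp
// Illustrative only: MPI-based replica communication of the kind duplicated across
// the GROMACS/LAMMPS/Tinker-HP proxies. Assumes mpi.h is available at compile time.
#include <mpi.h>

class colvarproxy_engine_replicas {  // hypothetical name
public:
  MPI_Comm replica_comm = MPI_COMM_NULL;  // inter-replica communicator from the engine

  int replica_index() const {
    int rank = 0;
    MPI_Comm_rank(replica_comm, &rank);
    return rank;
  }

  int num_replicas() const {
    int size = 1;
    MPI_Comm_size(replica_comm, &size);
    return size;
  }

  void replica_comm_barrier() {
    MPI_Barrier(replica_comm);
  }

  int replica_comm_recv(char *msg_data, int buf_len, int src_rep) {
    MPI_Status status;
    MPI_Recv(msg_data, buf_len, MPI_CHAR, src_rep, 0, replica_comm, &status);
    int count = 0;
    MPI_Get_count(&status, MPI_CHAR, &count);
    return count;  // number of bytes actually received
  }

  int replica_comm_send(char *msg_data, int msg_len, int dest_rep) {
    MPI_Send(msg_data, msg_len, MPI_CHAR, dest_rep, 0, replica_comm);
    return msg_len;
  }
};
```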

giacomofiorin (Member) commented
Looks like a great starting point! I hadn't realized that all that was needed was just adding another MDModules notification :-D

I also fully agree with simplifying the inheritance tree by bringing MPI support into the base class and just leaving NAMD/Charm++ as the special case (which it is). Conditional compilation works okay for Tcl, so I see no reason not to do the same for MPI, using the MPI_VERSION macro as the conditional.

Then the base class can just initialize its communicator-class pointer as MPI_COMM_NULL, and let each derived proxy initialize it using data passed by the MD engine.
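
A rough sketch of that layout, assuming the check happens in a header where mpi.h may or may not already have been included; the typedef, macro, and class names are made up for illustration:

```cpp
// Sketch of the suggested base-class design (names are illustrative, not the actual Colvars code).
// If MPI headers are in scope, MPI_VERSION is defined and the real communicator type is used;
// otherwise a placeholder keeps the base class compilable without MPI.
#if defined(MPI_VERSION)
typedef MPI_Comm colvars_replica_comm_t;
#define COLVARS_REPLICA_COMM_NULL MPI_COMM_NULL
#else
typedef void *colvars_replica_comm_t;
#define COLVARS_REPLICA_COMM_NULL nullptr
#endif

class colvarproxy_replicas_base {
protected:
  // Starts out as the null communicator; each derived engine proxy (GROMACS, LAMMPS,
  // Tinker-HP) overwrites it with the communicator handed over by the MD engine.
  colvars_replica_comm_t replica_comm = COLVARS_REPLICA_COMM_NULL;

public:
  virtual ~colvarproxy_replicas_base() = default;
  virtual void set_replica_communicator(colvars_replica_comm_t comm) { replica_comm = comm; }
};
```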

jhenin (Member, Author) commented Mar 8, 2024

The problem with MPI_VERSION is that it is defined in mpi.h, so we cannot use it to decide whether we can include mpi.h in the first place. I can't see a way to decide that without a hint from the build system.
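
In other words, a guard like the following can never fire, because the macro only comes into existence once the header it is supposed to guard has been included:

```cpp
// Chicken-and-egg: MPI_VERSION is defined inside mpi.h itself, so this test is
// always false at the point where we would need it.
#if defined(MPI_VERSION)
#include <mpi.h>   // never reached without a hint from the build system
#endif
```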

giacomofiorin (Member) commented Mar 8, 2024

> The problem with MPI_VERSION is that it is defined in mpi.h, so we cannot use it to decide whether we can include mpi.h in the first place. I can't see a way to decide that without a hint from the build system.

Well, each engine handles that differently, but each one has preprocessor macros that can identify (a) the engine and (b) the presence of a bona fide MPI library. I could list all the sub-cases here, but it wouldn't take much more effort to draft a PR in the Colvars repo instead. A proposed way to handle this is in this PR: Colvars/colvars#720 (a rough sketch of the idea follows below).

As for a proof of concept that the inter-replica communicator is accessible in GROMACS, I think you've basically got it with this PR.
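
For illustration, the engine-specific detection could look roughly like the sketch below. GMX_LIB_MPI is a real GROMACS build macro, but the COLVARS_HAS_MPI name and the non-GROMACS branches are placeholders, not the actual macros used in Colvars/colvars#720:

```cpp
// Hedged sketch: let each engine's own build macros decide whether mpi.h is safe to include.
#if defined(GMX_LIB_MPI) && GMX_LIB_MPI
// GROMACS compiled against an external MPI library
#define COLVARS_HAS_MPI 1
#endif
// ... analogous checks for LAMMPS, Tinker-HP, etc. would go here ...

#if defined(COLVARS_HAS_MPI)
#include <mpi.h>
#endif
```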

giacomofiorin added commits to Colvars/colvars that referenced this pull request between Sep 23 and Oct 5, 2024:

Simplified version of the code in Colvars/gromacs#4
Patch core GROMACS files