[gmx-developers] toolchains and MPI interoperability

Eric Irrgang ericirrgang at gmail.com
Fri Aug 19 11:57:06 CEST 2022


Thanks, Joe. I guess I didn't provide sufficient background.

I should also say that this thread is especially targeted at anyone who is already building MPI-enabled software that links to libgromacs_mpi.so.

We are trying to make it easier for the gromacs libraries to be linked into MPI-aware client code, and to coordinate sharing of MPI resources by allocating subcommunicators. A key use case is to be able to build the Python gmxapi package such that an MPI communicator can be acquired from the mpi4py Python package, then passed to libgromacs for use instead of MPI_COMM_WORLD.

We previously introduced an optional template header (https://gitlab.com/gromacs/gromacs/-/blob/main/api/gmxapi/include/resourceassignment.cmakein.h) to allow client code to pass MPI_Comm objects if and only if the installation and client build are mutually MPI compatible.

This means, for instance, that a Python gmxapi _user_ (or `pip install`, on behalf of the user) needs to be able to find a toolchain that is compatible with both mpi4py and the GROMACS installation, which was likely installed at a different time, likely by a different person, and maybe without the benefit of HPC-style environment modules.

> As to the first question, do we have any evidence that the information we get from FindMPI.cmake is insufficient for our purposes? My default position would be that we should change our cmake to be more in line with standard usage rather than try to warp the standard usage to fit our legacy approach. 

Right. Both GROMACS and client software should use FindMPI.

So, to clarify my first question:

How do we help a client build system to get the same results from FindMPI that the GROMACS build system got?

We added the gromacs-hints.cmake file to help client build systems find the same CMAKE_CXX_COMPILER, and that may be sufficient to force FindMPI to _fail_ in the client build system if it finds an incompatible mpicxx, but that's only half an answer.

I would think we should record MPI_CXX_COMPILER (which is both an input and an output of FindMPI) in gromacs-hints.cmake, but maybe someone on the list has an idea of why that would be inappropriate or insufficient.
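
For concreteness, here is a minimal sketch of what the extended hints file could contain. The paths are illustrative placeholders, and only the MPI_CXX_COMPILER entry is the proposed addition:

    # Sketch of a generated gromacs-hints.cmake, to be loaded with
    # `cmake -C path/to/gromacs-hints.cmake`. A -C preload script
    # populates the cache, so each entry uses the CACHE form of set().

    # Recorded today: the (non-wrapper) compiler used to build GROMACS.
    set(CMAKE_CXX_COMPILER "/usr/bin/g++-12" CACHE FILEPATH
        "C++ compiler used to build GROMACS")

    # Proposed addition: the MPI compiler wrapper that FindMPI resolved
    # when GROMACS was configured, so that a client's find_package(MPI)
    # starts from the same hint instead of whatever mpicxx happens to be
    # first on PATH.
    set(MPI_CXX_COMPILER "/usr/lib64/mpich/bin/mpicxx" CACHE FILEPATH
        "MPI C++ compiler wrapper used when configuring GROMACS")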

> Since MPI_COMM_WORLD is no longer directly exposed, it might be the case that the answer to your second question involves making the utility module an OBJECT library so that MPI can be linked privately. Then we could just expose an API call that gives the MPI version/compiler info for the utility module. I am a bit out of my depth here, so let me know if this solution is way off base.

This is about client software that is built against the GROMACS installation, and which does not have access to the build tree. It could be helpful to have run time checks, in case someone replaces their mpi4py package, for instance, but that is a secondary concern.

The primary problem is getting a compatible toolchain. For that, we have to rely on machinery accessible through `find_package(gmxapi)` and/or `find_package(gromacs)`, augmented by the help we can provide through documentation, status messages, and the file we provide for use as `cmake -C path/to/gromacs-hints.cmake` (which we automatically insert under the hood of `pip install gmxapi`).
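
To make the intended workflow concrete, here is a rough sketch of the client side. Treat the details as assumptions rather than documentation: the imported target name Gromacs::gmxapi follows my reading of the installed config files, and the project name and paths are placeholders.

    # Client configured against the hints file GROMACS installed:
    #   cmake -C /path/to/gromacs-hints.cmake -S . -B build
    cmake_minimum_required(VERSION 3.16)
    project(sample_client CXX)

    # Locate the gmxapi CMake config from the GROMACS installation.
    find_package(gmxapi REQUIRED)

    # With the hinted toolchain, this should resolve the same MPI that
    # GROMACS was built with, or else fail loudly in the client build.
    find_package(MPI REQUIRED COMPONENTS CXX)

    add_executable(sample_client main.cpp)
    target_link_libraries(sample_client PRIVATE Gromacs::gmxapi MPI::MPI_CXX)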

> On Wed, Aug 17, 2022 at 2:03 PM Eric Irrgang <ericirrgang at gmail.com> wrote:
> Hi Devs.
> 
> I'm writing to ask for input on improving support for integrating gromacs libraries into external MPI-aware code.
> 
> Background: I'm trying to finish up the Python bindings to allow an MPI communicator to be passed to libgromacs (to be used instead of MPI_COMM_WORLD).
> 
> Here are some questions that need to be answered. If you have any input, please let me know.
> 
> If you prefer to log your responses in the issue tracking system, please see https://gitlab.com/gromacs/gromacs/-/issues/3777 and https://gitlab.com/gromacs/gromacs/-/issues/4447
> 
> 
> Questions:
> 
> * What should we add to https://gitlab.com/gromacs/gromacs/-/blob/main/src/gromacs/gromacs-hints.in.cmake to help hint `find_package(MPI ...)`? (https://cmake.org/cmake/help/latest/module/FindMPI.html)
> 
> * If a user of a client project has difficulty determining the actual compiler path associated with the MPI compiler wrapper they are advised to use, how might we provide better feedback from our `gromacs_check_compiler()` CMake function? https://gitlab.com/gromacs/gromacs/-/blob/main/src/gromacs/gromacs-config.cmake.cmakein#L128
> 
> 
> Additional background:
> 
> With https://gitlab.com/gromacs/gromacs/-/issues/3671, we resolved that the appropriate way to configure CMake is to provide the non-wrapper compilers to CMAKE_CXX_COMPILER and to provide the MPI-wrapper paths only to MPI_CXX_COMPILER (if necessary).
> 
> However, FindMPI only reports the compiler wrapper path, by setting MPI_<lang>_COMPILER. It does not gain or report any additional insight into the path to the wrapped compiler, as far as I can tell. It _does_ find and use the appropriate "showme" option, and it does its best to confirm that the wrapped compiler and CMAKE_<LANG>_COMPILER are the same or compatible, but FindMPI.cmake does not return the MPI_COMPILE_CMDLINE that it calculates, and it may not have an absolute path to the compiler, anyway.
> 
> Best,
> Eric


