[gmx-users] Gromacs + Plumed parallel

Mark Abraham mark.j.abraham at gmail.com
Fri Feb 3 15:31:54 CET 2017


Hi,

By default, the binary name gmx_d implies GROMACS was not built with
GMX_MPI=on. The fact that gmx_d mdrun accepts -ntmpi confirms that it was
built with the default thread-MPI configuration instead. The PLUMED error
message says that MPI isn't initialized, which it won't be if GROMACS
(and/or PLUMED) wasn't built with MPI support :-)
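
For reference, here is a minimal sketch of such a rebuild, reusing the
cmake flags from your message plus GMX_MPI=on (whether you also need to
point cmake at MPI compiler wrappers depends on your toolchain):

  # reconfigure the (PLUMED-patched) source tree with real MPI enabled
  cd /home/nikolaev_d/JACK/gromacs-5.1.2/build
  cmake .. -DGMX_MPI=on -DGMX_DOUBLE=ON -DGMX_GPU=OFF \
           -DGMX_BUILD_OWN_FFTW=ON \
           -DCMAKE_INSTALL_PREFIX=/home/nikolaev_d/JACK
  make && make install

Note that with GMX_MPI=on the double-precision binary is named gmx_mpi_d
rather than gmx_d.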

Please double-check whether the PLUMED docs suggest building GROMACS with
MPI support if you want PLUMED to have MPI support, and if they don't,
please suggest it to them :-)
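
Once you have an MPI build, the launch would look something like this
(the binary path and launcher are assumptions about your installation;
with real MPI the rank count comes from mpirun -np, not from -ntmpi):

  # two MPI ranks, four OpenMP threads per rank
  mpirun -np 2 /home/nikolaev_d/JACK/bin/gmx_mpi_d mdrun -deffnm md_1 -v \
         -ntomp 4 -plumed plumed.dat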

Mark

On Fri, Feb 3, 2017 at 3:01 PM <nikolaev at spbau.ru> wrote:

> Hello, everyone!
>
> I have a problem running GROMACS with PLUMED on more than one
> processor. When I install PLUMED and patch GROMACS with it, it reports
> that everything is OK:
> """
> You are patching in runtime mode
> Be warned that when you will run MD you will use the PLUMED version
> pointed at
> by the PLUMED_KERNEL environment variable
>
> PLUMED is compiled with MPI support so you can configure gromacs-5.1.2
> with MPI """
>
> But when I try to run something like this:
>
> /home/nikolaev_d/JACK/gromacs-5.1.2/build/bin/gmx_d mdrun -deffnm md_1 -v
> -ntmpi 2 -ntomp 4 -plumed plumed.dat
>
> There is an error:
>
> """
> ********** STACK DUMP **********
>
> /home/nikolaev_d/JACK/lib/libplumedKernel.so(_ZN4PLMD9Exception28abortIfExceptionsAreDisabledEv+0x3f)
> [0x7f2ac72d72b9]
>
> /home/nikolaev_d/JACK/lib/libplumedKernel.so(_ZN4PLMD9ExceptionC2ERKSsS2_jS2_+0x28)
> [0x7f2ac72d7766]
>
> /home/nikolaev_d/JACK/lib/libplumedKernel.so(_ZN4PLMD12Communicator8Set_commEPv+0x80)
> [0x7f2ac72d3e8c]
>
> /home/nikolaev_d/JACK/lib/libplumedKernel.so(_ZN4PLMD10PlumedMain3cmdERKSsPv+0x40b4)
> [0x7f2ac7198bee]
> /home/nikolaev_d/JACK/lib/libplumedKernel.so(plumedmain_cmd+0xed)
> [0x7f2ac71a0ea7]
> /home/nikolaev_d/JACK/gromacs-5.1.2/build/bin/gmx_d(do_md+0x254e)
> [0x41b62e]
> /home/nikolaev_d/JACK/gromacs-5.1.2/build/bin/gmx_d(mdrunner+0x19e7)
> [0x433607]
> /home/nikolaev_d/JACK/gromacs-5.1.2/build/bin/gmx_d() [0x433d54]
>
> /home/nikolaev_d/JACK/gromacs-5.1.2/build/bin/../lib/libgromacs_d.so.1(+0xd45cb6)
> [0x7f2b41e53cb6]
> /lib/x86_64-linux-gnu/libpthread.so.0(+0x80a4) [0x7f2b400e70a4]
> /lib/x86_64-linux-gnu/libc.so.6(clone+0x6d) [0x7f2b3f36e62d]
> ******** END STACK DUMP ********
>
> +++ Internal PLUMED error
> +++ file Communicator.cpp, line 102
> +++ message: assertion failed initialized(), you are trying to use an MPI
> function, but MPI is not initialized
> """
>
> However, when I try to run
> /home/nikolaev_d/JACK/gromacs-5.1.2/build/bin/gmx_d mdrun -deffnm md_1 -v
> -ntmpi 1 -ntomp 1 -plumed plumed.dat
>
> everything works fine.
>
> What's the problem? I've read that it might be an incompatibility of
> some parallelization parameters, but I have tried installing GROMACS in
> different ways. The last time, I used:
>
> cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_GPU=OFF -DGMX_DOUBLE=ON
> -DCMAKE_INSTALL_PREFIX=/home/nikolaev_d/JACK
>
> Here I attach PLUMED's config.log and Makefile.conf.
>
> Thank you in advance,
> Dmitrii.

