[gmx-users] gromacs 4.6.7 patched with plumed 2.1.5 plumed replica exchange: The number of nodes (1) is not a multiple of the number of simulations (2)

Anna Vernon lappala.anna at gmail.com
Thu Oct 27 04:12:31 CEST 2016


Hello,

I have configured, compiled, and patched GROMACS with PLUMED; both have 
MPI functionality, and the two pieces of software work together 
correctly in the example tests.

However, when I try the PLUMED parallel tempering example shown here 
https://plumed.github.io/doc-v2.1/user-doc/html/belfast-7.html , 
something goes wrong.

I am hoping to understand why I get the following error:

Program mdrun_mpi, VERSION 4.6.7
Source code file: /home/anna/Downloads/gromacs-4.6.7/src/gmxlib/main.c, 
line: 424

Fatal error:
The number of nodes (1) is not a multiple of the number of simulations (2)

This happens when I execute the following command:

mpirun -np 2 mdrun_mpi -s TOPO/topol -plumed -multi 2 -replex 100
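The error says mdrun saw only one "node" (MPI rank) even though mpirun was asked for two, which makes me wonder whether the two processes are starting as independent serial copies. As a quick sanity check, here is a minimal sketch (an assumption on my part; it only requires `mpirun` and `hostname` on PATH, and is guarded so it does nothing harmful on a machine without MPI):

```shell
# Hypothetical sanity check: does this mpirun really launch 2 ranks?
# Two lines of hostname output => two processes were started.
# If mdrun_mpi still reports 1 MPI process afterwards, mdrun_mpi may be
# linked against a different MPI library than the mpirun being used.
if command -v mpirun >/dev/null 2>&1; then
    mpirun -np 2 hostname
else
    echo "mpirun not on PATH"
fi
```

If the two ranks do launch here but mdrun still reports "Using 1 MPI process", that would point to a mismatch between the MPI used at build time and at run time, though I have not confirmed this.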

My mpirun is otherwise functional, and GROMACS is patched with PLUMED. 
In fact, I can run the PLUMED example belfast-4 
https://plumed.github.io/doc-v2.1/user-doc/html/belfast-4.html

using

mpirun -np 2 mdrun_mpi -plumed plumed.dat -nsteps 100000


and this command runs and produces output of the following form:


Back Off! I just backed up md.log to ./#md.log.5#

Overriding nsteps with value passed on the command line: 100000 steps, 
200.000 ps

Using 1 MPI process

Non-default thread affinity set probably by the OpenMP library,
disabling internal thread affinity

Back Off! I just backed up traj.xtc to ./#traj.xtc.4#

Back Off! I just backed up ener.edr to ./#ener.edr.4#
starting mdrun 'ALANINE DIPEPTIDE'
100000 steps,    200.0 ps.

Number of CPUs detected (4) does not match the number reported by OpenMP 
(2).
Consider setting the launch configuration manually!
Reading file topol.tpr, VERSION 4.6.5 (single precision)

Overriding nsteps with value passed on the command line: 100000 steps, 
200.000 ps

Using 1 MPI process

Non-default thread affinity set probably by the OpenMP library,
disabling internal thread affinity

Back Off! I just backed up traj.xtc to ./#traj.xtc.5#

Back Off! I just backed up ener.edr to ./#ener.edr.5#
starting mdrun 'ALANINE DIPEPTIDE'
100000 steps,    200.0 ps.

Writing final coordinates.

Back Off! I just backed up confout.gro to ./#confout.gro.2#

                Core t (s)   Wall t (s)        (%)
        Time:        4.060        4.074       99.7
                  (ns/day)    (hour/ns)
Performance:     4241.408        0.006

gcq#200: "My Head Goes Pop Pop Pop Pop Pop" (F. Black)


Writing final coordinates.

Back Off! I just backed up confout.gro to ./#confout.gro.3#

                Core t (s)   Wall t (s)        (%)
        Time:        4.136        4.154       99.6
                  (ns/day)    (hour/ns)
Performance:     4160.237        0.006

gcq#200: "My Head Goes Pop Pop Pop Pop Pop" (F. Black)

I find it difficult to understand why the first command produces this 
error while the second one does not.
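One thing I am unsure about (an assumption on my part, from the mdrun -multi documentation, not something I have verified): with -multi 2 -s topol, mdrun appends the simulation index to the input name and looks for one .tpr per replica. A minimal sketch of that naming convention, using hypothetical placeholder files:

```shell
# Sketch of the mdrun -multi input naming convention (assumption from
# the GROMACS 4.6 docs): "-multi 2 -s topol" expects one .tpr per
# replica, numbered topol0.tpr and topol1.tpr.
demo=$(mktemp -d)
touch "$demo/topol0.tpr" "$demo/topol1.tpr"   # hypothetical placeholders
ls "$demo"
```

So I have also been checking that my TOPO directory follows this numbering, in case that matters here.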

Thank you for your time.

Anna


