[gmx-users] Domain decomposition and distance restraints in GROMACS 2016.1

Bakary N'tji Diallo diallobakary4 at gmail.com
Sat Nov 11 08:52:55 CET 2017


Hello


I’m trying to run a simulation with distance restraints using GROMACS
version 2016.1-dev.

The distance restraint file contains:

[ distance_restraints ]

; ai    aj    type  index  type'  low   up1   up2   fac

  6602  2478  1  0   1   0.24 0.30 0.35 1.0

  6602  2504  1  0   1   0.24 0.30 0.35 1.0

  6602  3811  1  0   1   0.24 0.30 0.35 1.0



With


disre    = simple

disre-fc = 1000


in the .mdp file.
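
For reference, here is a sketch of the restraint-related .mdp options as I understand them; everything other than disre and disre-fc is left at what I believe are the defaults and is shown only for completeness:

disre           = simple
disre-weighting = conservative   ; how the restraint force is distributed over the pairs
disre-mixed     = no
disre-fc        = 1000           ; kJ mol^-1 nm^-2
disre-tau       = 0              ; no time averaging
nstdisreout     = 100            ; how often pair distances are written to the energy file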


And the .top file has

#include "distancerestraints.itp"

Run with:

mpirun -np ${NP} -machinefile ${PBS_NODEFILE} gmx_mpi mdrun -rdd 0.1 -cpi
-maxh 48 -deffnm md_0_1


The following note appears in the output of the final grompp run, before
the simulation starts:

atoms involved in distance restraints should be within the same domain. If
this is not the case mdrun generates a fatal error. If you encounter this,
use a single MPI rank (Verlet+OpenMP+GPUs work fine).

(The simulation does run with mpirun -np 1, but as I understand it that
means it uses only a single processor/core, which is slow.)
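
Taken literally, that note would turn the run line into something like the command below; -ntomp 16 is only a guess at how to still use the remaining cores of a node through OpenMP:

mpirun -np 1 -machinefile ${PBS_NODEFILE} gmx_mpi mdrun -ntomp 16 -cpi -maxh 48 -deffnm md_0_1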

When I instead run over more than one rank, mdrun prints

WARNING: Can not write distance restraint data to energy file with domain
decomposition

and the simulation then indeed stops with a fatal error.

I have tried the different mdrun options that control the domain
decomposition (-rdd, -dds, -rcon; an example line follows below), without
success. Is there a way to use distance restraints with more than one MPI
rank, or am I limited to a single rank?
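
For illustration, one of these attempts looked roughly like the line below; the values are only examples of what I varied, with -rdd chosen to be larger than the longest restraint distance:

mpirun -np ${NP} -machinefile ${PBS_NODEFILE} gmx_mpi mdrun -rdd 1.2 -dds 0.8 -rcon 1.2 -cpi -maxh 48 -deffnm md_0_1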

Thank you in advance

-- 
Bakary

