[gmx-developers] distance_restraints and domain decomposition

Berk Hess hessb at mpip-mainz.mpg.de
Wed Mar 18 14:57:30 CET 2009


Hi,

You are averaging over multiple atom pairs (4 each) within your 2 restraints
(index 0 and index 1).
This is not supported (yet) with domain decomposition (DD).

Are you sure you want to average over 4 atom pairs within one restraint
and not use 4 separate restraints?
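
For example, giving every pair its own index makes them independent
restraints, so no averaging is involved (a sketch based on your table,
untested):

[ distance_restraints ]
; ai   aj  type  index  type'  low   up1   up2  fac
  613   98   1     0      1    0.18  0.30  0.35   5
  613  119   1     1      1    0.18  0.30  0.35   5
  613  274   1     2      1    0.18  0.30  0.35   5
  613  300   1     3      1    0.18  0.30  0.35   5
  614  479   1     4      1    0.18  0.30  0.35   5
  614  219   1     5      1    0.18  0.30  0.35   5
  614  198   1     6      1    0.18  0.30  0.35   5
  614  450   1     7      1    0.18  0.30  0.35   5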

Berk

andrea spitaleri wrote:
> Hi there,
> I am facing a problem with domain decomposition and distance restraints. My system is a protein
> with Zn ions, and to keep the ions in place I use distance restraints in my
> topology:
>
> [ distance_restraints ]
> ; ai   aj  type  index  type'  low   up1   up2  fac
>   613   98   1     0      1    0.18  0.30  0.35   5
>   613  119   1     0      1    0.18  0.30  0.35   5
>   613  274   1     0      1    0.18  0.30  0.35   5
>   613  300   1     0      1    0.18  0.30  0.35   5
>   614  479   1     1      1    0.18  0.30  0.35   5
>   614  219   1     1      1    0.18  0.30  0.35   5
>   614  198   1     1      1    0.18  0.30  0.35   5
>   614  450   1     1      1    0.18  0.30  0.35   5
>
> Now, the run goes fine on a single CPU, but as soon as I ask for a parallel run I get this error:
>
> Initializing the distance restraints
> NOTE: atoms involved in distance restraints should be within the longest cut-off distance, if this
> is not the case mdrun generates a fatal error, in that case use particle decomposition (mdrun option
> -pd)
>
> -------------------------------------------------------
> Program mdrun, VERSION 4.0.3
> Source code file: disre.c, line: 135
>
> Fatal error:
> Time or ensemble averaged or multiple pair distance restraints do not work (yet) with domain
> decomposition, use particle decomposition (mdrun option -pd)
>
> If I use the -pd option, the run goes fine but is dramatically slower (days instead of hours).
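> (In case it matters, the particle-decomposition run is launched roughly like
> this; the binary may be called mdrun_mpi depending on the build, and the file
> names are only examples:
>
>     mpirun -np 4 mdrun -pd -s topol.tpr
> )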
>
> My restrained distances are all below the cut-off (~0.21 nm).
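> (One way to check such distances, e.g. with g_dist and one index group per
> restrained atom pair, assuming standard file names:
>
>     g_dist -f traj.xtc -s topol.tpr -n index.ndx -o dist.xvg
>
> which writes the group-group distance over time to dist.xvg.)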
> I found only one related post (http://www.gromacs.org/pipermail/gmx-users/2008-December/038507.html),
> which suggested filing a Bugzilla report, but I could not find any answer to it.
>
> Any help?
>
> Thanks in advance for your help
>
> Regards
>
> andrea



