[gmx-users] Parallel pulling with Gromacs 4.0.7: COMM mode problem

Aykut Erbas aerbas at ph.tum.de
Tue Mar 30 14:44:39 CEST 2010

Hi Chris,

Thanks for your reply.

I will go through it step by step.

chris.neale at utoronto.ca wrote:
> Correction: what you are actually doing is pulling your solute to the 
> (X,Y) center of mass of your surface + (2.0 * 0.001/ps) nm in X + the 
> initial center of mass distance in X and Y.
That is actually what I expected, but it does something else.

> -- original message --
> Dear Aykut,
> Please give a more complete description of 1) the .mdp options and
> expected behaviour that you get with gmx 3.3.3 on a single core and 2)
> the .mdp options and difference that you get with gmx 4.0.7 in parallel.

If you pull in G3 with the AFM option, with the surface as your reference 
group, the output pull.pdo file gives you the solute (pulled group) 
coordinates with respect to the surface...
The coordinates of your reference group do not change as a function of 
time, right...
Note that I use angular COMM mode for such simulations.

The G3 pull.ppa I used:

runtype         = afm
Skip steps      = 100
ngroups         = 1
group1          = AA1
reference_group = DIAM
pulldim         = Y Y N
afm_k1          = 15.0
afm_rate1       = 0.0001
afm_dir1        = 1.0 0.0 0.0
afm_init1       = 2.0 0.1 1.1
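To make explicit what I expect this .ppa to do, here is a sketch in Python (purely illustrative; the function names are mine and this is the standard harmonic AFM picture, not the G3 source): the spring anchor starts at afm_init1 relative to the surface COM and moves along afm_dir1 at afm_rate1, and the force is harmonic in the enabled pulldim dimensions only.

```python
def afm_spring_position(t_ps, init=(2.0, 0.1, 1.1),
                        direction=(1.0, 0.0, 0.0), rate=0.0001):
    """Spring anchor (nm), relative to the reference (surface) COM, at time t.

    init mirrors afm_init1, direction afm_dir1, rate afm_rate1 (nm/ps).
    """
    return tuple(x0 + rate * t_ps * d for x0, d in zip(init, direction))

def afm_force(anchor, group_rel_com, k=15.0, dims=(True, True, False)):
    """Harmonic force (kJ/mol/nm) on the pulled group; k mirrors afm_k1.

    group_rel_com is the pulled-group COM relative to the surface COM;
    dims mirrors pulldim = Y Y N, so no force is applied in Z.
    """
    return tuple(k * (a - x) if on else 0.0
                 for a, x, on in zip(anchor, group_rel_com, dims))
```

At t = 0 the anchor sits exactly at afm_init1; after 1 ns it has moved 0.1 nm in X, and the pulled group only ever feels X and Y components of the spring force.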

The G4 .mdp file, which fails to reproduce the above G3 simulations:

comm_mode                = angular
nstcomm                  = 1
comm_grps                = DIAM
pull                     = umbrella
pull_start               = yes
pull_geometry            = position
pull_dim                 = Y Y N
pull_nstxout             = 200
pull_nstfout             = 200
pull_ngroups             = 1
pull_group0              = DIAM
pull_group1              = AA1
pull_vec1                = 1.0 0.0 0.0
pull_init1               = 2.0 0.1 1.0
pull_rate1               = 0.0001
pull_k1                  = 15
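In other words, here is where I expect the umbrella centre to be at time t under pull_geometry = position (an illustrative sketch; expected_reference is my own name, and this reflects my reading of the manual, not the mdrun implementation): reference-group COM + pull_init1 + pull_rate1 * t * pull_vec1, restricted to the dimensions enabled by pull_dim.

```python
def expected_reference(t_ps, surface_com,
                       init=(2.0, 0.1, 1.0), vec=(1.0, 0.0, 0.0),
                       rate=0.0001, dims=(True, True, False)):
    """Expected umbrella centre per dimension (nm); None where pull_dim is N.

    surface_com is the COM of pull_group0 (DIAM); init mirrors pull_init1,
    vec pull_vec1, rate pull_rate1 (nm/ps), dims pull_dim = Y Y N.
    """
    return tuple(c + x0 + rate * t_ps * v if on else None
                 for c, x0, v, on in zip(surface_com, init, vec, dims))
```

The point of the sketch: the surface COM term should be there, whereas what I observe behaves as if that term were always zero.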

> For example, the following text is difficult to understand and doesn't
> match with the .mdp options that you provide below: "the surface moves
> in the opposite direction of motion and you have the pulling
> coordinates in absolute coord"
I meant that, with the settings above, the output coordinates for my pulled 
group should be given with respect to the surface (DIAM).
However, the situation is completely, how can I say, something else...

1) the surface moves, although it should not (in principle, the G4 
.mdp file above eliminates any DIAM motion)
2) the output pulled-group coordinates are given in absolute coordinates.
3) the reference group, which is the surface, moves in the opposite 
direction of the pulling to conserve the momentum of the whole system. This 
cannot be right, because my COMM group is the surface, not the entire system.
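To be explicit about what I mean by restricting COMM removal to the surface, here is a sketch of linear COMM removal on a group (1D for simplicity; the helper name is mine, and my run actually uses comm_mode = angular, which removes rotation as well): only the atoms in comm_grps are touched, so momentum picked up by the rest of the system is not compensated by them.

```python
def remove_group_com_velocity(vel, mass, group):
    """Subtract the mass-weighted mean velocity of `group` from its own atoms.

    Atoms outside the group (here: the pulled solute and water) are left
    untouched; only the group's net momentum is zeroed.
    """
    p = sum(mass[i] * vel[i] for i in group)   # group momentum
    m = sum(mass[i] for i in group)            # group mass
    v_com = p / m
    for i in group:
        vel[i] -= v_com
    return vel
```

So with comm_grps = DIAM, removing the surface's COM motion should never push the surface backwards to balance the pull force on the solute.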

As you said, the target should be

init_grps + (2.0 * 0.001/ps) + the initial center of mass distance in X 
and Y

however, it behaves as if init_grps were 0 0 0 at all times,
but it is not.

In the log file I can see that the COMM group is the surface and that the 
pulling is with respect to the surface again,
but the output looks as if the COMM group were the whole box.
> That said, I'll take a guess that your problem is related to your X*Y
> only (no Z) pull group:
> pull_dim                 = Y Y N
> That you seem to think will include some Z component:
> pull_init1               = 2.0 0.0 1.0
> But what you are actually doing is pulling your solute to the (X,Y)
> center of mass of your surface + (2.0 * 0.001/ps) nm in X.
> Which seems strange given that you are using semi-isotropic pressure
> coupling ... a method intended to treat the Z dimension in a special
> way, and not the X dimension.

There is no problem with the system parameters; I have a bunch of 
simulations running well under G3.
The pressure coupling is anisotropic, since the compressibility of the 
surface differs from that of the water.
> Finally, you could easily test for yourself what happens when you run
> gmx 4.0.7 in serial so that you could compare to gmx 3.3.3 in serial
> and thus simplify the issue here.
The G3 pulling code is unfortunately not very efficient (especially on the 
cluster I use: hostfile problems);
that is actually why I switched to G4.

> Chris.
Thanks again.
I hope I've described the problem clearly.
