[gmx-users] Re: Issues regarding exclusions and Charge group distribution
Mark Abraham
Mark.Abraham at anu.edu.au
Fri Jun 12 07:17:56 CEST 2009
Manik Mayur wrote:
> On Fri, Jun 12, 2009 at 8:46 AM, Mark Abraham <Mark.Abraham at anu.edu.au> wrote:
>
> Manik Mayur wrote:
>
> Hi,
>
> My last mail apparently bounced from the server because of its
> attachments, so please excuse any repetition.
>
> I am trying to simulate a system in which I want to exclude all
> non-bonded (nb) interactions between frozen groups. So I followed the
> manual and defined them in energygrp_excl, but doing that produces
> the following warning.
>
> WARNING 1 [file topol.top, line 1583]:
> Can not exclude the lattice Coulomb energy between energy groups
>
> What is the reason for the warning? Should I be worried about it,
> or can I safely ignore it?
>
>
> So, what algorithm are you using for your long-ranged electrostatic
> interactions? Do you know roughly how it works? Does it then make
> sense to try to exclude some of the long-range interactions?
>
>
> I am using PME. I honestly do not know much about its implementation
> details, but I will definitely look into it. Here I was trying to
> remove all interactions between the frozen groups so that I can speed
> up my simulation a little. Also, since the atoms in the frozen group
> are uncharged, my main concern was to remove even the LJ interaction
> calculations.
OK. The description in the GROMACS manual is fine on the maths and
implementation details, but doesn't describe the underlying strategy; see
http://en.wikipedia.org/wiki/Particle_mesh_Ewald for starters. The
long-ranged (reciprocal-space) component is evaluated on a grid for the
whole system at once, so it cannot be decomposed group-wise and therefore
cannot be excluded group-wise. There are many posts in the archive on this
topic.
> So far I have ignored the warning. Is that OK?
Yes, but you're only saving time in the short-ranged non-bonded (LJ and
real-space Coulomb) calculations.
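For illustration only (the group names WALL and SOL are placeholders, not
taken from your index file), the relevant .mdp fragment would look something
like this:

  energygrps     = WALL SOL
  energygrp_excl = WALL WALL    ; skip short-range LJ and real-space Coulomb
                                ; pairs within the frozen group
  freezegrps     = WALL
  freezedim      = Y Y Y
  ; The reciprocal-space ("lattice") part of PME cannot be restricted to
  ; energy groups, which is what the grompp warning is telling you.

Since your frozen atoms are uncharged anyway, the real saving is the
short-range LJ pair calculation.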
> Next, I am facing a strange problem with mdrun_mpi. If I run it
> on 4 nodes, it throws the following error, which I traced to
> md.log, where I saw:
>
> There are: 15923 Atoms
> Charge group distribution at step 0: 0 3264 3185 0 (why these zeroes?)
>
>
> You have atoms that are not part of a charge group (or, more
> probably, are part of a charge group of size one with zero charge).
>
> In my topology file, all the atoms that I have defined within a
> molecule do belong to a charge group. Could you please elaborate on
> the second part of your statement?
I'm not sure how the accounting works - GROMACS might not be counting
charge groups that have only excluded interactions, or not counting
charge groups that have no non-zero charges (and thus never participate
in an electrostatic interaction). Whatever is going on, the workload
looks like it might be unbalanced over the four processors.
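To make the second possibility concrete (the atom below is purely
hypothetical, not taken from your topology): an [ atoms ] entry such as

  [ atoms ]
  ;  nr  type  resnr  residue  atom  cgnr  charge   mass
      1  CWL   1      WAL      C1    1     0.000    12.011

does belong to a charge group (cgnr 1), but that group holds a single atom
with zero charge, so it may not appear in that particular count.

As for the crash, you could test whether it is tied to the 1 x 1 x 4
decomposition, e.g. with something like the following (check mdrun -h for
the exact option names in your version):

  mpirun -np 4 mdrun_mpi -s topol.tpr -dd 2 2 1   # force a 2 x 2 x 1 DD grid
  mpirun -np 4 mdrun_mpi -s topol.tpr -pd         # particle decomposition

If either of those runs cleanly, the problem is likely in how the domain
decomposition is splitting up your partly-empty, partly-frozen box.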
Mark
> Grid: 3 x 4 x 5 cells
>
> However, with 2 nodes it runs fine:
>
> There are: 15923 Atoms
> Charge group distribution at step 0: 3226 3223
> Grid: 4 x 5 x 25 cells
>
> The error-
>
>
> No, that's the dump from MPI after the error. You need to consult
> both the stdout and the .log file.
>
> Mark
>
> Making 1D domain decomposition 1 x 1 x 4
> starting mdrun 'TransverseElectrode'
> 100000 steps, 200.0 ps.
> step 0
>
> NOTE: Turning on dynamic load balancing
>
> [md-comp1:32374] *** Process received signal ***
> [md-comp1:32377] *** Process received signal ***
> [md-comp1:32377] Signal: Segmentation fault (11)
> [md-comp1:32377] Signal code: Address not mapped (1)
> [md-comp1:32377] Failing at address: (nil)
> [md-comp1:32374] Signal: Segmentation fault (11)
> [md-comp1:32374] Signal code: Address not mapped (1)
> [md-comp1:32374] Failing at address: (nil)
> [md-comp1:32377] [ 0] [0xb7fd8440]
> [md-comp1:32374] [ 0] [0xb7ef0440]
> [md-comp1:32374] *** End of error message ***
> [md-comp1:32377] *** End of error message ***
>
> I would be extremely thankful if somebody could point me to the
> source of the error and a workaround.
>
> Thanks,
>
> Manik Mayur
> Graduate student
> Microfluidics Lab
> Dept. of Mechanical Engg.
> IIT Kharagpur
> INDIA