[gmx-users] Setting rcon according to system
Sergio Perez
sperezconesa at gmail.com
Wed Nov 14 10:18:08 CET 2018
Hello,
First of all thanks for the help :)
I don't necessarily need to run it with 100 processors; I just want to know
how far I can reduce rcon, using what I know about my system, without
compromising accuracy. Let me give some more details. The system is a sodium
montmorillonite clay with two solid alumino-silicate layers and two aqueous
interlayers between them. It has TIP4P waters, some OH bonds within the clay,
and the bonds of the hydrated uranyl ion described in my previous email
treated as constraints. The box is orthorhombic, 4.67070 x 4.49090 x 3.77930
nm, and the system has 9046 atoms.
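One way to put a number on that system knowledge is to measure the largest
distance between constraint-coupled atoms of the hydrated ion directly from a
structure file. A minimal sketch, assuming MDAnalysis is available; the file
name and the residue selection below are placeholders, not the real ones:

    # Rough check of the largest distance between atoms of the hydrated uranyl
    # ion (an upper bound on any constraint-coupled distance within it).
    # "conf.gro" and the selection string are placeholders for the real names.
    import numpy as np
    import MDAnalysis as mda

    u = mda.Universe("conf.gro")          # MDAnalysis positions are in Angstrom
    ion = u.select_atoms("resname UO2W")  # placeholder residue name for the ion
    pos = ion.positions / 10.0            # Angstrom -> nm
    # All-pairs distances; assumes the ion is whole (not split across PBC).
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    print("largest intra-ion distance: %.3f nm" % d.max())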
This is the output of GROMACS:
Initializing Domain Decomposition on 100 ranks
Dynamic load balancing: locked
Initial maximum inter charge-group distances:
two-body bonded interactions: 0.470 nm, Tab. Bonds NC, atoms 10 13
Minimum cell size due to bonded interactions: 0.000 nm
Maximum distance for 5 constraints, at 120 deg. angles, all-trans: 0.842 nm
Estimated maximum distance required for P-LINCS: 0.842 nm
This distance will limit the DD cell size, you can override this with -rcon
Guess for relative PME load: 0.04
Will use 90 particle-particle and 10 PME only ranks
This is a guess, check the performance at the end of the log file
Using 10 separate PME ranks, as guessed by mdrun
Scaling the initial minimum size with 1/0.8 (option -dds) = 1.25
Optimizing the DD grid for 90 cells with a minimum initial size of 1.052 nm
The maximum allowed number of cells is: X 4 Y 4 Z 3
-------------------------------------------------------
Program: mdrun_mpi, version 2018.1
Source file: src/gromacs/domdec/domdec.cpp (line 6571)
MPI rank: 0 (out of 100)
Fatal error:
There is no domain decomposition for 90 ranks that is compatible with the
given box and a minimum cell size of 1.05193 nm
Change the number of ranks or mdrun option -rcon or -dds or your LINCS
settings
Look in the log file for details on the domain decomposition
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------
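For what it's worth, the log above already contains the arithmetic behind the
failure. A minimal sketch, with the box lengths assumed to be in nm and the
other numbers copied from the output:

    # Arithmetic behind the "no domain decomposition" error (values from the log).
    box = (4.67070, 4.49090, 3.77930)  # box lengths, assumed nm
    min_cell = 1.052                   # nm: 0.842 nm P-LINCS distance * 1/0.8 (-dds)
    max_cells = [int(L // min_cell) for L in box]
    n = max_cells[0] * max_cells[1] * max_cells[2]
    print(max_cells, n, n >= 90)       # [4, 4, 3] 48 False -> 90 PP ranks don't fit
    # Assumption, not mdrun output: with -rcon 0.6 and the same 1/0.8 scaling
    # (0.75 nm minimum), a 6 x 5 x 3 = 90 grid would fit, with cells of roughly
    # 0.78 x 0.90 x 1.26 nm.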
Thank you for your help!
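If the 0.6 nm bound from my previous email holds, the override itself is just
the -rcon option the error message points at, e.g. something like
"mpirun -np 100 mdrun_mpi -rcon 0.6" on top of the existing options; the
launcher and rank count here are only assumptions taken from the log. The
other route, given the 4 x 4 x 3 limit above, is simply to request at most 48
particle-particle ranks and leave rcon alone.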
On Wed, Nov 14, 2018 at 5:28 AM Mark Abraham <mark.j.abraham at gmail.com>
wrote:
> Hi,
>
> Possibly. It would be simpler to use fewer processors, such that the
> domains can be larger.
>
> What does mdrun think it needs for -rcon?
>
> Mark
>
> On Tue, Nov 13, 2018 at 7:06 AM Sergio Perez <sperezconesa at gmail.com>
> wrote:
>
> > Dear gmx community,
> >
> > I have been running my system without any problems on 100 processors, but
> > I decided to turn some of the bonds of my main molecule into constraints.
> > My molecule is not an extended chain; it is a hydrated molecular ion, in
> > particular the uranyl cation with 5 water molecules forming a pentagonal
> > bipyramid. At this point I get a domain decomposition error, and I would
> > like to reduce rcon in order to keep running on 100 processors. Since I
> > know from the shape of my molecule that two atoms connected by several
> > constraints will never be farther apart than 0.6 nm, can I use this value
> > safely for -rcon?
> >
> > Thank you very much!
> > Best regards,
> > Sergio Pérez-Conesa