[gmx-users] Fwd: Simulation blowing up in parallel (h-bonds constraints)
John Whittaker
johnwhittake at zedat.fu-berlin.de
Thu Aug 22 14:20:00 CEST 2019
Hi Jan,
> Dear gromacs-users,
>
> I am encountering a puzzling problem with my simulation of a simple
> protein (~2,000 atoms, only standard amino acids, calcium cofactor) in
> TIP3P water (~10,000 molecules) using the amber99sb-ildn force field and
> LINCS constraints set to "h-bonds". When I try to run the production on
> more than one MPI rank, the simulation fails with errors like:
>
> Fatal error:
> 80 particles communicated to PME rank 1 are more than 2/3 times the
> cut-off out of the domain decomposition cell of their charge group in
> dimension x.
> This usually means that your system is not well equilibrated.
>
> Or warnings like:
>
> WARNING: Listed nonbonded interaction between particles 224 and 235
> at distance 3.720 which is larger than the table limit 2.569 nm.
When you got this ^ warning, did you visualize your trajectory and check
what was going on with those particular particles? That might give you
some clues about what's happening.
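For example (purely illustrative; topol.tpr / traj.xtc stand in for your
actual run files), you could track the offending pair with gmx distance
and dump a frame near the blow-up with trjconv:

  # distance between atoms 224 and 235 over the trajectory
  gmx distance -s topol.tpr -f traj.xtc -select 'atomnr 224 235' -oav d224_235.xvg

  # write out the frame closest to t = 100 ps (pick a time near the crash)
  gmx trjconv -s topol.tpr -f traj.xtc -dump 100 -o frame.pdb

If the distance jumps right before the error, those atoms (or something
bonded to them) are where the trouble starts.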
By the way, the mailing list doesn't support attachments. If you want to
share a file, you'll have to either paste it into the email or upload it
to a file-sharing website (e.g. Dropbox).
- John
>
> (see run_hbonds_tight_parallel.log)
>
>
> I am quite sure that my system is well equilibrated, though (correct me
> if I am mistaken here). The problem could not be resolved even with a
> very strict preparation protocol (.mdp files attached), a 1 fs timestep,
> and high LINCS accuracy (lincs-iter 2, lincs-order 6):
>
> - steepest descent minimisation (emtol 100 kJ/mol) in vacuum
>
> - solvation, neutralisation
>
> - steepest descent minimisation (emtol 100 kJ/mol)
>
> - conjugate gradient minimisation (emtol 1 kJ/mol, though not reached
> in single precision)
>
> - NVT eq. at 300 K (v-rescale, tau_t 0.1, coupling groups
> protein/non-protein, all-bonds constraints, position restraints on the
> protein), 300 ps
>
> - NPT eq. (Berendsen, tau_p 1, all-bonds constraints, position
> restraints on the protein) 300 ps
>
> - NPT 2 eq. (no position restraints on the protein) 300 ps
>
> - NPT 3 eq. (switching to h-bond constraints) 400 ps
>
> - NPT 4 eq. (switching to Parrinello-Rahman) 400 ps
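Just to make sure we're talking about the same settings (since the .mdp
attachments don't come through on the list), the tight-constraint setup
you describe would look roughly like this:

  ; sketch reconstructed from the description above, not your actual file
  integrator           = md
  dt                   = 0.001     ; 1 fs timestep
  constraints          = h-bonds   ; only bonds involving hydrogen
  constraint-algorithm = lincs
  lincs-iter           = 2         ; default is 1
  lincs-order          = 6         ; default is 4

Is that what you're running?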
>
> Temperature, pressure, density and potential show reasonable
> fluctuations, I think, during the last run:
>
> Energy          Average   Err.Est.     RMSD   Tot-Drift
> -------------------------------------------------------------------
> Potential       -534764         39  766.653     5.78602   (kJ/mol)
> Density         1002.32        0.2  2.49712    -1.27332   (kg/m^3)
> Temperature     299.931      0.061  1.61166   0.0932882   (K)
> Pressure       -1.90648        0.8  120.449     2.49509   (bar)
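(For anyone reading along: those columns look like standard gmx energy
output; something along these lines produces them, with npt4.edr as a
placeholder for the .edr file of the last NPT run:

  printf 'Potential\nDensity\nTemperature\nPressure\n' | gmx energy -f npt4.edr -o eq.xvg

which prints exactly that Average / Err.Est. / RMSD / Tot-Drift summary.)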
>
> I cannot spot any geometrical clashes in the system. Keeping the 1 fs
> timestep and high LINCS accuracy during production does not help either
> (nor does decreasing shake-tol or verlet-buffer-tolerance). Switching
> back to all-bonds constraints, however, allows a parallel run again. The
> simulation with h-bonds constraints is only stable when run serially on
> a single node (OpenMP threads only).
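One knob you could try while debugging (a workaround, not a fix): the
fatal error you quoted is about particles leaving their domain
decomposition cell, and mdrun's -rcon option sets a minimum communication
range for constraints, which forces larger DD cells:

  # 0.9 nm is just an illustration; see gmx mdrun -h for details
  gmx mdrun -deffnm prod -ntmpi 4 -rcon 0.9

If atoms are genuinely moving that far in one step, though, this only
hides the symptom.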
>
> I'd like to keep the h-bonds constraints because switching to all-bonds
> is discouraged by GROMACS for this force field:
>
> NOTE 1 [file unknown]:
> You are using constraints on all bonds, whereas the forcefield has been
> parametrized only with constraints involving hydrogen atoms. We suggest
> using constraints = h-bonds instead, this will also improve
> performance.
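On the constraint-communication point: with h-bonds constraints, P-LINCS
has to communicate over connected constraints across DD cell boundaries.
If I remember correctly, md.log reports the distance it estimates for
this near the DD setup, so something like

  # look at the DD/constraint distance estimates in the log
  grep -i -B1 -A3 'P-LINCS' prod.log

(prod.log being whatever your production log is called) might show
whether the chosen cell size is marginal for your run.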
>
> Any suggestions on how to achieve an MPI-parallel run with this system
> while keeping the h-bonds constraints, or advice on better
> equilibration, would be highly appreciated.
>
> Thank you for your help!
>
> Best regards
> Jan