[gmx-users] Free energy calculation using inter-molecular bonded interaction

Atsutoshi Okabe atsutoshi0708 at gmail.com
Tue Sep 8 16:55:32 CEST 2015


Thanks, Justin.
I uploaded .tpr files that run on one core (no error) and on multiple cores (error occurs) here:
http://redmine.gromacs.org/issues/1820
I would appreciate any comments on this problem.

Best,
Atsutoshi

On 2015/09/03, at 23:00, Justin Lemkul <jalemkul at vt.edu> wrote:

> 
> 
> On 9/3/15 9:14 AM, Atsutoshi Okabe wrote:
>> Thanks, Justin.
>> I tried turning the [bonds], [angles], and [dihedrals] interactions on one by one.
>> However, the same errors occurred in all cases.
>> I also tried the calculation on one CPU using the following command, and the error did not occur
>> (that is, without parallelization):
>> gmx_mpi mdrun -deffnm equil$LAMBDA  # 1 cpu
>> 
> 
> Well, here you're invoking an MPI-enabled mdrun without specifying a core count (-nt), so please make sure (from the .log file) that you are, in fact, only using one core.
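[A quick way to check this is to grep the mdrun log for its rank/thread summary. A minimal sketch; the log name `equil0.log` and the exact log wording are assumptions and vary between GROMACS versions, so adjust the pattern as needed:]

```shell
# Hypothetical check of how many MPI ranks / OpenMP threads mdrun actually used.
# Assumes the log is named equil0.log; the matched phrasing follows GROMACS 5.x logs.
LOG=equil0.log
grep -E "Using [0-9]+ (MPI process|OpenMP thread)" "$LOG" || echo "pattern not found in $LOG"
```

If the run really used one core, you should see something like "Using 1 MPI process".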
> 
>> So I am wondering whether a calculation using both the [intermolecular-interactions] directive and parallelization cannot work, and whether that is what causes the error messages below.
>> [mpiexec at leon304] HYDT_bscu_wait_for_completion (./tools/bootstrap/utils/bscu_wait.c:101): one of the processes terminated badly; aborting
>> [mpiexec at leon304] HYDT_bsci_wait_for_completion (./tools/bootstrap/src/bsci_wait.c:18): bootstrap device returned error waiting for completion
>> [mpiexec at leon304] HYD_pmci_wait_for_completion (./pm/pmiserv/pmiserv_pmci.c:521): bootstrap server returned error waiting for completion
>> [mpiexec at leon304] main (./ui/mpich/mpiexec.c:548): process manager error waiting for completion
>> 
>> I used the following command for parallelization:
>> mpirun -np 8 gmx_mpi mdrun -deffnm equil$LAMBDA  # 8 cpus
>> 
> 
> Sounds buggy, but I routinely run my calculations with intermolecular interactions over 32 cores, so it's not inherently a parallelization problem. It could be something unique to your setup. Please file a bug report on redmine.gromacs.org with this information and provide a .tpr file of a system that runs on one core but fails in parallel.
> 
> -Justin
> 
>> 
>> Best,
>> Atsutoshi
>> 
>> On 2015/09/03, at 5:42, Justin Lemkul <jalemkul at vt.edu> wrote:
>> 
>>> 
>>> 
>>> On 9/2/15 10:21 AM, Atsutoshi Okabe wrote:
>>>> I mean that the entire [intermolecular_interaction] directive was commented out, as below:
>>>> ;[ intermolecular-interactions ]
>>>> 
>>> 
>>>> ;[ bonds ]
>>>> ;411 3173  6  0.55  2090
>>>> 
>>>> ;[ angles ]
>>>> ;411 3173 3152  1  164.5  20.9
>>>> ;409 411 3173  1  122.3  20.9
>>>> 
>>>> ;[ dihedrals ]
>>>> ;409 411 3173 3152  2  12.0  20.9
>>>> ;407 409 411 3173  2  75.6  20.9
>>>> ;411 3173 3152 3144  2  -126.0  20.9
>>>> 
>>>> I checked the global atom numbers in the .gro file of the protein-ligand complex. They look fine.
>>>> (407, 409, and 411 are atom numbers of the protein; 3172, 3144, and 3152 are atom numbers of the ligand in the .gro file of the complex.)
>>>> Curiously, this error occurs when the vdW interaction between the ligand and the environment is being turned off (i.e. lambda = 0.7-1.0).
>>>> Conversely, the calculations finished normally while the vdW interaction remained (i.e. lambda = 0.0-0.7).
>>>> 
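[For reference, such a restraint is normally switched on over the bonded lambdas by giving both A- and B-state parameters on one line. A sketch using the bond quoted above, following the [intermolecular_interactions] topology format of GROMACS 5.1+; treat the exact layout and the choice of zero initial force constant as assumptions, not the poster's actual topology:]

```
[ intermolecular_interactions ]
[ bonds ]
; ai    aj   type  b0_A   k_A   b0_B   k_B
  411  3173    6   0.55   0.0   0.55   2090   ; k grows 0 -> 2090 over the bonded lambdas
```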
>>> 
>>> I've never tried using [bonds] in this context; the pull code is a better approach.
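[A pull-code restraint of the kind suggested here might look like the .mdp fragment below. This is a sketch assuming GROMACS 5.1 pull syntax; the index-group names are placeholders, and the distance/force constant are simply reused from the bond entry quoted in this thread:]

```
; Hypothetical pull-code distance restraint between two index groups
pull                 = yes
pull-ngroups         = 2
pull-ncoords         = 1
pull-group1-name     = Protein_anchor   ; placeholder index group
pull-group2-name     = Ligand_anchor    ; placeholder index group
pull-coord1-type     = umbrella
pull-coord1-geometry = distance
pull-coord1-groups   = 1 2
pull-coord1-init     = 0.55             ; nm, reference distance
pull-coord1-k        = 2090             ; kJ mol^-1 nm^-2
```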
>>> 
>>> Perhaps there's an issue with the negative dihedral?  Have you tried turning these interactions on one-by-one to see if there is a specific entry that leads to a failure?  That's the most surefire way to find it.
>>> 
>>> -Justin
>>> 
>>> --
>>> ==================================================
>>> 
>>> Justin A. Lemkul, Ph.D.
>>> Ruth L. Kirschstein NRSA Postdoctoral Fellow
>>> 
>>> Department of Pharmaceutical Sciences
>>> School of Pharmacy
>>> Health Sciences Facility II, Room 629
>>> University of Maryland, Baltimore
>>> 20 Penn St.
>>> Baltimore, MD 21201
>>> 
>>> jalemkul at outerbanks.umaryland.edu | (410) 706-7441
>>> http://mackerell.umaryland.edu/~jalemkul
>>> 
>>> ==================================================
>>> --
>>> Gromacs Users mailing list
>>> 
>>> * Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!
>>> 
>>> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>>> 
>>> * For (un)subscribe requests visit
>>> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a mail to gmx-users-request at gromacs.org.
>> 
> 



More information about the gromacs.org_gmx-users mailing list