[gmx-users] Problem with OpenMP+MPI
jesmin jahan
shraban03 at gmail.com
Wed Feb 27 16:01:56 CET 2013
Hi Justin,
Thanks for your reply.
I am able to run it now. I am using the following parameters.
--------------------------------------------------------------------------------------------------------------
constraints = none
integrator = md
cutoff-scheme = Verlet
pbc = xyz
verlet-buffer-drift = -1
dt = 0.001
nsteps = 0
rcoulomb = 1
rvdw = 1
rlist = 1
nstgbradii = 1
rgbradii = 1
implicit_solvent = GBSA
gb_algorithm = HCT ;
sa_algorithm = None
gb_dielectric_offset = 0.02
optimize_fft = yes
energygrps = protein
--------------------------------------------------------------------------------------------
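For reference, after editing the .mdp I re-created the .tpr and launched the run roughly as follows (the structure and topology file names below are placeholders, not necessarily the exact names I used):

grompp -f imd.mdp -c conf.gro -p topol.top -o imd.tpr   # regenerate the .tpr from the edited .mdp (conf.gro/topol.top are placeholder names)
mdrun -ntmpi 2 -ntomp 6 -s imd.tpr                      # 2 thread-MPI ranks x 6 OpenMP threads each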
I got the following energy values for this run.
Statistics over 1 steps using 1 frames
Energies (kJ/mol)
GB Polarization    LJ (SR)            Coulomb (SR)       Potential          Kinetic En.
-1.52255e+05       5.75419e+08        -2.44085e+05       5.75023e+08        6.36555e+11
Total Energy       Temperature        Pressure (bar)
6.37130e+11        1.60603e+10        7.90024e+10
###############################################################################
On the other hand, when I used gromacs-4.5.3 with the following parameters:
-----------------------------------------------------------------------------------------------------------------------------------------
constraints = none
integrator = md
pbc = no
dt = 0.001
nsteps = 0
rcoulomb = 300
rvdw = 300
rlist = 300
nstgbradii = 300
rgbradii = 300
implicit_solvent = GBSA
gb_algorithm = HCT ;
sa_algorithm = None
gb_dielectric_offset = 0.02
optimize_fft = yes
energygrps = protein
--------------------------------------------------------------------------------------
I got the following energy values:
Energies (kJ/mol)
GB Polarization    LJ (SR)            Coulomb (SR)       Potential          Kinetic En.
-1.58040e+04       0.00000e+00        -2.42022e+05       -2.57826e+05       0.00000e+00
Total Energy       Temperature        Pressure (bar)
-2.57826e+05       0.00000e+00        0.00000e+00
You can see that there is a huge difference between the two GB Polarization
energies: -1.52255e+05 in the new run versus -1.58040e+04 with 4.5.3, i.e.
roughly a factor of 10 (-1.52255e+05 / -1.58040e+04 ≈ 9.6).
How can I get back a value similar to the one I was getting previously (i.e.,
-1.58040e+04)? The previous energy was correct, while the current value is
about 10 times larger in magnitude.
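For easier comparison, these are the settings that differ between the two .mdp files; nothing here is new, it just collects the parameters already listed above (everything else is identical in both files):

                          4.6 run              4.5.3 run
cutoff-scheme             Verlet               (option not present)
pbc                       xyz                  no
verlet-buffer-drift       -1                   (option not present)
rcoulomb / rvdw / rlist   1                    300
nstgbradii / rgbradii     1                    300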
Thanks,
Jesmin
On Wed, Feb 27, 2013 at 8:55 AM, Justin Lemkul <jalemkul at vt.edu> wrote:
>
>
> On 2/27/13 8:53 AM, jesmin jahan wrote:
>>
>> Thanks, Carsten, for your reply.
>>
>> Even after removing the comment, I am getting the same error message.
>
>
> You need to re-create the .tpr file after modifying the .mdp file. If
> you're getting the same error, you're using the wrong .tpr file.
>
>
>> Do you have a sample .mdp file that can be used with MPI+OpenMP?
>>
>> I am trying to compute the GB energy (implicit solvent based).
>>
>
> Note that implicit solvent calculations can be run on no more than 2
> processors. You'll get a fatal error if you try to use more.
>
> -Justin
>
>
>> On Wed, Feb 27, 2013 at 2:59 AM, Carsten Kutzner <ckutzne at gwdg.de> wrote:
>>>
>>> Hi,
>>>
>>> On Feb 27, 2013, at 6:55 AM, jesmin jahan <shraban03 at gmail.com> wrote:
>>>
>>>> Dear Gromacs Users,
>>>>
>>>> I am trying to run the following command on gromacs 4.6
>>>>
>>>> mdrun -ntmpi 2 -ntomp 6 -s imd.tpr
>>>>
>>>> But I am getting the following error
>>>>
>>>> OpenMP threads have been requested with cut-off scheme Group, but
>>>> these are only supported with cut-off scheme Verlet
>>>>
>>>> Does anyone know a solution to the problem?
>>>>
>>>> I am using the following .mdp file
>>>>
>>>> constraints = none
>>>> integrator = md
>>>> ;cutoff-scheme = Verlet
>>>
>>> Yes, as the note says, use the Verlet cutoff scheme (by deleting the ";").
>>>
>>> Carsten
>>>
>>>> pbc = no
>>>> dt = 0.001
>>>> nsteps = 0
>>>> rcoulomb = 300
>>>> rvdw = 300
>>>> rlist = 300
>>>> nstgbradii = 300
>>>> rgbradii = 300
>>>> implicit_solvent = GBSA
>>>> gb_algorithm = HCT ;
>>>> sa_algorithm = None
>>>> gb_dielectric_offset = 0.02
>>>> ;optimize_fft = yes
>>>> energygrps = protein
>>>>
>>>> Please let me know what to change so that it runs perfectly!
>>>>
>>>> Thanks,
>>>> Jesmin
>>>> --
>>>> Jesmin Jahan Tithi
>>>> PhD Student, CS
>>>> Stony Brook University, NY-11790.
>>>
>>>
>>>
>>> --
>>> Dr. Carsten Kutzner
>>> Max Planck Institute for Biophysical Chemistry
>>> Theoretical and Computational Biophysics
>>> Am Fassberg 11, 37077 Goettingen, Germany
>>> Tel. +49-551-2012313, Fax: +49-551-2012302
>>> http://www.mpibpc.mpg.de/grubmueller/kutzner
>>> http://www.mpibpc.mpg.de/grubmueller/sppexa
>>>
>>
>>
>>
>>
>
> --
> ========================================
>
> Justin A. Lemkul, Ph.D.
> Research Scientist
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================
>
--
Jesmin Jahan Tithi
PhD Student, CS
Stony Brook University, NY-11790.