[gmx-users] implicit solvent error
Alex
nedomacho at gmail.com
Sun Jun 24 09:47:46 CEST 2018
Your EM is unrelated to dynamics. It could be related, but we don't know
anything about your simulated system. I am of course assuming that your
MPI setup is optimal for gmx and that you actually get to use those 16
threads, assuming those aren't an emulation of some sort.
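
If in doubt, a quick check (a sketch; binary and file names depend on your
installation) is to run a short test and read the top of md.log, which
reports how many MPI ranks and OpenMP threads mdrun actually started:

   # thread-MPI build, single node
   gmx mdrun -ntmpi 16 -deffnm md -maxh 0.1
   # or an MPI-enabled build launched over the 16 requested ranks
   mpirun -np 16 gmx_mpi mdrun -deffnm md -maxh 0.1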
On 6/24/2018 1:33 AM, Chhaya Singh wrote:
> Hi,
> I have attached the energy minimization mdp file.
> Please look through it.
>
>
>
> cpp                  = /lib/cpp   ; preprocessor of the current machine
> define               = -DFLEXIBLE ; -DPOSRES, -DPOSRES_IONS, -DFLEX_SPC; FLEXible SPC and POSition REStraints
>
> integrator           = steep      ; steepest descent algorithm
> dt                   = 0.005      ; time step in ps
> nsteps               = 5000       ; number of steps
>
> emtol                = 100        ; convergence criterion
> emstep               = 0.05       ; initial step size
> constraints          = none
> constraint-algorithm = lincs
> unconstrained-start  = no         ; do not constrain the start configuration
> ;shake_tol           = 0.0001
> nstlist              = 0          ; step frequency for updating neighbour list
> ns_type              = simple     ; grid ; method for neighbour searching (?)
> nstxout              = 100        ; frequency for writing coords to output
> nstvout              = 100        ; frequency for writing velocities to output
> nstfout              = 0          ; frequency for writing forces to output
> nstlog               = 100        ; frequency for writing energies to log file
> nstenergy            = 100        ; frequency for writing energies to energy file
> nstxtcout            = 0          ; frequency for writing coords to xtc traj
> xtc_grps             = system     ; group(s) whose coords are to be written in xtc traj
> energygrps           = system     ; group(s) whose energy is to be written in energy file
> pbc                  = no         ; use pbc
> rlist                = 1.4        ; cutoff (nm)
> coulombtype          = cutoff     ; truncation for minimisation, with large cutoff
> rcoulomb             = 1.4
> vdwtype              = cut-off    ; truncation for minimisation, with large cutoff
> rvdw                 = 1.4
> nstcomm              = 0          ; number of steps for centre of mass motion removal (in vacuo only!)
> Tcoupl               = no
> Pcoupl               = no
> "min-implicit.mdp" 40L,
> 2616C
> 1,1 Top
>
>
>
> The system I am using has the following configuration:
> #PBS -l select=1:ncpus=16:mpiprocs=16
> #PBS -l walltime=24:00:00
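>
> For reference, a minimal job script around those directives would look
> roughly like this (the launch line is illustrative, not my actual script;
> binary and module names are site-specific):
>
> #!/bin/bash
> #PBS -l select=1:ncpus=16:mpiprocs=16
> #PBS -l walltime=24:00:00
> cd $PBS_O_WORKDIR
> # illustrative launch over the 16 requested MPI ranks
> mpirun -np 16 gmx_mpi mdrun -deffnm md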
>
>
>
> On 24 June 2018 at 13:00, Alex <nedomacho at gmail.com> wrote:
>
>> This input has no information about implicit solvent, and a simple Google
>> search immediately yields mdp examples using GBSA. As far as performance is
>> concerned, we don't know the specs of your machine or the size of your
>> system. With cutoff electrostatics and a 5 nm cutoff, the cost can be
>> expected to scale considerably worse than linearly with system size.
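>>
>> For the record, the GB/SA-related settings look roughly like this (a
>> sketch from memory of the old mdp options, untested; check the manual
>> of your GROMACS version for exact names and defaults):
>>
>> implicit_solvent   = GBSA
>> gb_algorithm       = OBC   ; Still / HCT / OBC
>> nstgbradii         = 1     ; how often to recalculate Born radii
>> rgbradii           = 1.0   ; cutoff for Born radii (should match rlist)
>> gb_epsilon_solvent = 80    ; solvent dielectric constant
>> sa_algorithm       = Ace-approximation
>> sa_surface_tension = -1    ; use the default for the chosen gb_algorithm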
>>
>> Alex
>>
>>
>>
>> On 6/24/2018 1:04 AM, Chhaya Singh wrote:
>>
>>> Hi,
>>>
>>> I am trying to simulate a protein in implicit solvent in GROMACS using
>>> the Amber ff99SB-ILDN force field.
>>> The mdp file that I am using is shown below:
>>>
>>>
>>> integrator = md
>>> dt = 0.001 ;0.005 ; ps !
>>> nsteps = 5000000 ; total 5 ns.
>>>
>>> nstlog = 5000
>>> nstxout = 0 ;1000
>>> nstvout = 0 ;1000
>>> nstfout = 0 ;1000
>>> nstxtcout = 5000
>>> nstenergy = 5000
>>>
>>> nstlist = 10
>>>
>>> cutoff-scheme = group
>>>
>>> rlist = 5
>>> rvdw = 5
>>> rcoulomb = 5
>>> coulombtype = cut-off
>>> vdwtype = cut-off
>>> bd_fric = 0
>>> ;ld_seed = -1
>>> pbc = no
>>> ns_type = grid ;simple => gives domain decomposition error
>>> constraints = all-bonds
>>> lincs_order = 4
>>> lincs_iter = 1
>>> lincs-warnangle = 30
>>>
>>> Tcoupl = v-rescale
>>> tau_t = 1.0
>>> tc-grps = Protein
>>> ref_t = 310
>>>
>>>
>>>
>>> This is the mdp file that I am using for the equilibration and production
>>> runs; if there is anything I can fix in the mdp file, please let me know.
>>> I am getting very low speed using implicit solvent in GROMACS.
>>> Is there any way to increase the speed?
>>> The speed I am getting right now is 0.47-0.74 ns/day using one node.
>>> Please help.
>>>