[gmx-users] REMD slows down drastically
Christopher Neale
chris.neale at alum.utoronto.ca
Mon Feb 24 22:56:29 CET 2014
Presuming that you have indeed set up the number of processors correctly (you should be running on a different number of cores for a different number of replicas to make a fair comparison), could it be a thread pinning issue?
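For example, with 125 replicas and one core per replica, I would expect the launch to look roughly like one of the lines below (the -replex interval and the tpr name are just placeholders; with -multi, mdrun appends the replica index, so -s remd.tpr means remd0.tpr ... remd124.tpr, if I remember the naming correctly):

mpirun -np 125 mdrun -multi 125 -replex 1000 -s remd.tpr   # 1 rank per replica
mpirun -np 500 mdrun -multi 125 -replex 1000 -s remd.tpr   # 4 ranks per replica, 125*4=500 ranks in total

The point being that 125 replicas squeezed onto the same hardware that previously ran a single simulation will of course be roughly 125 times slower per replica, so the timing comparison is only meaningful at a fixed number of cores per replica.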
I run on a Nehalem system with 8 cores/node, but, because of Nehalem hyperthreading (I think), gromacs always complains if I run "mpirun -np $N mdrun", where $N is the number of physical cores:
NOTE: The number of threads is not equal to the number of (logical) cores
and the -pin option is set to auto: will not pin thread to cores.
This can lead to significant performance degradation.
Consider using -pin on (and -pinoffset in case you run multiple jobs).
However, if I use $N = 2 times the number of physical cores, I don't get that note; instead I get:
"Pinning threads with a logical core stride of 1"
As an aside, if anybody has a suggestion about how I should handle thread pinning in my case, or whether it matters at all, I would be happy to hear it (my throughput seems to be good, though).
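In case it is useful to anybody else, my reading of that NOTE is that it wants something like the following when two 8-thread jobs share a node (the offsets and the -deffnm names below are my guesses at sensible values, not something I have verified):

mpirun -np 8 mdrun -pin on -pinoffset 0 -deffnm jobA   # first job: pin starting at logical core 0
mpirun -np 8 mdrun -pin on -pinoffset 8 -deffnm jobB   # second job: offset so it does not share logical cores with the first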
Finally, this is off topic, but you might want to reconsider having the Cl ions in a separate temperature coupling group.
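For example, the more common choice would be to couple the ions together with the solvent, along these lines (keeping your XXXXX placeholders for the per-replica temperatures):

tc-grps = Protein Non-Protein ; ions coupled together with the solvent
tau_t   = 0.1     0.1
ref_t   = XXXXX   XXXXX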
Chris.
________________________________________
From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se <gromacs.org_gmx-users-bounces at maillist.sys.kth.se> on behalf of Singam Karthick <sikart21 at yahoo.in>
Sent: 24 February 2014 02:32
To: gromacs.org_gmx-users at maillist.sys.kth.se
Subject: [gmx-users] REMD slows down drastically
Dear members,
I am trying to run an REMD simulation of a poly-alanine (12-residue) system. I used the REMD temperature generator to obtain a range of temperatures with an exchange probability of 0.3, which gave me 125 replicas. When I try to simulate all 125 replicas, the simulation slows down drastically (it took around 17 hours for 70 picoseconds). Could anyone please tell me how to solve this issue?
Following is the MDP file:
title = G4Ga3a4a5 production.
;define = ;-DPOSRES ; position restrain the protein
; Run parameters
integrator = md ; leap-frog integrator
nsteps = 12500000 ; 2 fs * 12500000 = 25 ns
dt = 0.002 ; 2 fs
; Output control
nstxout = 0 ; do not save full-precision coordinates to the .trr file
nstvout = 10000 ; save velocities every 20 ps
nstxtcout = 500 ; save xtc coordinates every 1 ps
nstenergy = 500 ; save energies every 1 ps
nstlog = 100 ; update log file every 0.2 ps
; Bond parameters
continuation = yes ; Restarting after NVT
constraint_algorithm = lincs ; holonomic constraints
constraints = hbonds ; bonds involving hydrogen atoms constrained
lincs_iter = 1 ; accuracy of LINCS
lincs_order = 4 ; also related to accuracy
morse = no
; Neighborsearching
ns_type = grid ; search neighboring grid cells
nstlist = 5 ; 10 fs
rlist = 1.0 ; short-range neighborlist cutoff (in nm)
rcoulomb = 1.0 ; short-range electrostatic cutoff (in nm)
rvdw = 1.0 ; short-range van der Waals cutoff (in nm)
; Electrostatics
coulombtype = PME ; Particle Mesh Ewald for long-range electrostatics
pme_order = 4 ; cubic interpolation
fourierspacing = 0.16 ; grid spacing for FFT
; Temperature coupling is on
tcoupl = V-rescale ; modified Berendsen thermostat
tc-grps = protein SOL Cl ; three coupling groups - more accurate
tau_t = 0.1 0.1 0.1 ; time constant, in ps
ref_t = XXXXX XXXXX XXXXX ; reference temperature, one for each group, in K
; Pressure coupling is on
pcoupl = Parrinello-Rahman ; Pressure coupling on in NPT
pcoupltype = isotropic ; uniform scaling of box vectors
tau_p = 2.0 ; time constant, in ps
ref_p = 1.0 ; reference pressure, in bar
compressibility = 4.5e-5 ; isothermal compressibility of water, bar^-1
; Periodic boundary conditions
pbc = xyz ; 3-D PBC
; Dispersion correction
DispCorr = EnerPres ; account for cut-off vdW scheme
Regards,
Singam