[gmx-users] PME nodes
Carsten Kutzner
ckutzne at gwdg.de
Mon Jun 8 09:59:22 CEST 2009
On Jun 6, 2009, at 1:20 PM, XAvier Periole wrote:
>
> On Jun 6, 2009, at 1:08 PM, Justin A. Lemkul wrote:
>
>>
>>
>> XAvier Periole wrote:
>>> Dears,
>>> I am having trouble finding the best balance between the PME
>>> CPUs and the rest.
>>> I have played with the -rdd, -rcon and -npme options, but no
>>> setting stands out as clearly best.
>>> I'd appreciate it if some of you could share your experience in
>>> this matter, i.e. the number of PME nodes relative to the total
>>> number of CPUs used.
>>> I think this info has been discussed recently on the list, but
>>> the archive is not accessible.
>>> It may matter that my system contains about 70000 atoms, a
>>> protein in a bilayer.
>>
>> Some advice that I got from Berk long ago has worked beautifully
>> for me. You want a 3:1 PP:PME balance for a regular triclinic cell
>> (grompp will report the relative PME load as 25% if your parameters
>> create such a balance), 2:1 for an octahedron. My scaling has been
>> great using this information, without having to alter -rdd, -rcon,
>> etc.
>
> Thanks for the info. I arrived at more or less that ratio,
> although 2:1 PP:PME is sometimes better.
>
> However, my problem now is getting 256 CPUs to run more
> efficiently (ns/day) than 128 CPUs. Communication becomes the
> limiting factor ... I can't get it to go faster! The system might
> be too small, but I am not sure.
>
> I'll take a look at the CVS tool.
There is also a version for GROMACS 4.0.x available for download at
www.mpibpc.mpg.de/home/grubmueller/projects/MethodAdvancements/Gromacs/
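For reference, the 3:1 rule quoted above translates into a simple
first guess for -npme. This is only a sketch of a starting point, not
the optimum; the binary name mdrun_mpi and the topol.tpr file are
placeholders for whatever your installation uses:

```shell
# First guess for the number of PME-only nodes from the 3:1 PP:PME
# rule of thumb (use 2:1 for a truncated octahedron). The real
# optimum depends on the system and the network, so scan around it.
total=128                 # total number of MPI processes
npme=$((total / 4))       # 3:1 PP:PME -> one quarter of the ranks do PME
echo "mpirun -np $total mdrun_mpi -npme $npme -s topol.tpr"
# prints: mpirun -np 128 mdrun_mpi -npme 32 -s topol.tpr
```

The tool mentioned above automates exactly this kind of scan over
-npme values, so it is usually the quickest way to find the optimum.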
Regards,
Carsten
--
Dr. Carsten Kutzner
Max Planck Institute for Biophysical Chemistry
Theoretical and Computational Biophysics
Am Fassberg 11, 37077 Goettingen, Germany
Tel. +49-551-2012313, Fax: +49-551-2012302
http://www.mpibpc.mpg.de/home/grubmueller/ihp/ckutzne