[gmx-users] large scaling required to achieve optimal mesh load

Berk Hess gmx3 at hotmail.com
Fri Sep 11 10:07:50 CEST 2009


Hi,

You don't say what your system looks like, so it is difficult to give
proper advice.
As far as the electrostatics are concerned, this scaling has very little
effect on the accuracy of the results.
For the Lennard-Jones interactions, however, the longer cut-off puts many
more pairs in the interaction list. If you were using dispersion
correction, the longer LJ cut-off will make the results more accurate.
If you were not using dispersion correction, you will get a lower
pressure or a higher density (depending on whether you are simulating
NVT or NPT).

A low particle density will usually cause such a large shift in the optimal settings.
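
For example (a minimal sketch, not a recommendation for your system; the
values below are only illustrative), scaling the cut-offs and the grid
spacing by the same factor, with dispersion correction switched on, could
look like this in the .mdp file:

coulombtype     = PME
rlist           = 1.2
rcoulomb        = 1.2       ; scaled together with fourierspacing
rvdw            = 1.2
fourierspacing  = 0.16      ; 0.12 * (1.2 / 0.9)
DispCorr        = EnerPres  ; analytic correction for the truncated LJ tail

The PME accuracy then stays roughly the same, because the Ewald splitting
parameter is determined from rcoulomb and ewald_rtol, and the grid spacing
grows in proportion to the real-space cut-off.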

Berk

> Date: Thu, 10 Sep 2009 15:35:49 +0100
> From: Jennifer.Williams at ed.ac.uk
> To: gmx-users at gromacs.org
> Subject: [gmx-users] large scaling required to achieve optimal mesh load
> 
> Hello users,
> 
> I am simulating a unit cell with dimensions of 70x70x38 Å (i.e. about
> 7.0 x 7.0 x 3.8 nm) using PME. I started out with cut-offs of
> rvdw = rcoulomb = rlist = 0.9 nm and a spacing for the PME/PPPM FFT
> grid of 0.12 nm, with optimize fft = yes.
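> 
> For reference, that corresponds roughly to the following .mdp lines
> (just a sketch, with all other parameters omitted):
> 
> coulombtype     = PME
> rlist           = 0.9
> rcoulomb        = 0.9
> rvdw            = 0.9
> fourierspacing  = 0.12
> optimize_fft    = yes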
> 
> I get the following output when I run grompp to generate the .tpr file:
> 
> Using a fourier grid of 60x60x33, spacing 0.117 0.117 0.117
> Estimate for the relative computational load of the PME mesh part: 0.97
> 
> NOTE 1 [file SMO_CO2.top, line 2159]:
>    The optimal PME mesh load for parallel simulations is below 0.5
>    and for highly parallel simulations between 0.25 and 0.33,
>    for higher performance, increase the cut-off and the PME grid spacing
> 
> I did a number of test runs, increasing the cut-offs and the grid
> spacing by the same factor. However, I had to nearly double both the
> cut-off and the grid spacing (a factor of about 1.85) in order to get
> the PME mesh load below 0.5. From the forum notes on this topic I got
> the impression that only a small scaling factor would be needed.
> 
> My question is, are the values which I have achieved reasonable?
> 
> Cut-off: 1.665 nm and grid spacing: 0.222 nm
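> 
> Written as .mdp lines, that is roughly:
> 
> rlist           = 1.665
> rcoulomb        = 1.665
> rvdw            = 1.665
> fourierspacing  = 0.222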
> 
> This is the output using these values....
> 
> Checking consistency between energy and charge groups...
> Calculating fourier grid dimensions for X Y Z
> Using a fourier grid of 32x32x18, spacing 0.219 0.219 0.215
> Estimate for the relative computational load of the PME mesh part: 0.38
> This run will generate roughly 63 Mb of data
> writing run input file...
> 
> Does changing these values have any effect on the results of mdrun,
> or only on the speed?
> 
> Thanks in advance,
> 
> Jenny
