[gmx-users] AVX libraries with GMX4.6.3
Mark Abraham
mark.j.abraham at gmail.com
Tue Sep 3 18:41:37 CEST 2013
On Tue, Sep 3, 2013 at 6:17 PM, Ali Sinan Saglam <asinansaglam at gmail.com> wrote:
> Hi Mark,
>
> 1e-9 is indeed an abnormally tight tolerance to use; I was just testing
> tighter values to see whether I could get better energy conservation.
Did you read all of what manual section 7.3 has to say about the
ewald_rtol parameter?
> I ran some 1 ns tests with different tolerances and compared the AVX and
> SSE instruction sets. I am still seeing a discrepancy between the two, as
> can be seen here:
> AVX : https://www.dropbox.com/s/glmhyj1l18va8xp/avx_tol_comp_2.png
> SSE : https://www.dropbox.com/s/6bwz4j9d0xywe2w/sse_tol_comp_2.png
> In the labels SET_TEMP_X, SET is the instruction set used to compile the
> binary and X means ewald_rtol = 1e-X. For X = 5, 6, 7 the two are basically
> the same, but for X = 8, 9 SSE conserves energy an order of magnitude
> better than AVX. For more normal values like X = 5 or 6 the acceleration
> types certainly agree, so this may not matter much, but I wanted to report
> what I have seen nonetheless.
Thanks, but as you will see in your .log files, those values of ewald_rtol
select an Ewald splitting parameter beta that, in concert with the FFT grid
in your .mdp, probably produces nonsense in the FFTs - overflow, underflow,
accumulation error, whatever. The acceleration setting affects how GROMACS
uses SIMD to prepare and/or post-process the FFT, but for all I know the
FFT result was already nonsense.
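
(For reference, a minimal sketch of the relation, assuming the manual's
definition that ewald_rtol is the relative strength of the direct-space
potential at rcoulomb, i.e. erfc(beta*rcoulomb) ~ ewald_rtol. This is an
illustration only, not the actual GROMACS code.)

    # Illustration: solve erfc(beta*rc) = ewald_rtol for beta by bisection.
    # Assumes the manual's definition of ewald_rtol; not the GROMACS source.
    from math import erfc

    def ewald_beta(rc, rtol, lo=0.0, hi=50.0, iters=100):
        # erfc(beta*rc) decreases monotonically with beta, so bisect on it.
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if erfc(mid * rc) > rtol:
                lo = mid   # beta too small: direct part still too strong
            else:
                hi = mid
        return 0.5 * (lo + hi)

    rc = 1.0  # rcoulomb from the .mdp (nm)
    for rtol in (1e-5, 1e-6, 1e-7, 1e-8, 1e-9):
        print(rtol, ewald_beta(rc, rtol))
    # Tighter ewald_rtol -> larger beta -> more of the interaction pushed
    # into reciprocal space, which a fixed fourierspacing/pme_order grid
    # may no longer resolve accurately.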
Mark
> Best,
> Ali Sinan Saglam
>
>
> On Fri, Aug 30, 2013 at 2:32 PM, Mark Abraham <mark.j.abraham at gmail.com> wrote:
>
>> An ewald_rtol of 1e-9 is a very long way from the normal range of values,
>> and your other electrostatics settings seem pretty normal. I'd expect
>> these results to be of doubtful value, but I am surprised that there is
>> (if there really is) a marked difference with acceleration type. I
>> suggest you try a normal simulation protocol and see how you go.
>>
>> Mark
>>
>> On Fri, Aug 30, 2013 at 7:31 PM, Ali Sinan Saglam
>> <asinansaglam at gmail.com> wrote:
>> > Hi Mark,
>> >
>> > I have tried GCC 4.7.1, GCC 4.7.2 and ICC 13 (with and without MKL; my
>> > colleague also tried double precision); all showed the same behavior.
>> >
>> > I will add the .mdp file I have used, but the problem persists for all
>> > the input files my colleagues tried (they prepared their own input
>> > files, independently of me). For example, the first graph is
>> > Barnase-Barstar in a large box (~100,000 atoms, explicit TIP3P waters,
>> > neutralized, plus a small amount of salt, roughly 60-70 NaCl ions), and
>> > the .mdp file used was:
>> > ;##### INTEGRATOR #####
>> > integrator           = md          ; Molecular dynamics
>> > dt                   = 0.002       ; Timestep (ps)
>> > nsteps               = 500000      ; Simulation duration (timesteps)
>> > nstcomm              = 1000        ; Center of mass motion removal interval (timesteps)
>> > comm-grps            = system      ; Remove center of mass motion of system
>> > ;##### ENSEMBLE #####
>> > Pcoupl               = no          ; Disable barostat
>> > tcoupl               = no          ; Disable thermostat
>> > ;##### BONDED INTERACTIONS #####
>> > constraints          = hbonds      ; Constrain bonds involving hydrogens
>> > constraint_algorithm = LINCS       ; Constrain bonds using LINCS
>> > lincs_iter           = 1           ; Number of LINCS iterations
>> > ;##### NONBONDED INTERACTIONS #####
>> > coulombtype          = PME         ; PME long-range electrostatics
>> > fourierspacing       = 0.1
>> > pme_order            = 4
>> > ewald_rtol           = 0.000000001
>> > pbc                  = xyz         ; Periodic boundary conditions
>> > rcoulomb             = 1.00        ; Short-range electrostatic cutoff (nm)
>> > ;rcoulomb_switch     = 0.9         ; Short-range electrostatic switch cutoff (nm)
>> > vdwtype              = Switch      ; Switch van der Waals interactions
>> > rvdw                 = 0.9         ; Van der Waals cutoff (nm)
>> > rvdw_switch          = 0.8         ; Van der Waals switch cutoff (nm)
>> > DispCorr             = EnerPres    ; Long-range dispersion correction to energy and pressure
>> > ns_type              = grid        ; Update neighbor list using grid
>> > nstlist              = 10          ; Neighbor list update interval (timesteps)
>> > rlist                = 1.00        ; Neighbor list cutoff (nm)
>> > continuation         = yes
>> > ;##### OUTPUT #####
>> > nstlog               = 500         ; Energy log output interval (timesteps)
>> > nstenergy            = 500000      ; Energy output interval (timesteps)
>> > nstxout              = 500000      ; Full-resolution trajectory output interval (timesteps)
>> > nstvout              = 500000      ; Full-resolution velocity output interval (timesteps)
>> > nstfout              = 500000      ; Full-resolution force output interval (timesteps)
>> > nstxtcout            = 500         ; Reduced-resolution trajectory output interval (timesteps)
>> > xtc-precision        = 10000       ; Reduced-resolution trajectory output precision
>> >
>> > The second graph is for EG5 (~130,000 atoms, I believe; I don't have
>> > the exact number, but I remember it is a bigger system than mine, with
>> > the same water model). I am not sure what my third colleague tested,
>> > but he saw similar results (his system heated to 400+ K), and we did
>> > not share any input files. All systems conserved energy fine with the
>> > SSE instruction set (my runs were losing energy at roughly
>> > 1.1 kcal/(mol ps)).
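
(Side note: a quick way to quantify that kind of drift is a linear fit to
the conserved-energy term over time; below is a minimal sketch, assuming a
plain two-column .xvg (time in ps, energy) such as g_energy can write. It is
an illustration, not part of any GROMACS tool, and the exact energy-term
name and units depend on your .mdp settings and g_energy selection.)

    # Sketch: estimate conserved-energy drift (slope) from a two-column .xvg.
    # Assumes whitespace-separated "time energy" rows and '@'/'#' header lines.
    import sys
    import numpy as np

    def drift_from_xvg(path):
        t, e = [], []
        with open(path) as fh:
            for line in fh:
                if line.startswith(('#', '@')):   # skip xmgrace headers/comments
                    continue
                cols = line.split()
                if len(cols) >= 2:
                    t.append(float(cols[0]))
                    e.append(float(cols[1]))
        slope, _intercept = np.polyfit(t, e, 1)   # linear fit: energy vs. time
        return slope                              # drift in energy units per ps

    if __name__ == '__main__':
        print('drift per ps:', drift_from_xvg(sys.argv[1]))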
>> >
>> > Best,
>> > Ali Sinan Saglam
>> >
>> >
>> > On Fri, Aug 30, 2013 at 12:03 PM, Mark Abraham
>> > <mark.j.abraham at gmail.com> wrote:
>> >
>> >> With what compiler? (e.g. consult mdrun -version) There are many known
>> >> bugs in early point versions of each minor release of gcc, for
>> >> example. This is why the installation instructions stress getting the
>> >> latest version of your compiler.
>> >>
>> >> Otherwise, what is your .mdp file and simulation system contents?
>> >>
>> >> Mark
>> >>
>> >> On Fri, Aug 30, 2013 at 5:48 PM, Ali Sinan Saglam
>> >> <asinansaglam at gmail.com> wrote:
>> >> > Hi,
>> >> >
>> >> > I have been running some energy conservation tests with GMX 4.6.3
>> >> > and have encountered some issues when using the AVX-256 instruction
>> >> > set.
>> >> >
>> >> > I first noticed that my systems were freezing on a cluster with
>> >> > Sandy Bridge CPUs, but not on a different cluster using SSE. After
>> >> > establishing that the problem was not in my configuration and setup,
>> >> > I compiled multiple copies of GROMACS and saw that only the AVX-256
>> >> > binaries showed this behavior.
>> >> >
>> >> > After a few colleagues reproduced the same results, I'm now pretty
>> >> > sure the problem is real. Two colleagues started completely
>> >> > independently and reproduced the same behavior on two different
>> >> > clusters and two desktop machines (all Sandy Bridge; in all cases
>> >> > AVX yielded really weird temperatures, sometimes exponential
>> >> > freezing, sometimes heating, etc.).
>> >> >
>> >> > GROMACS was compiled with -DGMX_CPU_ACCELERATION=AVX or =SSE4.1 (or
>> >> > any version of SSE), and that was the only difference between the
>> >> > two binaries. The simulation protocol was exactly the same apart
>> >> > from the mdrun binary that was used. The temperature results from my
>> >> > simulations can be seen here:
>> >> > https://www.dropbox.com/s/ul7g0fb4il17wxm/avx_sse_temp.png
>> >> > Another example, with a different person reproducing the results:
>> >> > https://www.dropbox.com/s/x7mr3kcgmvd78ie/adam_avx_sse_temp.png
>> >> >
>> >> > The CMake command was:
>> >> > cmake SOURCE -DGMX_BUILD_OWN_FFTW=on -DGMX_OPENMM=off -DGMX_MPI=off
>> >> >   -DGMX_GPU=off -DGMX_THREAD_MPI=on -DGMX_CPU_ACCELERATION=AVX (or SSE4.1)
>> >> > (I also tried -DGMX_THREAD_MPI=off, with similar results.)
>> >> >
>> >> > Also, all builds passed the regression tests.
>> >> >
>> >> > I just wondered whether this is a known problem, or whether we are
>> >> > making a mistake while compiling.
>> >> >
>> >> > Best,
>> >> > Ali Sinan Saglam
>> >
>> >
>> >
>> > --
>> > Ali Sinan Saglam
>> > Graduate Student in Chemistry
>> > Chong Lab, Room 338, Eberly Hall
>> > University of Pittsburgh
>> > Pittsburgh, PA 15260
>
>
>
> --
> Ali Sinan Saglam
> Graduate Student in Chemistry
> Chong Lab, Room 338, Eberly Hall
> University of Pittsburgh
> Pittsburgh, PA 15260