[gmx-users] mpi/openmp setup to avoid #1578 on 5.0

Szilárd Páll pall.szilard at gmail.com
Fri Sep 12 18:10:12 CEST 2014

On Fri, Sep 12, 2014 at 5:36 PM, Johnny Lu <johnny.lu128 at gmail.com> wrote:
> Hi.
> I wonder if I run with the following, will I get bug 1578 of gromacs 5.0?
> The number of OpenMP threads was set by environment variable
> Using 12 MPI processes
> Using 1 OpenMP thread per MPI process

No. As the Redmine issue states, "Note that without OpenMP threads
there are no issues." - i.e. with a single OpenMP thread per rank you
are not affected. Unfortunately, the Redmine auto-update mechanism is
not working correctly, so the actual change that fixed the bug (6ba80a
from Aug 15) did not show up on the issue page. The fix is included in
the 5.0.1 release.
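For reference, a launch along those lines would look something like
the following - a sketch only, since the binary name and MPI launcher
depend on your installation (gmx_mpi and mpirun here are assumptions):

    # 12 MPI ranks, exactly 1 OpenMP thread per rank
    export OMP_NUM_THREADS=1
    mpirun -np 12 gmx_mpi mdrun -ntomp 1

Passing -ntomp 1 on top of OMP_NUM_THREADS=1 is redundant, but it
should make mdrun complain if the two ever disagree instead of
silently using a thread count you did not intend.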

> And, is it ok to use tpr of version 5.0.1 double precision on gromacs 5.0
> double precision?

Yes, that should work.
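A quick way to verify this on the 5.0 machine is simply to let it
parse the file, e.g. with gmx dump (topol.tpr is a placeholder):

    # if the 5.0 tools can read the 5.0.1-written file without
    # errors, mdrun should handle it as well
    gmx dump -s topol.tpr | head -n 20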

> I have 5.0.1 on one machine, but 5.0 on another machine.
> By the way, would gromacs 5.0.1 single precision with GPU have nearly as
> little truncation error (like cutting 3.212 to 3.2) as the double precision
> version, in NVE simulation?

No, but truncation error is typically not a big issue - unless you
have a very large system, where coordinate values can span 2-3 orders
of magnitude. Note that single precision does not simply chop decimals
as in your example; it carries roughly 7 significant decimal digits
(vs. ~16 for double), so the absolute resolution of a coordinate just
gets coarser the larger its value is.
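If you want to quantify the drift for your own setup, run the same
NVE .tpr with both builds and compare the slope of the total energy
over time - the file names below are placeholders:

    # extract the total energy time series from the energy file
    echo "Total-Energy" | gmx energy -f md.edr -o etot.xvg

A markedly larger linear drift in the single-precision run is where
the extra rounding error would show up.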


