[gmx-developers] Trying to understand performance issue - due to all threads pinning to first core?

Szilárd Páll szilard.pall at cbr.su.se
Thu Dec 20 02:34:54 CET 2012


Hi,

As Roland suggested, the automated thread affinity setting will end up
locking multiple mdrun processes started on the same machine to the first N
cores (N = the number of threads mdrun is started with).

Thread pinning is really a must on OpenMP and NUMA platforms (without it
you can lose up to 50% of the performance, perhaps even more depending on
how dumb your kernel is), and it gives a small advantage even without
OpenMP (I haven't seen more than 5%). Linux kernels by default spread
threads across NUMA regions, which maximizes the memory throughput of an
application, but this is almost always the worst strategy when it comes to
inter-thread communication. While for the sake of convenience we could
disable affinity setting when not running with OpenMP, that would lead to
inconsistent mdrun behavior, one that depends on the command-line options
in a non-obvious manner.
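
To make the clash concrete, here is a minimal standalone sketch (plain
pthreads on Linux, not actual GROMACS code; the thread count and the
thread-to-core mapping are invented for the example) of what the automated
affinity setting effectively does: each process pins its threads to logical
cores 0..N-1, with no knowledge of other processes on the node.

    /* minimal sketch, not GROMACS code; Linux/glibc (_GNU_SOURCE) only */
    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdio.h>

    static void pin_self_to_core(int core)
    {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        /* every process computes the same core index from its local
         * thread rank, unaware of other processes on the node */
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    static void *worker(void *arg)
    {
        pin_self_to_core((int)(long)arg); /* thread t -> core t */
        return NULL;
    }

    int main(void)
    {
        enum { N = 4 }; /* e.g. mdrun -nt 4 */
        pthread_t th[N];
        for (long t = 0; t < N; t++)
            pthread_create(&th[t], NULL, worker, (void *)t);
        for (int t = 0; t < N; t++)
            pthread_join(th[t], NULL);
        printf("pinned %d threads to cores 0..%d\n", N, N - 1);
        return 0;
    }

Start two copies of this and both will report cores 0..3, which is exactly
the oversubscription reported in this thread.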

Sander suggested a way to detect the multiple-mdrun-per-node case and avoid
it in an elegant manner by implementing a small mdrun-mdrun communication
interface (sockets would probably be the best fit). This would let mdrun
have an optional discovery phase of a few seconds during which all other
running mdrun-s communicate which cores they are pinned to, enabling the
newly starting mdrun to avoid clashes.
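
Purely as an illustration of the idea (the socket path, message format, and
function name below are invented, not an agreed design), the client side of
such a discovery phase could look roughly like this; each running mdrun
would additionally listen on the same path and reply with the cores it
occupies:

    /* hypothetical discovery client, not an actual mdrun interface */
    #define _GNU_SOURCE
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/un.h>
    #include <unistd.h>

    #define DISCOVERY_PATH "/tmp/mdrun-affinity" /* invented path */

    /* ask an already-running peer which cores it is pinned to;
     * returns 0 and fills 'reply' (e.g. "0-3") on success */
    static int query_peer_cores(char *reply, size_t len)
    {
        struct sockaddr_un addr = { .sun_family = AF_UNIX };
        int fd = socket(AF_UNIX, SOCK_STREAM, 0);
        ssize_t n;

        if (fd < 0)
            return -1;
        strncpy(addr.sun_path, DISCOVERY_PATH, sizeof(addr.sun_path) - 1);
        if (connect(fd, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            close(fd); /* no peer running: free to pin from core 0 */
            return -1;
        }
        n = read(fd, reply, len - 1);
        close(fd);
        if (n < 0)
            return -1;
        reply[n] = '\0';
        return 0;
    }

A newly starting mdrun would call this during the discovery phase and shift
its own pinning past whatever ranges the peers report.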

However, we will probably have no time to add this minor feature to mdrun
before 4.6, unless somebody could help with the implementation.

Does anybody know of another way to detect, in general, that some other
process is pinned to a certain CPU?
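
For what it's worth, one Linux-specific approach could be the following
untested sketch (walking /proc is my assumption, not something mdrun does):
sched_getaffinity() can read the affinity mask of an arbitrary PID, and a
process whose mask covers fewer cores than the machine has is presumably
pinned.

    /* untested sketch: list processes with a restricted affinity mask */
    #define _GNU_SOURCE
    #include <ctype.h>
    #include <dirent.h>
    #include <sched.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(void)
    {
        long ncores = sysconf(_SC_NPROCESSORS_ONLN);
        DIR *proc = opendir("/proc");
        struct dirent *ent;

        if (!proc)
            return 1;
        while ((ent = readdir(proc)) != NULL) {
            pid_t pid;
            cpu_set_t mask;

            if (!isdigit((unsigned char)ent->d_name[0]))
                continue; /* not a PID directory */
            pid = (pid_t)atol(ent->d_name);
            if (sched_getaffinity(pid, sizeof(mask), &mask) != 0)
                continue; /* process gone or not readable */
            if (CPU_COUNT(&mask) < ncores)
                printf("pid %ld restricted to %d of %ld cores\n",
                       (long)pid, CPU_COUNT(&mask), ncores);
        }
        closedir(proc);
        return 0;
    }

This only shows that a mask is restricted, not whether it was set by
another mdrun, by taskset, or by a batch system, so at best it could serve
as a heuristic.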

Cheers,
--
Szilárd



On Wed, Dec 19, 2012 at 8:16 PM, Shirts, Michael (mrs5pt) <
mrs5pt at eservices.virginia.edu> wrote:
