[gmx-users] Problem with OMP_NUM_THREADS=12 mpirun -np 16 mdrun_mpi

Szilárd Páll szilard.pall at cbr.su.se
Thu Aug 30 15:25:09 CEST 2012


On Thu, Aug 30, 2012 at 1:06 AM, Roland Schulz <roland at utk.edu> wrote:
> Hi,
>
> the OpenMM code is still under review. You can download it using

I guess you meant OpenMP. As far as I know, the OpenMM build has not
been tested much.

> git fetch https://gerrit.gromacs.org/gromacs refs/changes/83/1283/14 && git
> checkout FETCH_HEAD

I've been advising people to get the nbnxn_hybrid_acc branch until the
merge happens (which has the advantage that you don't have to check
which is the latest patch set). However, I just noticed that the latest
changes pushed to gerrit have not made it into that branch yet. I'll
push them there asap.
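
In the meantime, a minimal sketch of checking that branch out (assuming
it is published under the name nbnxn_hybrid_acc on the remote you
cloned from):

git fetch origin
git checkout --track -b nbnxn_hybrid_acc origin/nbnxn_hybrid_acc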

--
Szilárd


> You can check https://gerrit.gromacs.org/#/c/1283/ for the latest version
> of it (as of the time of writing, the command above fetches the latest).
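>
> (The ref pattern is refs/changes/<last two digits of the change
> number>/<change number>/<patch set number>, so a hypothetical newer
> patch set 15 would be fetched with refs/changes/83/1283/15.)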
>
> Roland
>
> On Wed, Aug 29, 2012 at 10:37 AM, jesmin jahan <shraban03 at gmail.com> wrote:
>
>> Thanks, David and Szilárd.
>>
>> I am attaching a log file that I got from my experiment. Please have a
>> look. It says gromacs version 4.6-dev; I am using
>> :-)  VERSION 4.6-dev-20120820-87e5bcf  (-:  of GROMACS.
>>
>> To download it, I used the commands given on the GROMACS website:
>>
>> git clone git://git.gromacs.org/gromacs.git
>> cd gromacs
>> git checkout --track -b release-4-6 origin/release-4-6
>>
>> and to configure it I used:
>>
>> cmake .. -DGMX_MPI=ON -DGMX_OPENMP=ON
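>>
>> For reference, the full sequence then looks roughly like this (the
>> build directory name and the final "make install" are my assumptions
>> about the steps not shown above):
>>
>> mkdir build && cd build
>> cmake .. -DGMX_MPI=ON -DGMX_OPENMP=ON
>> make
>> make install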
>>
>> Is it the case that a later version of 4.6 has this feature?
>>
>> Please let me know.
>>
>> Thanks,
>> Jesmin
>>
>> On Wed, Aug 29, 2012 at 4:27 AM, Szilárd Páll <szilard.pall at cbr.su.se>
>> wrote:
>> > On Wed, Aug 29, 2012 at 5:32 AM, jesmin jahan <shraban03 at gmail.com>
>> wrote:
>> >> Dear All,
>> >>
>> >> I have installed GROMACS VERSION 4.6-dev-20120820-87e5bcf with
>> >> -DGMX_MPI=ON. I am assuming that, since OpenMP support is on by
>> >> default, it will be enabled automatically.
>> >>
>> >> My compiler is
>> >> /opt/apps/intel11_1/mvapich2/1.6/bin/mpicc (Intel icc (ICC) 11.1 20101201)
>> >>
>> >> And I am using OMP_NUM_THREADS=12 mpirun -np 16 mdrun_mpi -s imd.tpr
>> >>
>> >> I was hoping this would run 16 processes, each with 12 threads.
>> >> However, in the log file I saw something like this:
>> >>
>> >>  R E A L   C Y C L E   A N D   T I M E   A C C O U N T I N G
>> >>
>> >>  Computing:         Nodes     Number     G-Cycles    Seconds     %
>> >> -----------------------------------------------------------------------
>> >>  Domain decomp.        16          1        0.027        0.0     1.8
>> >>  Comm. coord.          16          1        0.002        0.0     0.1
>> >>  Neighbor search       16          1        0.113        0.1     7.7
>> >>  Force                 16          1        1.236        0.8    83.4
>> >>  Wait + Comm. F        16          1        0.015        0.0     1.0
>> >>  Update                16          1        0.005        0.0     0.4
>> >>  Comm. energies        16          1        0.008        0.0     0.5
>> >>  Rest                  16                   0.076        0.0     5.1
>> >> -----------------------------------------------------------------------
>> >>  Total                 16                   1.481        0.9   100.0
>> >> -----------------------------------------------------------------------
>> >>
>> >>
>> >> It's not clear whether each of the 16 nodes runs 12 threads
>> >> internally or not.
>> >
>> > No, it isn't. That output is not from 4.6; with 4.6 you should see an
>> > extra column with the number of threads.
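>> >
>> > One way to check (a sketch, assuming a 4.6 build): set the per-process
>> > thread count explicitly with mdrun's -ntomp option instead of the
>> > environment variable,
>> >
>> > mpirun -np 16 mdrun_mpi -ntomp 12 -s imd.tpr
>> >
>> > and then look near the top of the log, where 4.6 reports the number of
>> > MPI processes and OpenMP threads per process.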
>> >
>> > --
>> > Szilárd
>> >
>> >
>> >> If anyone knows about this, please let me know.
>> >>
>> >> Thanks for help.
>> >>
>> >> Best Regards,
>> >> Jesmin
>> >>
>> >>
>> >>
>> >> --
>> >> Jesmin Jahan Tithi
>> >> PhD Student, CS
>> >> Stony Brook University, NY-11790.
>> >> --
>> >> gmx-users mailing list    gmx-users at gromacs.org
>> >> http://lists.gromacs.org/mailman/listinfo/gmx-users
>> >> * Please search the archive at
>> http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
>> >> * Please don't post (un)subscribe requests to the list. Use the
>> >> www interface or send it to gmx-users-request at gromacs.org.
>> >> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>> > --
>> > gmx-users mailing list    gmx-users at gromacs.org
>> > http://lists.gromacs.org/mailman/listinfo/gmx-users
>> > * Please search the archive at
>> http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
>> > * Please don't post (un)subscribe requests to the list. Use the
>> > www interface or send it to gmx-users-request at gromacs.org.
>> > * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>>
>>
>>
>> --
>> Jesmin Jahan Tithi
>> PhD Student, CS
>> Stony Brook University, NY-11790.
>>
>
>
>
> --
> ORNL/UT Center for Molecular Biophysics cmb.ornl.gov
> 865-241-1537, ORNL PO BOX 2008 MS6309


