[gmx-users] Regarding parallel run in Gromacs 5.0

Mark Abraham mark.j.abraham at gmail.com
Fri Dec 5 12:51:15 CET 2014


On Fri, Dec 5, 2014 at 12:36 PM, Bikash Ranjan Sahoo <
bikash.bioinformatic at protein.osaka-u.ac.jp> wrote:

> Dear Dr. Abraham,
>   I have attached the md.log file.
>

Please follow the installation instructions and use a more recent compiler
- gcc 4.3 is nearly prehistoric these days, and has bugs for the kind of
SIMD usage Gromacs needs. I'd also guess it has problems with reporting the
number of hardware threads, since 276 and 288 are nonsense for your machine.
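
If a newer gcc is available on your cluster (e.g. via a module), you can
point cmake at it explicitly. A rough sketch, assuming gcc 4.8 lives under
/opt/gcc-4.8 (adjust the paths to whatever your machine actually has):

cmake .. -DCMAKE_C_COMPILER=/opt/gcc-4.8/bin/gcc \
         -DCMAKE_CXX_COMPILER=/opt/gcc-4.8/bin/g++ \
         -DGMX_MPI=ON -DGMX_BUILD_OWN_FFTW=ON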

> The command that I executed was *"mdrun_mpi -s em.tpr -nt 20".*
>

That's different from last time...


> Kindly suggest how I can run using 20 CPUs. I will be highly obliged if
> you kindly send me the exact command for parallel minimization, or the
> exact command for re-installation of Gromacs 5.0.2 in parallel. I am a
> little confused about the
>

Please read
http://www.gromacs.org/Documentation/Acceleration_and_parallelization#section_12.
Your use of -nt on the command line is potentially useful only for a
thread-MPI-enabled Gromacs, and you have compiled Gromacs to use real MPI.
This is exactly what the end of the .log file tells you. Please read it
:-).
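
With the real-MPI build you have, the number of processes is set by the
MPI launcher rather than by mdrun itself. Something along these lines
should start the minimization on 20 cores (a sketch only - the exact
launcher name and options depend on your MPI installation and queuing
system):

mpirun -np 20 mdrun_mpi -v -s em.tpr -deffnm em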

Mark


> *cmake .. rest flags ~ ~ ~.*
>
>
> The full installation command I ran was
>
> *cmake .. -DGMX_BUILD_OWN_FFTW=ON
> -DCMAKE_INSTALL_PREFIX:PATH=/user1/tanpaku/bussei/bics at 1986/Bikash/cmake/gro/gro5.0
> -DGMX_MPI=ON -DGMX_BUILD_SHARED_LIBS=ON*
>
>
> Is it right? I have all the _mpi binaries in my Gromacs installation, and
> pdb2gmx_mpi, editconf_mpi and grompp_mpi are running fine. Kindly suggest.
>
>
> Thanking you
> In anticipation of your reply
> Bikash, Osaka Japan
>
> On Fri, Dec 5, 2014 at 8:25 PM, Mark Abraham <mark.j.abraham at gmail.com>
> wrote:
>
>> Hi,
>>
>> On Fri, Dec 5, 2014 at 8:21 AM, Bikash Ranjan Sahoo <
>> bikash.bioinformatics at gmail.com> wrote:
>>
>> > Dear All
>> >
>> > I have upgraded my Gromacs v4.5.5 to 5.0.
>>
>>
>> Please get the latest 5.0.2 release, in that case ;-)
>>
>> > I am unable to run parallel
>> > minimization and simulation on the cluster. Previously (v4.5.5) I was
>> > simulating using the command "mdrun -v -s em.tpr -nt 30 &" and
>> > thus assigning 30 CPUs to the system. However, in the new version it
>> > is not working, and gives the errors mentioned below. The command from
>> > Dr. Justin's tutorial "gmx mdrun -v -deffnm em" is also not working on
>> > my cluster installation, but runs fine on my local computer. Kindly
>> > help me run minimization using mdrun in parallel.
>> >
>> >
>> >
>> > GROMACS: gmx mdrun, VERSION 5.0
>> > Executable: Bikash/cmake/gro/gro5.0/bin/gmx
>> > Library dir: Bikash/cmake/gro/gro5.0/share/gromacs/top
>> > Command line:
>> > gmx mdrun -nt 30 -deffnm em
>> >
>> >
>> > Back Off! I just backed up em.log to ./#em.log.1#
>> >
>> > Number of hardware threads detected (288) does not match the number
>> > reported by OpenMP (276).
>> >
>>
>> That's pretty bizarre. What kind of computer is this? Can you share your
>> whole log file (e.g. on a file-sharing service), please?
>>
>>
>> > Consider setting the launch configuration manually!
>> > Reading file em.tpr, VERSION 5.0 (single precision)
>> > The number of OpenMP threads was set by environment variable
>> > OMP_NUM_THREADS to 6
>> >
>> > Non-default thread affinity set, disabling internal thread affinity
>> > Using 5 MPI threads
>> >
>>
>> Gromacs 4.6 and up are able to use OpenMP with the Verlet cut-off scheme,
>> so the interpretation of -nt 30 changes if the standard environment
>> variable OMP_NUM_THREADS is set. In your case the cluster / job script is
>> probably setting it for you for some reason. You can force the old
>> interpretation with -ntmpi 30, or you can set the variable more
>> appropriately with
>>
>> export OMP_NUM_THREADS=1
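>>
>> i.e., with the thread-MPI build you ran above, something like this
>> should give you 30 ranks (a sketch only - whether 30 makes sense depends
>> on how many cores the node really has):
>>
>> gmx mdrun -v -deffnm em -ntmpi 30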
>>
>> > Segmentation fault
>> >
>>
>> However, this should never happen, so learning more from the log file
>> would
>> be valuable.
>>
>> Mark
>>
>>
>> >
>> >
>> > Thanking you
>> > In anticipation of your reply
>> > Bikash
>> > Osaka, Japan
>>
>
>

