[gmx-users] MPI and OpenMP threads

Szilárd Páll pall.szilard at gmail.com
Tue Feb 16 16:40:57 CET 2016


On Tue, Feb 16, 2016 at 2:50 PM, Alexander Alexander
<alexanderwien2k at gmail.com> wrote:
> Thanks for your response.
>
> I hope this is what you mean by "what has been used in the parallel environment":
>
> -------------------------------------------------------
> hard resource_list:         h_rt=259000,h_vmem=6500M,h_stack=256M
>
> env_list:                   PSM_RANKS_PER_CONTEXT=4
>
> parallel environment:  smp range: 32
>
> verify_suitable_queues:     1
>
> scheduling info:            (Collecting of scheduler job information is
> turned off)
> -----------------------------------------------------------

No, I mean: what's the MPI library, compiler + OpenMP library, job
scheduler, hardware, etc.? Have you tried varying any of these to
eliminate the error? If you have not, you should start with that, or
ask your admins to do that for you.
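
For reference, this is roughly the kind of detail I mean (a sketch only;
the command names are my assumptions, with SGE/Grid Engine guessed from
your "#$ -pe" lines, so adapt them to whatever is actually installed):

-----------------------------------------------------------
mpirun --version      # which MPI library and version, if one is loaded
gcc --version         # compiler, and hence the libgomp mdrun runs with
qconf -sp smp         # how the "smp" parallel environment is configured
qconf -sp orte_sl32   # same for the orte_* environment
-----------------------------------------------------------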

I have little clue what the scheduler output you pasted refers to, nor
am I sure what exactly "#$ -pe smp" and "#$ -pe orte_sl32*" in your
original email mean.

--
Szilárd

> Cheers,
> Alex
>
>
> On Tue, Feb 16, 2016 at 2:19 PM, Szilárd Páll <pall.szilard at gmail.com>
> wrote:
>
>> On Tue, Feb 16, 2016 at 1:49 PM, Alexander Alexander
>> <alexanderwien2k at gmail.com> wrote:
>> > Dear Gromacs user,
>> >
>> > GROMACS works fine for me with the first group of parallel-environment
>> > and thread settings below, but it crashes with the second group,
>> > complaining:
>> >
>> > libgomp: Thread creation failed: Resource temporarily unavailable
>> > Thread creation failed: Resource temporarily unavailable
>>
>> What exactly are you using in that "parallel environment"? The error
>> is somewhere in there, so without some details about it, it's hard to
>> tell. My guess is that (assuming you're using MPI) your MPI
>> environment is screwed up or misbehaving in some way.
>>
>> BTW, you don't need MPI if you want to run on that one node with 64 cores!
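>>
>> Something along these lines should do with plain thread-MPI (a sketch;
>> pick -ntmpi and -ntomp so that their product matches the slots you
>> request):
>>
>> ------------------------------------
>> #$ -pe smp 64
>>
>> gmx mdrun -deffnm nvt -s nvt.tpr -ntmpi 16 -ntomp 4
>> ------------------------------------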
>>
>> --
>> Szilárd
>>
>> >
>> > Works fine  :-)
>> > ----------------------------------
>> > #$ -pe smp 16
>> >
>> > gmx mdrun -deffnm nvt -s nvt.tpr -ntomp 4 -ntmpi 4
>> > ------------------------------------
>> > #$ -pe smp 6
>> >
>> > gmx mdrun -deffnm nvt -s nvt.tpr -ntomp 2 -ntmpi 3
>> > --------------------------------------
>> >
>> >
>> > Crashed  :-(
>> > ------------------------------------
>> > #$ -pe smp 32
>> >
>> > gmx mdrun -deffnm nvt -s nvt.tpr -ntomp 4 -ntmpi 8
>> > -------------------------------------
>> > #$ -pe smp 32
>> >
>> > gmx mdrun -deffnm nvt -s nvt.tpr -ntomp 2 -ntmpi 16
>> > -------------------------------------
>> > #$ -pe orte_sl32* 64
>> >
>> > gmx mdrun -deffnm nvt -s nvt.tpr -ntomp 32 -ntmpi 2
>> > ---------------------------------------
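>> >
>> > In case it helps, this is how I would check the limits inside one of
>> > the failing jobs (just a sketch, I have not verified the exact values
>> > on the compute nodes):
>> >
>> > -------------------------------------
>> > #$ -pe smp 32
>> >
>> > ulimit -a                           # per-process limits inside the job
>> > cat /proc/sys/kernel/threads-max    # system-wide thread limit
>> > gmx mdrun -deffnm nvt -s nvt.tpr -ntomp 4 -ntmpi 8
>> > -------------------------------------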
>> >
>> > Here are the properties of the machine that I am using for GROMACS:
>> >
>> > Running on 1 node with total 64 cores, 64 logical cores
>> > Hardware detected:
>> >   CPU info:
>> >     Vendor: AuthenticAMD
>> >     Brand:  AMD Opteron(TM) Processor 6276
>> >     SIMD instructions most likely to fit this hardware: AVX_128_FMA
>> >     SIMD instructions selected at GROMACS compile time: AVX_128_FMA
>> >
>> >
>> > Does anybody know what the reason might be, please?
>> >
>> > Best regards,
>> > Alex