[gmx-users] MPI GPU job failed
jkrieger at mrc-lmb.cam.ac.uk
Thu Aug 11 15:30:09 CEST 2016
I'd suggest installing another GROMACS version without MPI then. I imagine
your system doesn't have enough CPU nodes to support it, as you asked
for 2 and got 1. You could try the following first, though:
mpirun -np 2 gmx_mpi mdrun -ntomp 10 -v -s 62.tpr -gpu_id 01
That way you have 2 MPI processes with 10 OpenMP threads each, rather
than 1 MPI process with 20 OpenMP threads. I'm not sure whether it will
work, though.
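If you do go the non-MPI route, a thread-MPI build can still provide the two PP ranks. A rough sketch of the build and launch follows; the directory names, install prefix, and the exact -ntmpi/-ntomp split are illustrative and should be adapted to your machine:

```shell
# Sketch, assuming a GROMACS 5.1.3 source tree already unpacked;
# the build directory and thread counts here are hypothetical.
cd gromacs-5.1.3 && mkdir -p build && cd build
cmake .. -DGMX_GPU=ON    # leave out -DGMX_MPI=ON: thread-MPI + OpenMP build
make -j 4 && make install

# With the non-MPI binary, the rank count is set with -ntmpi rather
# than via mpirun: 2 thread-MPI ranks x 10 OpenMP threads, one GPU each.
gmx mdrun -ntmpi 2 -ntomp 10 -v -s 62.tpr -gpu_id 01
```

This avoids the "mismatching number of PP MPI processes and GPUs per node" error by giving mdrun one PP rank per user-selected GPU.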
> Hi, I used your suggested command line, but it failed with the following
> messages:
>
>
> -------------------------------------------------------
> Program gmx mdrun, VERSION 5.1.3
> Source code file:
> /home/albert/Downloads/gromacs/gromacs-5.1.3/src/gromacs/gmxlib/gmx_detect_hardware.cpp,
> line: 458
>
> Fatal error:
> Incorrect launch configuration: mismatching number of PP MPI processes
> and GPUs per node.
> gmx_mpi was started with 1 PP MPI process per node, but you provided 2
> GPUs.
> For more information and tips for troubleshooting, please check the
> GROMACS
> website at http://www.gromacs.org/Documentation/Errors
> -------------------------------------------------------
>
> Halting program gmx mdrun
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> Using 1 MPI process
> Using 20 OpenMP threads
>
> 2 GPUs user-selected for this run.
> Mapping of GPU IDs to the 1 PP rank in this node: 0,1
>
>
> -------------------------------------------------------
> Program gmx mdrun, VERSION 5.1.3
> Source code file:
> /home/albert/Downloads/gromacs/gromacs-5.1.3/src/gromacs/gmxlib/gmx_detect_hardware.cpp,
> line: 458
>
> Fatal error:
> Incorrect launch configuration: mismatching number of PP MPI processes
> and GPUs per node.
> gmx_mpi was started with 1 PP MPI process per node, but you provided 2
> GPUs.
> For more information and tips for troubleshooting, please check the
> GROMACS
> website at http://www.gromacs.org/Documentation/Errors
> -------------------------------------------------------
>
>
>
>
> On 08/11/2016 01:18 PM, jkrieger at mrc-lmb.cam.ac.uk wrote:
>> The problem is that you compiled GROMACS with MPI (hence the _mpi
>> suffix in your command). You therefore need to set the number of MPI
>> processes rather than threads. The appropriate command would instead
>> be the following:
>>
>> mpirun -np 2 gmx_mpi mdrun -v -s 62.tpr -gpu_id 01
>>
>> Alternatively, you could compile a different GROMACS version without MPI.
>> This will have thread-MPI and OpenMP by default if you leave out
>> -DGMX_MPI=ON from the cmake command.
>>
>> Best wishes
>> James
>