[gmx-users] strange GPU performance

Albert mailmd2011 at gmail.com
Mon Jul 11 16:02:22 CEST 2016


yes.

But the job fails from time to time:

vol 0.87! imb F 34% step 33600, will finish Sat Jul 23 17:26:12 2016

-------------------------------------------------------
Program gmx mdrun, VERSION 5.1.2
Source code file: /home/albert/Downloads/gromacs/gromacs-5.1.2/src/gromacs/mdlib/nbnxn_cuda/nbnxn_cuda.cu, line: 688

Fatal error:
cudaStreamSynchronize failed in cu_blockwait_nb: unknown error

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------

Halting parallel program gmx mdrun on rank 0 out of 4

-------------------------------------------------------
Program gmx mdrun, VERSION 5.1.2
Source code file: /home/albert/Downloads/gromacs/gromacs-5.1.2/src/gromacs/mdlib/nbnxn_cuda/nbnxn_cuda.cu, line: 688

Fatal error:
cudaStreamSynchronize failed in cu_blockwait_nb: unknown error

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------

Halting parallel program gmx mdrun on rank 1 out of 4
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------




On 07/11/2016 03:41 PM, Mark Abraham wrote:
> Hi,
>
> Why did you specify using 2 MPI ranks with 8 OpenMP threads per rank on a
> node with 10 cores and 2 GPUs? You want something that fills all the cores
> (and hyperthreads) e.g.
>
> mpirun -np 2 gmx_mpi mdrun
>
> or
>
> mpirun -np 4 gmx_mpi mdrun
>
> There are likely further improvements available along the lines Szilard
> suggests.
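
As a sketch of the second option (assuming the node exposes 20 hardware threads and the two GPUs have IDs 0 and 1; -ntomp, -gpu_id and -pin are standard mdrun options), the ranks and GPUs could be mapped explicitly:

mpirun -np 4 gmx_mpi mdrun -ntomp 5 -gpu_id 0011 -pin on

Here each of the 4 PP ranks runs 5 OpenMP threads (4 x 5 = 20 hardware threads), the -gpu_id string assigns ranks 0 and 1 to GPU 0 and ranks 2 and 3 to GPU 1, and -pin on keeps threads from migrating between cores.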


