[gmx-users] problem using more than 1 gpu on a single node - Not all bonded interactions have been properly assigned to the domain decomposition cells
Carlos Navarro
carlos.navarro87 at gmail.com
Tue Jul 2 15:00:34 CEST 2019
Dear gmx-users,
This is my first time running GROMACS on a server (I mainly work on a
workstation), and I'm having trouble using more than one GPU per job
efficiently. This is my script:
#!/bin/bash -x
#SBATCH --job-name=gro16AtTPC1
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=10
#SBATCH --cpus-per-task=4
#SBATCH --output=4gpu.%j
#SBATCH --error=4gpuerr.%j
#SBATCH --time=00:02:00
#SBATCH --gres=gpu:4
module load Intel/2019.3.199-GCC-8.3.0
module load ParaStationMPI/5.2.2-1
module load IntelMPI/2019.3.199
module load GROMACS/2019.1
export OMP_NUM_THREADS="${SLURM_CPUS_PER_TASK:-1}"
#################################
# --- DEFINE YOUR VARIABLES --- #
#################################
#
#
WORKDIR1=/p/project/chdd22/gromacs/benchmark/AtTPC1
cd $WORKDIR1
srun --gres=gpu:4 gmx mdrun -s md.tpr -deffnm test16-4gpu -resethway -dlb auto -ntmpi 4 -pin on -pinoffset 0 &
wait
#
# --- Exit this script
#
exit
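For comparison, a minimal single-node launch that skips srun's MPI ranks and lets GROMACS's built-in thread-MPI drive all four GPUs could look like the sketch below (the thread counts assume a 40-core node, and the file names are taken from the script above):

```shell
#!/bin/bash
# Sketch: one mdrun spanning all 4 GPUs via thread-MPI (no external MPI ranks).
# Assumes a 40-core node: 4 thread-MPI ranks x 10 OpenMP threads each.
export OMP_NUM_THREADS=10
gmx mdrun -s md.tpr -deffnm test16-4gpu \
          -ntmpi 4 -ntomp 10 \
          -nb gpu \
          -pin on
```

With this layout mdrun detects the available GPUs itself and assigns one PP rank per GPU; no explicit mapping is needed unless a custom assignment is wanted.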
and I'm getting the following error message:
Not all bonded interactions have been properly assigned to the domain
decomposition cells
A list of missing interactions:
Bond of 26920 missing 146
U-B of 118884 missing 877
Proper Dih. of 192452 missing 2623
Improper Dih. of 3822 missing 6
LJ-14 of 167422 missing 1572
From my understanding, GROMACS is not able to properly assign each video
card to its domain decomposition cell. Is there a way to solve this?
Some additional information:
System: ~200k atoms
Node: 40 cores + 40 threads
GPUs per node: 4 NVIDIA Tesla V100
If you need more info, just let me know.
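For reference, the opposite layout would be four independent mdrun instances, one per GPU, each pinned to its own block of 10 cores. A small sketch of how the pin offsets fall out on a 40-core node (the -ntomp/-pinoffset/-gpu_id values shown are illustrative, not from the original script):

```shell
#!/bin/sh
# Sketch: derive -ntomp/-pinoffset/-gpu_id values for 4 independent
# mdrun instances on a 40-core node (10 cores per instance).
NCORES=40
NJOBS=4
PER_JOB=$((NCORES / NJOBS))   # 10 cores per instance
for i in 0 1 2 3; do
  echo "instance $i: -ntomp $PER_JOB -pinoffset $((i * PER_JOB)) -gpu_id $i"
done
```

Each instance would then be launched in the background with -pin on so the pinned core ranges do not overlap.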
Best regards,
Carlos
--
----------
Carlos Navarro Retamal
Bioinformatic Engineering. PhD
Postdoctoral Researcher in Center for Bioinformatics and Molecular
Simulations
Universidad de Talca
Av. Lircay S/N, Talca, Chile
T: (+56) 712201 798
E: carlos.navarro87 at gmail.com or cnavarro at utalca.cl