[gmx-users] remd error

Bratin Kumar Das 177cy500.bratin at nitk.edu.in
Mon Jul 29 13:55:27 CEST 2019


Hi Szilard,
               Thank you for your reply. I corrected it as you said. As a
trial I took 8 or 16 nodes... (-np 8) to test whether it runs at all. I gave
the following command to run REMD:
*mpirun -np 8 gmx_mpi_d mdrun -v -multi 8 -replex 1000 -deffnm remd*
After giving this command I get the following error:
Program:     gmx mdrun, version 2018.4
Source file: src/gromacs/utility/futil.cpp (line 514)
MPI rank:    0 (out of 32)

File input/output error:
remd0.tpr

For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
 I am not able to understand why this error occurs.
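
If mdrun with *-multi 8 -deffnm remd* looks for remd0.tpr through remd7.tpr
in the directory it is started from, then the .tpr files written into
equil0 ... equil7 by my grompp loop (quoted below) would never be found. A
minimal sketch of what I could try, assuming that is the cause:

# link each replica's .tpr file into the launch directory
# (sketch; paths assumed from the grompp loop quoted below)
for i in {0..7}; do ln -s equil${i}/remd${i}.tpr remd${i}.tpr; done
mpirun -np 8 gmx_mpi_d mdrun -v -multi 8 -replex 1000 -deffnm remd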

On Thu 25 Jul, 2019, 2:31 PM Szilárd Páll, <pall.szilard at gmail.com> wrote:

> This is an MPI / job scheduler error: you are requesting 2 nodes with
> 20 processes per node (=40 total), but starting 80 ranks.
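>
> A minimal sketch of a launch where the counts match (assuming the 2 x 20
> allocation from your qsub.sh; the total rank count also has to be
> divisible by the number of replicas):
>
> #PBS -l nodes=2:ppn=20       # 2 x 20 = 40 slots
> mpirun -np 40 gmx_mpi mdrun -v -s remd.tpr -multi 8 -replex 1000 -deffnm remd_equil
> # 40 ranks fill the allocation exactly and split into 8 replicas x 5 ranks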
> --
> Szilárd
>
> On Thu, Jul 18, 2019 at 8:33 AM Bratin Kumar Das
> <177cy500.bratin at nitk.edu.in> wrote:
> >
> > Hi,
> >    I am running an REMD simulation in GROMACS 2016.5. After generating
> > the multiple .tpr files, one in each directory, with the following command
> > *for i in {0..7}; do cd equil$i; gmx grompp -f equil${i}.mdp -c em.gro -p
> > topol.top -o remd$i.tpr -maxwarn 1; cd ..; done*
> > I ran *mpirun -np 80 gmx_mpi mdrun -s remd.tpr -multi 8 -replex 1000
> > -reseed 175320 -deffnm remd_equil*
> > It gave the following error:
> > There are not enough slots available in the system to satisfy the 40
> > slots that were requested by the application:
> >   gmx_mpi
> >
> > Either request fewer slots for your application, or make more slots
> > available for use.
> > --------------------------------------------------------------------------
> > I do not understand the error. Any suggestion would be highly
> > appreciated. The .mdp file and the qsub.sh file are attached below.
> >
> > qsub.sh:
> > #! /bin/bash
> > #PBS -V
> > #PBS -l nodes=2:ppn=20
> > #PBS -l walltime=48:00:00
> > #PBS -N mdrun-serial
> > #PBS -j oe
> > #PBS -o output.log
> > #PBS -e error.log
> > #cd /home/bratin/Downloads/GROMACS/Gromacs_fibril
> > cd $PBS_O_WORKDIR
> > module load openmpi3.0.0
> > module load gromacs-2016.5
> > NP=$(cat $PBS_NODEFILE | wc -l)
> > # mpirun --machinefile $PBS_NODEFILE -np $NP $(which gmx_mpi) mdrun -v -s nvt.tpr -deffnm nvt
> > # /apps/gromacs-2016.5/bin/mpirun -np 8 gmx_mpi mdrun -v -s remd.tpr -multi 8 -replex 1000 -deffnm remd_out
> > for i in {0..7}; do cd equil$i; gmx grompp -f equil${i}.mdp -c em.gro -r
> > em.gro -p topol.top -o remd$i.tpr -maxwarn 1; cd ..; done
> >
> > for i in {0..7}; do cd equil${i}; mpirun -np 40 gmx_mpi mdrun -v -s
> > remd.tpr -multi 8 -replex 1000 -deffnm remd${i}_out ; cd ..; done
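> >
> > # A sketch of a single launch instead of the loop above (assuming
> > # remd0.tpr ... remd7.tpr sit together in one directory): -multi 8
> > # starts all eight replicas from one mpirun, so no loop is needed,
> > # and -np 40 matches the 2x20 allocation while being divisible by 8
> > mpirun -np 40 gmx_mpi mdrun -v -s remd.tpr -multi 8 -replex 1000 -deffnm remd_out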