[gmx-users] parallelized gromacs run creating the same files many times -reg
Mark Abraham
mark.j.abraham at gmail.com
Thu Feb 1 14:59:19 CET 2018
Hi,
Yes, that's another point of failure - the command to launch the MPI
executable needs to provide the correct context, including linking the
right libraries.
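For example (a hedged sketch only; the module name is taken from the job
script below, so adjust it to whatever your cluster actually provides), you
can check that the launcher and the binary come from the same MPI
installation before submitting:

    # load the MPI stack that was used to build gmx_mpi (placeholder module name)
    module load openmpi-3.0

    # the mpirun found here and the MPI library the binary links against should match
    which mpirun
    ldd /app/gromacs-2016.4_plumbSWpatch/bin/gmx_mpi | grep -i libmpi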
Mark
On Thu, Feb 1, 2018 at 2:40 PM Peter Kroon <p.c.kroon at rug.nl> wrote:
> Hi,
>
> Digging through my old submission scripts shows that when I ran GROMACS
> with `srun`, MPI communication would not work and I would end up with a
> lot of files. If I ran with `mpirun`, everything worked as expected. That
> is why I blamed my cluster and their OpenMPI installation. I'm not sure
> this is what the OP is experiencing.
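>
> A minimal sketch of what that difference can look like (assuming a Slurm
> cluster and an OpenMPI built with matching PMI/PMIx support; the --mpi
> value is site-specific and only an assumption here):
>
>     # Slurm's launcher needs the PMI plugin that matches the MPI build
>     srun --mpi=pmix -n 32 gmx_mpi mdrun -v -s topol
>
>     # OpenMPI's own launcher discovers the allocation by itself
>     mpirun -np 32 gmx_mpi mdrun -v -s topol
>
> When the PMI integration doesn't match, each rank comes up as an
> independent single-process run, which produces exactly this kind of
> duplicated output.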
>
>
> Peter
>
>
> On 01-02-18 12:24, Mark Abraham wrote:
> > Hi,
> >
> > The fact that the binary you ran was named gmx_mpi yet does not seem to be
> > compiled with MPI is suspicious. Either the cmake process that was used
> > handled suffixes explicitly (which is unnecessary if all you want is the
> > standard behaviour) and got it wrong, or we implemented the standard
> > suffixing wrong (but it has been fine for years and we haven't touched it
> > lately), or something is broken with your wrapper compiler.
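> >
> > A quick, hedged way to see what the wrapper compiler actually wraps
> > (OpenMPI's wrappers accept --showme; Intel MPI's wrappers use -show for
> > the same purpose):
> >
> >     mpicc --showme
> >     mpicxx --showme
> >
> > The output should name the underlying compiler and the MPI link flags; if
> > it names the wrong base compiler, that is the place to start fixing things.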
> >
> > It is very likely that you have not succeeded in compiling with MPI
> > support. Inspect the report in md.log to see what it says about the MPI
> > library. If it confirms that MPI support is missing, then you need to
> > consult the cluster docs/admins to find out how to direct the MPI wrapper
> > compiler to use the intended base compiler (here icc) and give that to
> > cmake. Additionally, setting cmake -DGMX_MPI=on will make sure that you
> > can't accidentally build a non-MPI version of GROMACS.
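> >
> > A minimal configure sketch along those lines (the wrapper names and
> > install prefix are assumptions about your cluster, not something verified
> > here):
> >
> >     cmake .. \
> >         -DGMX_MPI=on \
> >         -DCMAKE_C_COMPILER=mpicc \
> >         -DCMAKE_CXX_COMPILER=mpicxx \
> >         -DCMAKE_INSTALL_PREFIX=/app/gromacs-2016.4-mpi
> >
> > With -DGMX_MPI=on, configuration stops with an error if no working MPI
> > compiler and library are found, and the installed binary gets its gmx_mpi
> > suffix automatically.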
> >
> > Mark
> >
> > On Thu, Feb 1, 2018 at 11:40 AM Peter Kroon <p.c.kroon at rug.nl> wrote:
> >
> >> Hi Venkat,
> >>
> >>
> >> I've seen similar behaviour with OpenMPI and a home-patched version of
> >> Gromacs. I blamed OpenMPI/the cluster and contacted the admins (but I
> >> don't remember what the result was). In the end I solved/worked around
> >> the issue by compiling Gromacs with IntelMPI.
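> >>
> >> (A hedged sketch of what an IntelMPI-based configuration can look like,
> >> with the wrapper names assuming the Intel compilers underneath:
> >>
> >>     CC=mpiicc CXX=mpiicpc cmake .. -DGMX_MPI=on
> >>
> >> mpiicc and mpiicpc are Intel MPI's wrappers around icc and icpc.)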
> >>
> >>
> >> HTH
> >>
> >> Peter
> >>
> >>
> >> On 01-02-18 06:11, venkat wrote:
> >>> Hello,
> >>> I installed GROMACS 2016.4 (compiled with intel-parallel-studio-2016 and
> >>> gcc 4.8.5). When I tried to run GROMACS in parallel on 32 cores, I saw
> >>> multiple instances of mdrun, each one using only one MPI process. There
> >>> are also many GROMACS headers in the log file, and in the directory the
> >>> log, ene, and traj_comp files are being created multiple times.
> >>>
> >>> I am hoping for your suggestions to rectify this issue.
> >>>
> >>> Thank you
> >>>
> >>> ################ JOB SUBMISSION SCRIPT USED ################
> >>> #! /bin/bash
> >>> ##PBS -l walltime=48:00:00
> >>> #PBS -N gromacs
> >>> #PBS -q workq
> >>> #PBS -l select=2:ncpus=16:mpiprocs=16
> >>> #PBS -S /bin/csh
> >>> #PBS -V
> >>> # Go to the directory from which you submitted the job
> >>>
> >>> cd $PBS_O_WORKDIR
> >>>
> >>> module purge
> >>> module load gcc4.8.5
> >>> module load gsl-2.0
> >>> module load intel-parallel-studio-2016
> >>> module load gromacs-2016.4_Plumbedpatch
> >>> module load openmpi-3.0
> >>> module load plumed-2.4
> >>>
> >>> export MPI_DEBUG=all
> >>> export MPI_IB_RAILS=2
> >>> export MPI_DSM_DISTRIBUTE=1
> >>> export MPI_VERBOSE=1
> >>> export MPI_BUFS_THRESHOLD=1
> >>> export MPI_BUFS_PER_PROC=1024
> >>> export OMP_NUM_THREADS=1
> >>>
> >>> mpirun -np 32 /app/gromacs-2016.4_plumbSWpatch/bin/gmx_mpi mdrun -v -s
> >>> topol -nsteps 5000000 &> 6ter.log
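> >>>
> >>> (As a hedged sanity check before a 32-rank launch like the one above,
> >>> the build itself can be inspected; if the binary was built without MPI
> >>> support, every rank starts its own independent mdrun and writes its own
> >>> set of files:
> >>>
> >>>     /app/gromacs-2016.4_plumbSWpatch/bin/gmx_mpi -version | grep -i "MPI library"
> >>>
> >>> The reported MPI library should be a real MPI, not thread_mpi or none.)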
> >>>
> >>>
> >>> ########## DIRECTORY LISTING (ls -lrt) ##########
> >>>
> >>> Feb 1 10:52 #ener.edr.51#
> >>> Feb 1 10:52 #ener.edr.50#
> >>> Feb 1 10:52 #ener.edr.49#
> >>> Feb 1 10:52 #ener.edr.48#
> >>> Feb 1 10:52 #ener.edr.47#
> >>> Feb 1 10:52 #ener.edr.46#
> >>> Feb 1 10:52 #ener.edr.45#
> >>> Feb 1 10:52 #ener.edr.44#
> >>> Feb 1 10:52 #ener.edr.43#
> >>> Feb 1 10:52 #ener.edr.42#
> >>> Feb 1 10:52 #ener.edr.41#
> >>> Feb 1 10:52 #ener.edr.40#
> >>> Feb 1 10:52 #ener.edr.39#
> >>> Feb 1 10:52 #ener.edr.38#
> >>> Feb 1 10:52 #ener.edr.37#
> >>> Feb 1 10:52 #ener.edr.36#
> >>> Feb 1 10:52 #ener.edr.35#
> >>> Feb 1 10:52 #ener.edr.34#
> >>> Feb 1 10:52 #ener.edr.33#
> >>> Feb 1 10:52 #ener.edr.32#
> >>> Feb 1 10:52 ener.edr
> >>> Feb 1 10:52 #traj_comp.xtc.57#
> >>> Feb 1 10:52 #traj_comp.xtc.56#
> >>> Feb 1 10:52 #traj_comp.xtc.55#
> >>> Feb 1 10:52 #traj_comp.xtc.54#
> >>> Feb 1 10:52 #traj_comp.xtc.53#
> >>> Feb 1 10:52 #traj_comp.xtc.52#
> >>> Feb 1 10:52 #traj_comp.xtc.51#
> >>> Feb 1 10:52 #traj_comp.xtc.50#
> >>> Feb 1 10:52 #traj_comp.xtc.49#
> >>> Feb 1 10:52 #traj_comp.xtc.48#
> >>> Feb 1 10:52 #traj_comp.xtc.47#
> >>> Feb 1 10:52 #traj_comp.xtc.46#
> >>> Feb 1 10:52 #traj_comp.xtc.45#
> >>> Feb 1 10:52 #traj_comp.xtc.44#
> >>> Feb 1 10:52 #traj_comp.xtc.43#
> >>> Feb 1 10:52 #traj_comp.xtc.42#
> >>> Feb 1 10:52 #traj_comp.xtc.41#
> >>> Feb 1 10:52 #traj_comp.xtc.40#
> >>> Feb 1 10:52 #traj_comp.xtc.39#
> >>> Feb 1 10:52 traj_comp.xtc
> >>> Feb 1 10:52 #md.log.70#
> >>> Feb 1 10:52 #md.log.69#
> >>> Feb 1 10:52 #md.log.68#
> >>> Feb 1 10:52 #md.log.67#
> >>> Feb 1 10:52 #md.log.66#
> >>> Feb 1 10:52 #md.log.65#
> >>> Feb 1 10:52 #md.log.64#
> >>> Feb 1 10:52 #md.log.63#
> >>> Feb 1 10:52 #md.log.62#
> >>> Feb 1 10:52 #md.log.61#
> >>> Feb 1 10:52 #md.log.60#
> >>> Feb 1 10:52 #md.log.59#
> >>> Feb 1 10:52 #md.log.58#
> >>> Feb 1 10:52 #md.log.57#
> >>> Feb 1 10:52 #md.log.56#
> >>> Feb 1 10:52 #md.log.55#
> >>> Feb 1 10:52 #md.log.54#
> >>> Feb 1 10:52 md.log
> >>> Feb 1 10:52 6ter.log
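> >>>
> >>> (Each #name.N# file above is a GROMACS backup made when a new run found
> >>> an existing file of that name, so the numbered copies give a rough count
> >>> of how many independent runs have written into this directory, e.g.:
> >>>
> >>>     ls | grep -c '^#md\.log\.'
> >>>
> >>> With a correctly built MPI version there is one md.log per run, not
> >>> dozens.)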
> >>>
> >>> ################ FULL LOG (6ter.log) ################
> >>>
> >>> The complete 6ter.log is available at:
> >>> https://drive.google.com/file/d/1p68O_azo1lV7ucS6zoXZtFNLWehv17p6/view?usp=drive_web
>
>
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-request at gromacs.org.