[gmx-users] error on opening gmx_mpi
Justin Lemkul
jalemkul at vt.edu
Tue Dec 18 22:15:30 CET 2018
On Tue, Dec 18, 2018 at 4:12 PM p buscemi <pbuscemi at q.com> wrote:
> I installed the 2019 beta of gmx_mpi with:
> cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=on
> -DCMAKE_CXX_COMPILER=/usr/bin/g++-7 -DCMAKE_C_COMPILER=/usr/bin/gcc-7
> -DGMX_MPI=ON -DGMX_USE_OPENCL=ON
>
> The install completed with no errors.
> I need to take this step by step, starting with minimization. For minimization I used:
> mpirun -np 8 mdrun_mpi -deffnm RUNname.em
> with the output:
> :-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-:
> etc etc....
> GROMACS: mdrun_mpi, VERSION 5.1.2
> Executable: /usr/bin/mdrun_mpi.openmpi
> Data prefix: /usr
> Command line:
> mdrun_mpi -deffnm PVP20k1.em
>
> Back Off! I just backed up PVP20k1.em.log to ./#PVP20k1.em.log.2#
> Running on 1 node with total 64 cores, 64 logical cores
> Hardware detected on host rgb2 (the node of MPI rank 0):
> CPU info:
> Vendor: AuthenticAMD
> Brand: AMD Ryzen Threadripper 2990WX 32-Core Processor
> SIMD instructions most likely to fit this hardware: AVX_128_FMA
> SIMD instructions selected at GROMACS compile time: SSE2
>
> Compiled SIMD instructions: SSE2, GROMACS could use AVX_128_FMA on this
> machine, which is better
> Reading file PVP20k1.em.tpr, VERSION 2018.4 (single precision)
> -------------------------------------------------------
> Program mdrun_mpi, VERSION 5.1.2
> Source code file:
> /build/gromacs-z6bPBg/gromacs-5.1.2/src/gromacs/fileio/tpxio.c, line: 3345
>
> Fatal error:
> reading tpx file (PVP20k1.em.tpr) version 112 with version 103 program
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
> -------------------------------------------------------
>
> Halting parallel program mdrun_mpi on rank 0 out of 8
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> ====
> I see the fatal error, but minim.mdp was used while in gmx_mpi - this is
> not covered in the common errors page.
> I also see the note on AVX_128_FMA, but that can wait. Is it the tpx file
> version (103) that is at fault?
>
>
There's nothing wrong with the .tpr file; you're just not using the mdrun
binary you want. You installed the 2019 beta, but you're running the 5.1.2
mdrun_mpi packaged in /usr/bin. You should be calling the same GROMACS
version for everything.
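
For example (a minimal sketch, assuming the 2019 beta went to the default
prefix /usr/local/gromacs; adjust if you set -DCMAKE_INSTALL_PREFIX):

source /usr/local/gromacs/bin/GMXRC
which gmx_mpi            # should now resolve inside /usr/local/gromacs/bin
gmx_mpi --version        # should report the 2019 beta, not 5.1.2
mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k1.em

Note that from 5.0 onward the MPI build installs a single gmx_mpi binary,
so mdrun is invoked as "gmx_mpi mdrun" rather than as a standalone
mdrun_mpi executable.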
-Justin
--
==========================================
Justin A. Lemkul, Ph.D.
Assistant Professor
Office: 301 Fralin Hall
Lab: 303 Engel Hall
Virginia Tech Department of Biochemistry
340 West Campus Dr.
Blacksburg, VA 24061
jalemkul at vt.edu | (540) 231-3129
http://www.thelemkullab.com
==========================================