[gmx-users] error on opening gmx_mpi

p buscemi pbuscemi at q.com
Tue Dec 18 22:12:07 CET 2018

I installed the 2019 beta gmx_mpi. The install completed with no errors.
I need to take this step by step, starting with minimization. For minimization I used
mpirun -np 8 mdrun_mpi -deffnm RUNname.em
with the output:
:-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-:
etc etc....
GROMACS: mdrun_mpi, VERSION 5.1.2
Executable: /usr/bin/mdrun_mpi.openmpi
Data prefix: /usr
Command line:
mdrun_mpi -deffnm PVP20k1.em

Back Off! I just backed up PVP20k1.em.log to ./#PVP20k1.em.log.2#
Running on 1 node with total 64 cores, 64 logical cores
Hardware detected on host rgb2 (the node of MPI rank 0):
CPU info:
Vendor: AuthenticAMD
Brand: AMD Ryzen Threadripper 2990WX 32-Core Processor
SIMD instructions most likely to fit this hardware: AVX_128_FMA
SIMD instructions selected at GROMACS compile time: SSE2

Compiled SIMD instructions: SSE2, GROMACS could use AVX_128_FMA on this machine, which is better
Reading file PVP20k1.em.tpr, VERSION 2018.4 (single precision)
Program mdrun_mpi, VERSION 5.1.2
Source code file: /build/gromacs-z6bPBg/gromacs-5.1.2/src/gromacs/fileio/tpxio.c, line: 3345

Fatal error:
reading tpx file (PVP20k1.em.tpr) version 112 with version 103 program
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors

Halting parallel program mdrun_mpi on rank 0 out of 8
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
I see the fatal error, but minim.mdp was used with gmx_mpi - this is not covered in the common errors page.
I also see the note about AVX_128_FMA, but that can wait. Is it the tpx file version (103 in the program vs. 112 in the .tpr) that is at fault?

I need to create the proper tpr to continue.
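From the log, the shell seems to be resolving the distribution's old mdrun_mpi (5.1.2, in /usr/bin) rather than the freshly installed 2019 beta, while the .tpr was written by grompp 2018.4. A minimal sketch of how one might check and fix this, assuming the new build was installed under /usr/local/gromacs (adjust to your actual prefix) and the file names from the log:

```shell
# Which binaries does the shell actually resolve? (diagnosis)
command -v mdrun_mpi   # likely /usr/bin/mdrun_mpi -> old 5.1.2
command -v gmx_mpi     # should point at the 2019-beta install

# Put the new install first on PATH; GMXRC is shipped with every
# GROMACS installation (prefix below is an assumption):
source /usr/local/gromacs/bin/GMXRC

# Regenerate the .tpr and run mdrun from the SAME installation, so
# the writer and reader agree on the tpx format version:
gmx_mpi grompp -f minim.mdp -c PVP20k1.gro -p topol.top -o PVP20k1.em.tpr
mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k1.em
```

The key point is PATH precedence: whichever mdrun comes first on PATH wins, so sourcing GMXRC from the new install (or giving the full path to its gmx_mpi) avoids mixing a new-format .tpr with an old binary.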
