[gmx-users] error on opening gmx_mpi
Shi Li
sli259 at g.uky.edu
Wed Dec 19 03:48:07 CET 2018
>
> Message: 3
> Date: Tue, 18 Dec 2018 15:12:00 -0600
> From: p buscemi <pbuscemi at q.com>
> To: "gmx-users at gromacs.org" <gmx-users at gromacs.org>
> Subject: [gmx-users] error on opening gmx_mpi
> Message-ID:
> <1545164001.local-b6243977-9380-v1.5.3-420ce003 at getmailspring.com>
> Content-Type: text/plain; charset="utf-8"
>
> I installed the 2019 beta gmx_mpi with:
> cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=on -DCMAKE_CXX_COMPILER=/usr/bin/g++-7 -DCMAKE_C_COMPILER=/usr/bin/gcc-7 -DGMX_MPI=ON -DGMX_USE_OPENCL=ON
>
> The install completed with no errors.
> I need to take this step by step, starting with running the minimization. For minimization I used
> mpirun -np 8 mdrun_mpi -deffnm RUNname.em
> with the output:
> :-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-:
> etc etc....
> GROMACS: mdrun_mpi, VERSION 5.1.2
> Executable: /usr/bin/mdrun_mpi.openmpi
> Data prefix: /usr
It looks like you didn't run the newly installed GROMACS. What is the output when you run gmx_mpi? It should report version 2019 (the beta you installed) instead of 5.1.2.
Have you put the new GROMACS in your PATH, or sourced its GMXRC?
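For example, something like the following (the install prefix /usr/local/gromacs is an assumption, the default when no CMAKE_INSTALL_PREFIX is given; adjust it to wherever you actually installed):

```shell
# Load the environment of the newly installed GROMACS.
# The path below is an assumption; use your own install prefix if it differs.
source /usr/local/gromacs/bin/GMXRC

# Verify the shell now resolves to the new binary and that it reports 2019
which gmx_mpi
gmx_mpi --version
```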
Shi
> Command line:
> mdrun_mpi -deffnm PVP20k1.em
>
> Back Off! I just backed up PVP20k1.em.log to ./#PVP20k1.em.log.2#
> Running on 1 node with total 64 cores, 64 logical cores
> Hardware detected on host rgb2 (the node of MPI rank 0):
> CPU info:
> Vendor: AuthenticAMD
> Brand: AMD Ryzen Threadripper 2990WX 32-Core Processor
> SIMD instructions most likely to fit this hardware: AVX_128_FMA
> SIMD instructions selected at GROMACS compile time: SSE2
>
> Compiled SIMD instructions: SSE2, GROMACS could use AVX_128_FMA on this machine, which is better
> Reading file PVP20k1.em.tpr, VERSION 2018.4 (single precision)
> -------------------------------------------------------
> Program mdrun_mpi, VERSION 5.1.2
> Source code file: /build/gromacs-z6bPBg/gromacs-5.1.2/src/gromacs/fileio/tpxio.c, line: 3345
>
> Fatal error:
> reading tpx file (PVP20k1.em.tpr) version 112 with version 103 program
> For more information and tips for troubleshooting, please check the GROMACS
> website at http://www.gromacs.org/Documentation/Errors
> -------------------------------------------------------
>
> Halting parallel program mdrun_mpi on rank 0 out of 8
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> ====
> I see the fatal error, but minim.mdp was used while in gmx_mpi - this is not covered in the common errors page.
> And I see the note on AVX_128_FMA, but that can wait. Is it the version of the tpr files (112 vs. 103) that is at fault?
>
> I need to create the proper tpr to continue
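(A note on this point: the tpr itself should be fine; it was written by grompp 2018.4, and a newer mdrun can read older tpr files. The mismatch is that the system-wide mdrun_mpi 5.1.2 is being run instead of the new build. A sketch of the fix, assuming GMXRC has been sourced so gmx_mpi resolves to the new 2019 build:)

```shell
# Run the existing tpr with the newly built gmx_mpi instead of the old
# system mdrun_mpi 5.1.2; -deffnm matches the name from the log above
mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k1.em
```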
>
>
> ------------------------------
>
More information about the gromacs.org_gmx-users mailing list