[gmx-users] error on opening gmx_mpi

p buscemi pbuscemi at q.com
Wed Dec 19 17:44:24 CET 2018


Shi,

Reinstalling the MPI version using GROMACS 2018.4 did not help. Any ideas?
hms@rgb2 ~/Desktop/PVP20k $ mpirun -np 8 mdrun_mpi -deffnm PVP20k1.em
:-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-:

GROMACS: mdrun_mpi, VERSION 5.1.2
Executable: /usr/bin/mdrun_mpi.openmpi
Data prefix: /usr
Command line:
mdrun_mpi -deffnm PVP20k1.em

Back Off! I just backed up PVP20k1.em.log to ./#PVP20k1.em.log.5#
Running on 1 node with total 64 cores, 64 logical cores
Hardware detected on host rgb2 (the node of MPI rank 0):
CPU info:
Vendor: AuthenticAMD
Brand: AMD Ryzen Threadripper 2990WX 32-Core Processor
SIMD instructions most likely to fit this hardware: AVX_128_FMA
SIMD instructions selected at GROMACS compile time: SSE2

Compiled SIMD instructions: SSE2, GROMACS could use AVX_128_FMA on this machine, which is better
Reading file PVP20k1.em.tpr, VERSION 2018.4 (single precision)
-------------------------------------------------------
Program mdrun_mpi, VERSION 5.1.2
Source code file: /build/gromacs-z6bPBg/gromacs-5.1.2/src/gromacs/fileio/tpxio.c, line: 3345

Fatal error:
reading tpx file (PVP20k1.em.tpr) version 112 with version 103 program
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
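
The log above shows the packaged mdrun_mpi 5.1.2 from /usr/bin being picked up rather than the 2019-beta build under /usr/local/gromacs. A minimal sketch of running the new build instead, assuming the /usr/local/gromacs prefix reported by gmx_mpi below and that mdrun is invoked as a subcommand of that gmx_mpi:

source /usr/local/gromacs/bin/GMXRC              # prepend the new install's bin/ to PATH
mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k1.em    # run the mdrun from the new install; it can read the 2018.4 .tpr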

Sent from Mailspring (https://getmailspring.com/), the best free email app for work
On Dec 19 2018, at 10:04 am, p buscemi <pbuscemi at q.com> wrote:
> Here is the output from the gmx_mpi command. I would think the correct version of mdrun would be installed. Maybe I could point to it in my PATH?
> -------------------------------------------------------------------------------------
>
> hms@rgb2 ~/Desktop/PVP20k $ gmx_mpi
> :-) GROMACS - gmx_mpi, 2019-beta1 (-:
> GROMACS is written by:
> etc., etc.
> GROMACS: gmx_mpi, version 2019-beta1
> Executable: /usr/local/gromacs/bin/gmx_mpi
> Data prefix: /usr/local/gromacs
> Working dir: /home/hms/Desktop/PVP20k
> Command line:
> gmx_mpi
>
> SYNOPSIS
> gmx [-[no]h] [-[no]quiet] [-[no]version] [-[no]copyright] [-nice <int>]
> [-[no]backup]
>
>
>
> On Dec 18 2018, at 9:51 pm, paul buscemi <pbuscemi at q.com> wrote:
> > Shi, thanks for the note.
> >
> > Yes, somehow there is a version of GROMACS 5 that is being summoned. I've got to clean up my act a bit.
> > A suggestion was made to try the MPI version because of the CPU I am using. GROMACS 2018.3 was installed, but I removed its build and built the 2019-beta MPI version in a separate directory. Apparently there are some remnants being called. But version 5 has never been installed on this particular computer, so I have no idea where GROMACS 5.1.2 is coming from. It may be easier to purge everything and start again.
> > Paul
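
A quick way to see which binaries the shell and mpirun are actually resolving, as a hedged diagnostic sketch (assuming a bash shell and, for the second command, a Debian/Ubuntu-style package manager):

type -a gmx_mpi mdrun_mpi              # list every match on PATH in the order the shell finds them
dpkg -S /usr/bin/mdrun_mpi.openmpi     # report which package, if any, owns the 5.1.2 binary from the log

If mdrun_mpi resolves to /usr/bin/mdrun_mpi.openmpi, the 5.1.2 build is most likely coming from a system package rather than any source install.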
> > > On Dec 18, 2018, at 8:48 PM, Shi Li <sli259 at g.uky.edu> wrote:
> > > >
> > > > Message: 3
> > > > Date: Tue, 18 Dec 2018 15:12:00 -0600
> > > > From: p buscemi <pbuscemi at q.com>
> > > > To: "gmx-users at gromacs.org" <gmx-users at gromacs.org>
> > > > Subject: [gmx-users] error on opening gmx_mpi
> > > > Message-ID:
> > > > <1545164001.local-b6243977-9380-v1.5.3-420ce003 at getmailspring.com>
> > > > Content-Type: text/plain; charset="utf-8"
> > > >
> > > > I installed the 2019-beta gmx_mpi with:
> > > > cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=on -DCMAKE_CXX_COMPILER=/usr/bin/g++-7 -DCMAKE_C_COMPILER=/usr/bin/gcc-7 -DGMX_MPI=ON -DGMX_USE_OPENCL=ON
> > > >
> > > > The install completed with no errors.
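
For reference, the usual steps after that cmake configure, as a minimal sketch (assuming an out-of-source build directory, the default /usr/local/gromacs install prefix, and an arbitrary -j count):

make -j 8                              # build
make check                             # optional: run the downloaded regression tests
sudo make install                      # installs under /usr/local/gromacs by default
source /usr/local/gromacs/bin/GMXRC    # put the freshly installed gmx_mpi first on PATH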
> > > > I need to take this step by step, starting with minimization. For the minimization I used
> > > > mpirun -np 8 mdrun_mpi -deffnm RUNname.em
> > > > with the output:
> > > > :-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-:
> > > > etc etc....
> > > > GROMACS: mdrun_mpi, VERSION 5.1.2
> > > > Executable: /usr/bin/mdrun_mpi.openmpi
> > > > Data prefix: /usr
> > >
> > >
> > > It looks like you didn't run the newly installed GROMACS. What is the output when you input gmx_mpi? It should be version 2018 instead of 5.1.2.
> > > Have you put GROMACS in your PATH or sourced the GMXRC?
> > >
> > > Shi
> > >
> > > > Command line:
> > > > mdrun_mpi -deffnm PVP20k1.em
> > > >
> > > > Back Off! I just backed up PVP20k1.em.log to ./#PVP20k1.em.log.2#
> > > > Running on 1 node with total 64 cores, 64 logical cores
> > > > Hardware detected on host rgb2 (the node of MPI rank 0):
> > > > CPU info:
> > > > Vendor: AuthenticAMD
> > > > Brand: AMD Ryzen Threadripper 2990WX 32-Core Processor
> > > > SIMD instructions most likely to fit this hardware: AVX_128_FMA
> > > > SIMD instructions selected at GROMACS compile time: SSE2
> > > >
> > > > Compiled SIMD instructions: SSE2, GROMACS could use AVX_128_FMA on this machine, which is better
> > > > Reading file PVP20k1.em.tpr, VERSION 2018.4 (single precision)
> > > > -------------------------------------------------------
> > > > Program mdrun_mpi, VERSION 5.1.2
> > > > Source code file: /build/gromacs-z6bPBg/gromacs-5.1.2/src/gromacs/fileio/tpxio.c, line: 3345
> > > >
> > > > Fatal error:
> > > > reading tpx file (PVP20k1.em.tpr) version 112 with version 103 program
> > > > For more information and tips for troubleshooting, please check the GROMACS
> > > > website at http://www.gromacs.org/Documentation/Errors
> > > > -------------------------------------------------------
> > > >
> > > > Halting parallel program mdrun_mpi on rank 0 out of 8
> > > > --------------------------------------------------------------------------
> > > > MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> > > > with errorcode 1.
> > > >
> > > > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > > > You may or may not see output from other processes, depending on
> > > > exactly when Open MPI kills them.
> > > > ====
> > > > I see the fatal error, but minim.mdp was used while in gmx_mpi; this is not covered in the common errors page.
> > > > And I see the note on AVX_128_FMA, but that can wait. Is it the version of the MPI files (103) that is at fault?
> > > >
> > > > I need to create the proper tpr to continue
> > > >
> >
> >
> >
>
>


