[gmx-users] error on opening gmx_mpi

p buscemi pbuscemi at q.com
Wed Dec 19 19:42:00 CET 2018


I've finally got my ducks in a row. The command
mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k.nvt
auto-maps the processes as:
On host xxx 2 GPUs auto-selected for this run.
Mapping of GPU IDs to the 8 GPU tasks in the 8 ranks on this node:
PP:0,PP:0,PP:0,PP:0,PP:1,PP:1,PP:1,PP:1

This is on the 32-core AMD Threadripper 2990. DLB is taking some time, and I will be tuning the system today, but it works. Results for 80k atoms will be reported.
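For the record, the knobs I'll be playing with. This is only a sketch: the rank and thread counts are illustrative, and the -gputasks string simply spells out the mapping that was auto-selected above.

# 8 PP ranks, 4 OpenMP threads each; nonbondeds split 4+4 across the two GPUs
mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k.nvt -nb gpu -ntomp 4 -pin on -gputasks 00001111

# variant with two dedicated PME ranks (PME then runs on the CPU)
mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k.nvt -nb gpu -npme 2 -ntomp 4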
Thank you all.
Paul

On Dec 19 2018, at 11:36 am, p buscemi <pbuscemi at q.com> wrote:
> Getting closer...
> (thinking a bit about the initial command structure does help...)
>
> Now using the command:
> gmx_mpi mdrun -deffnm PVP20k.nvt -nb gpu -ntomp 16 -npme 4
> :-) GROMACS - gmx mdrun, 2018.4 (-:
>
> gets past the v5 issue but a new nastygram is sent...
> "The -dd or -npme option request a parallel simulation, but gmx mdrun was not
> started through mpirun/mpiexec or only one rank was requested through
> mpirun/mpiexec"
>
> I attempted to include mpirun, but I haven't hit the proper format.
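> If I read the nastygram right, the shape it wants is roughly this (a sketch
> only; the rank and thread counts are illustrative):
>
> mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k.nvt -npme 2 -ntomp 4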
> Paul
>
>
>
>
> On Dec 19 2018, at 11:08 am, pbuscemi at q.com wrote:
> > Thank you - both - very much again.
> >
> > The "mpir_run -npx gmx -mdrun....." command was lifted from a Feb 2018
> > response from Szilard , to a multi gpu, user which he used as an example.
> >
> > I'll crank on your pointers right now.
> > Paul
> > -----Original Message-----
> > From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se
> > <gromacs.org_gmx-users-bounces at maillist.sys.kth.se> On Behalf Of Justin
> > Lemkul
> > Sent: Wednesday, December 19, 2018 10:47 AM
> > To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> > Subject: Re: [gmx-users] error on opening gmx_mpi
> >
> > On Wed, Dec 19, 2018 at 11:44 AM p buscemi <pbuscemi at q.com> wrote:
> > > Shi,
> > > Reinstalling the MPI version using GROMACS 2018.4 did not help... any
> > > ideas?
> > > hms@rgb2 ~/Desktop/PVP20k $ mpirun -np 8 mdrun_mpi -deffnm PVP20k1.em
> > > :-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-:
> > >
> > >
> > You're just calling the same (incorrect) command again. You installed
> > "gmx_mpi" version 2018.4 but then your command uses "mdrun_mpi" instead.
> > Apparently you have version 5.1.2 (perhaps from a package manager, based
> > on the fact that it's installed in /usr/bin instead of
> > /usr/local/gromacs/bin) that is being found in your PATH.
> >
> > If you've got multiple versions installed, either source GMXRC properly or
> > use full PATH information in your commands. Above all, use the right
> > binary name to start :)
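> > For example (a sketch; /usr/local/gromacs is the default install prefix,
> > adjust to wherever your 2018.4 build actually landed):
> >
> > # option 1: put the 2018.4 binaries first in PATH for this shell
> > source /usr/local/gromacs/bin/GMXRC
> > which gmx_mpi   # should now report /usr/local/gromacs/bin/gmx_mpi
> >
> > # option 2: skip PATH entirely and call the binary by its full path
> > mpirun -np 8 /usr/local/gromacs/bin/gmx_mpi mdrun -deffnm PVP20k1.em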
> >
> > -Justin
> > --
> > ==========================================
> > Justin A. Lemkul, Ph.D.
> > Assistant Professor
> > Office: 301 Fralin Hall
> > Lab: 303 Engel Hall
> >
> > Virginia Tech Department of Biochemistry
> > 340 West Campus Dr.
> > Blacksburg, VA 24061
> >
> > jalemkul at vt.edu | (540) 231-3129
> > http://www.thelemkullab.com
> >
> >
> > ==========================================