[gmx-users] problem: gromacs run on gpu

Mark Abraham mark.j.abraham at gmail.com
Thu Jul 13 09:22:17 CEST 2017


Hi,

Probably you have some strange character after "on" if you edited the file
on Windows or pasted the line from elsewhere.
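One way to confirm this is a sketch using standard GNU/Linux tools (the filename jobscript.sh is an assumption; a pasted non-breaking space, UTF-8 bytes C2 A0, is one common culprit):

```shell
# Reveal hidden characters: cat -A prints tabs as ^I and line ends as $,
# so DOS (CRLF) line endings show up as ^M$ at the end of each line.
cat -A jobscript.sh

# List any lines containing non-ASCII bytes (e.g. a non-breaking space):
grep -nP '[^\x00-\x7F]' jobscript.sh

# Strip DOS line endings and replace non-breaking spaces with plain spaces,
# editing the file in place (GNU sed):
sed -i -e 's/\r$//' -e 's/\xc2\xa0/ /g' jobscript.sh
```

After cleaning, `grep -nP '[^\x00-\x7F]' jobscript.sh` should print nothing and the `-pin on` option should parse normally.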

Mark

On Thu, 13 Jul 2017 07:22 Alex <nedomacho at gmail.com> wrote:

> Can you try to open the script in vi, delete the mdrun line and then
> manually retype it?
>
>
> On 7/12/2017 11:03 PM, leila karami wrote:
> > Dear Gromacs users,
> >
> > I am running an MD simulation with GROMACS 5.1.3 on a GPU in a Rocks
> > cluster system, using the command:
> >
> > gmx_mpi mdrun -nb gpu -v -deffnm gpu -ntomp 16 -gpu_id 0 -pin on
> >
> > Everything works fine.
> >
> > When I use this command in a script to run the MD simulation through the
> > queuing system:
> >
> >
> -----------------------------------------------------------------------------------------------------
> > #!/bin/bash
> > #$ -S /bin/bash
> > #$ -q gpu.q
> > #$ -cwd
> > #$ -N cell_1
> > #$ -e error_1.dat
> > #$ -o output_1.dat
> > echo "Job started at $(date)"
> > gmx_mpi mdrun -nb gpu -v -deffnm gpu -ntomp 16 -gpu_id 0 -pin on
> > echo "Job ended at $(date)"
> >
> -----------------------------------------------------------------------------------------------------
> >
> > I encountered the following error:
> >
> > Program:     gmx mdrun, VERSION 5.1.3
> > Source file: src/gromacs/commandline/cmdlineparser.cpp (line 234)
> > Function:    void gmx::CommandLineParser::parse(int*, char**)
> >
> > Error in user input:
> > Invalid command-line options
> >    In command-line option -pin
> >      Invalid value: on
> >
> > For more information and tips for troubleshooting, please check the
> GROMACS
> > website at http://www.gromacs.org/Documentation/Errors
> > -------------------------------------------------------
> > Halting program gmx mdrun
> >
> --------------------------------------------------------------------------
> > MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> > with errorcode 1.
> >
> > NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> > You may or may not see output from other processes, depending on
> > exactly when Open MPI kills them.
> >
> >
> -------------------------------------------------------------------------------------------------------
> >
> > How can I resolve this error?
> > Any help will be highly appreciated.
>
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-request at gromacs.org.
>

