[gmx-users] Reg No error in output

Justin Lemkul jalemkul at vt.edu
Fri Feb 7 22:44:46 CET 2014



On 2/7/14, 9:44 AM, vidhya sankar wrote:
> Dear Justin, thank you for your previous reply.
>
> When I run the GROMACS job on the cluster, the job terminates without any error. The md.log file does not show any error either, so I cannot find the cause of the abnormal termination. I have used the script below.
> Should I remove the characters &>/dev/null at the end of the mdrun command line?

It would certainly be faster to try, and get an answer in a few moments, than 
wait several hours for a response...
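
For the record, &>/dev/null sends both stdout and stderr to /dev/null, so any fatal error printed by mdrun or by MPI is thrown away. If you want a quiet run but still want the errors, redirect to a file instead; a minimal sketch (the log file name is just an example):

# keep stdout and stderr instead of discarding them
$mpirun -np 8 $MDRUN -s A1-40a.tpr -nt 1 -plumed plumed.dat -v -deffnm A1-40a > mdrun_job.log 2>&1
tail mdrun_job.log    # inspect after the job ends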

> I do not know why this happens. How can I avoid it? Is there a memory problem?
>

Your job has terminated prematurely and you have zero indication of what went 
wrong.  I'm not going to waste time (yours or mine) guessing.
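
At minimum, check what the batch system itself recorded. A sketch, assuming the Torque/PBS setup in your script (the job ID 12345 is illustrative):

# Torque writes the job's stdout and stderr to <jobname>.o<jobid> and
# <jobname>.e<jobid> in the submission directory when the job ends:
ls $PBS_O_WORKDIR/boojob.o* $PBS_O_WORKDIR/boojob.e*
tail $PBS_O_WORKDIR/boojob.e12345
tracejob 12345    # the scheduler's own record of the job, if tracejob is installed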

-Justin

> I am sure there is no error in the input, because it has already run successfully.
>
>
>
> #!/bin/bash
> #PBS -N boojob
> #PBS -l nodes=compute-0-2:ppn=8
> #PBS -l walltime=900:10:5
> #PBS -l pmem=4000MB
> #PBS -m ae
> #PBS -M boopathi@gmail.com
> # All #PBS directives must come before the first executable line,
> # or the scheduler silently ignores them.
> date
> cd $PBS_O_WORKDIR
> TMPDIR=/scratch/boo_vs
> mkdir -p $TMPDIR
> hostname
> echo "files copied from" $PBS_O_WORKDIR
> echo "to computing directory" $TMPDIR
> cd $TMPDIR
>
> cp $PBS_O_WORKDIR/A1-40a.tpr $TMPDIR/
>
> mpirun=/opt/openmpi/bin/mpirun
>
> # export, so that processes started by mpirun can find the GROMACS libraries
> export LD_LIBRARY_PATH=/share/apps/gromacsplu/lib
>
> MDRUN="/share/apps/gromacsplu/bin/mdrun_mpi_d"
> # &>/dev/null sends stdout and stderr to /dev/null, discarding any error message
> $mpirun -np 8 $MDRUN -s A1-40a.tpr -nt 1 -plumed plumed.dat -v -deffnm A1-40a &>/dev/null
> cp --force $TMPDIR/* $PBS_O_WORKDIR/out2/
> rm -rf $TMPDIR
> date
>
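
One more observation on the script: if the scheduler kills the job (walltime or memory limit), execution never reaches the cp line, so nothing is copied back from /scratch. A trap, set before the mdrun line, is one way to make the copy-back run even on termination; a sketch, assuming Torque delivers SIGTERM before killing the job:

copy_back() { cp --force $TMPDIR/* $PBS_O_WORKDIR/out2/; }
trap copy_back TERM EXIT    # runs on normal exit and on SIGTERM from the scheduler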

-- 
==================================================

Justin A. Lemkul, Ph.D.
Postdoctoral Fellow

Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 601
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201

jalemkul at outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul

==================================================

