[gmx-users] simulation killed
mehmet kıytak
mehmet63900 at hotmail.com
Mon Jan 30 14:15:29 CET 2012
Hi Mark,

The warning is this:
WARNING 1 [file ../MDP/EM/em_steep_0.mdp]:
You are using full electrostatics treatment PME for a system without charges.
This costs a lot of performance for just processing zeros, consider using Cut-off instead.
Largest charge group radii for Van der Waals: 0.164, 0.113 nm
NOTE 2 [file ../MDP/EM/em_steep_0.mdp]:
The sum of the two largest charge group radii (0.276360) is larger than rlist (1.000000) - rvdw (0.900000)
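(For reference, the arithmetic behind that note: 0.164 + 0.113 ≈ 0.28 nm, while rlist - rvdw = 1.000 - 0.900 = 0.10 nm, so the buffer between the neighbour-list cutoff and the van der Waals cutoff is smaller than the sum of the two largest charge-group radii.)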
I think these are not important warnings... are they?

Thanks...
Date: Mon, 30 Jan 2012 22:25:48 +1100
From: Mark.Abraham at anu.edu.au
To: gmx-users at gromacs.org
Subject: Re: [gmx-users] simulation killed
On 30/01/2012 9:56 PM, murat özçelik wrote:
hi! Lina, my script is this... please tell me where it is wrong... thanks...
Probably your use of -maxwarn is erroneous, unless you can write
down why it is valid.
Mark
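One way to act on this advice, as a hedged sketch only (these exact commands are not in the thread), is to drop -maxwarn from the grompp calls so that every suppressed warning is printed and has to be justified before it is overridden; the variables are the same as in the script below.

# Hypothetical check: same inputs as the script, but without -maxwarn,
# so grompp stops and prints every warning instead of hiding it.
grompp -f $MDP/EM/em_steep_$LAMBDA.mdp \
       -c $FREE_ENERGY/HIS1/solv.gro \
       -p $FREE_ENERGY/HIS1/topol.top \
       -o min$LAMBDA.tpr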
#!/bin/bash
# Set some environment variables
FREE_ENERGY=/home/mkiytak/Free_Energy1
echo "Free energy home directory set to $FREE_ENERGY"
MDP=$FREE_ENERGY/MDP
echo ".mdp files are stored in $MDP"
LAMBDA=0
# A new directory will be created for each value of lambda and
# at each step in the workflow for maximum organization.
mkdir Lambda_$LAMBDA
cd Lambda_$LAMBDA
#################################
# ENERGY MINIMIZATION 1: STEEP #
#################################
echo "Starting minimization for lambda = $LAMBDA..."
mkdir EM_1
cd EM_1
# Iterative calls to grompp and mdrun to run the simulations
grompp -f $MDP/EM/em_steep_$LAMBDA.mdp -c $FREE_ENERGY/HIS1/solv.gro \
       -p $FREE_ENERGY/HIS1/topol.top -o min$LAMBDA.tpr -maxwarn 3
mdrun -nt 8 -deffnm min$LAMBDA
#################################
# ENERGY MINIMIZATION 2: L-BFGS #
#################################
cd ../
mkdir EM_2
cd EM_2
grompp -f $MDP/EM/em_l-bfgs_$LAMBDA.mdp -c ../EM_1/min$LAMBDA.gro \
       -p $FREE_ENERGY/HIS1/topol.top -o min$LAMBDA.tpr -maxwarn 3
# Run L-BFGS in serial (cannot be run in parallel)
mdrun -nt 1 -deffnm min$LAMBDA
echo "Minimization complete."
#####################
# NVT EQUILIBRATION #
#####################
echo "Starting constant volume equilibration..."
cd ../
mkdir NVT
cd NVT
grompp -f $MDP/NVT/nvt_$LAMBDA.mdp -c ../EM_2/min$LAMBDA.gro \
       -p $FREE_ENERGY/HIS1/topol.top -o nvt$LAMBDA.tpr -maxwarn 3
mdrun -nt 8 -deffnm nvt$LAMBDA
echo "Constant volume equilibration complete."
#####################
# NPT EQUILIBRATION #
#####################
echo "Starting constant pressure equilibration..."
cd ../
mkdir NPT
cd NPT
grompp -f $MDP/NPT/npt_$LAMBDA.mdp -c ../NVT/nvt$LAMBDA.gro \
       -p $FREE_ENERGY/HIS1/topol.top -t ../NVT/nvt$LAMBDA.cpt \
       -o npt$LAMBDA.tpr -maxwarn 3
mdrun -nt 8 -deffnm npt$LAMBDA
echo "Constant pressure equilibration complete."
#################
# PRODUCTION MD #
#################
echo "Starting production MD simulation..."
cd ../
mkdir Production_MD
cd Production_MD
grompp -f $MDP/Production_MD/md_$LAMBDA.mdp -c ../NPT/npt$LAMBDA.gro \
       -p $FREE_ENERGY/HIS1/topol.top -t ../NPT/npt$LAMBDA.cpt \
       -o md$LAMBDA.tpr -maxwarn 3
mdrun -nt 8 -deffnm md$LAMBDA
echo "Production MD complete."
# End
echo "Ending. Job completed for lambda = $LAMBDA"
> Date: Mon, 30 Jan 2012 13:54:18 +0800
> Subject: Re: [gmx-users] simulation killed
> From: lina.lastname at gmail.com
> To: gmx-users at gromacs.org
>
> On Mon, Jan 30, 2012 at 1:18 AM, murat özçelik <mehmet63900 at hotmail.com> wrote:
> > hi again... the capacity of my hard disk is 600 GB... I tried again and
> > the program gave me the message below...
> >
> > Reading file md0.tpr, VERSION 4.5.4 (single precision)
> > Starting 8 threads
> >
> > NOTE: The load imbalance in PME FFT and solve is 1211%.
> >       For optimal PME load balancing
> >       PME grid_x (1152) and grid_y (1152) should be divisible by #PME_nodes_x (8)
> >       and PME grid_y (1152) and grid_z (1152) should be divisible by #PME_nodes_y (1)
> >
> > Making 1D domain decomposition 8 x 1 x 1
> > starting mdrun 'Protein in water'
>
> > 2500000 steps, 5000.0 ps.
> > ./job_0.sh: line 95: 15777 Killed                 mdrun -nt 8 -deffnm
>
> What's inside your job_0.sh?
>
> Something is wrong in your script.
>
> > md$LAMBDA
> >
> > Production MD complete.
> > Ending. Job completed for lambda = 0
> > mkiytak at babil:~/JOB1$
> >
> >
> > How can I solve this problem... thanks for your help...
> >
> >> Date: Sun, 29 Jan 2012 10:43:53 -0600
> >> From: pcl at uab.edu
> >> To: gmx-users at gromacs.org
> >> Subject: Re: [gmx-users] simulation killed
> >
> >>
> >> Something killed your job, but it wasn't GROMACS.
> >> Your system has run time or memory requirements that your job exceeded.
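A hedged way to check this explanation on a Linux machine (none of these commands are from the thread; the grid size is taken from the NOTE quoted above):

# Hypothetical diagnostics, assuming a Linux host:
dmesg | grep -i -E 'out of memory|killed process'   # did the kernel OOM-killer stop the mdrun PID?
ulimit -a                                           # any shell/job limits on memory or CPU time?

# Rough memory for one single-precision 1152^3 PME grid (from the NOTE above);
# mdrun needs more than one work array of this size:
echo $(( 1152 * 1152 * 1152 * 4 )) bytes            # ~6.1e9 bytes, about 6 GB per grid copy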
> >>
> >> On 2012-01-29 06:40:29PM +0200, mehmet kıytak wrote:
> >> > hi all,
> >> >
> >> > I have a big problem. I am doing a free energy calculation for a ligand
> >> > (L-histidine). When I perform mdrun, my simulation stops and the program
> >> > gives me this message:
> >> >
> >> > Reading file md0.tpr, VERSION 4.5.4 (single precision)
> >> > Starting 8 threads
> >> > Making 1D domain decomposition 8 x 1 x 1
> >> > starting mdrun 'Protein in water'
> >> >
> >> > 2500000 steps, 5000.0 ps.
> >> > ./job_0.sh: line 95: 15457 Killed                 mdrun -nt 8 -deffnm md$LAMBDA
> >> > Production MD complete.
> >> > Ending. Job completed for lambda = 0
> >> > mkiytak at babil:~/JOB1$
> >> >
> >> >
> >> > my mdp file..
> >> > ;
> >> > ; Run control
> >> > integrator = sd ; Langevin dynamics
> >> > tinit = 0
> >> > dt = 0.002
> >> > nsteps = 2500000 ; 5 ns
> >> > nstcomm = 100
> >> > ; Output control
> >> > nstxout = 500
> >> > nstvout = 500
> >> > nstfout = 0
> >> > nstlog = 500
> >> > nstenergy = 500
> >> > nstxtcout = 0
> >> > xtc-precision = 1000
> >> > ; Neighborsearching and short-range nonbonded interactions
> >> > nstlist = 10
> >> > ns_type = grid
> >> > pbc = xyz
> >> > rlist = 1.0
> >> > ; Electrostatics
> >> > coulombtype = PME
> >> > rcoulomb = 1.0
> >> > ; van der Waals
> >> > vdw-type = switch
> >> > rvdw-switch = 0.8
> >> > rvdw = 0.9
> >> > ; Apply long range dispersion corrections for Energy and Pressure
> >> > DispCorr = EnerPres
> >> > ; Spacing for the PME/PPPM FFT grid
> >> > fourierspacing = 0.12
> >> > ; EWALD/PME/PPPM parameters
> >> > pme_order = 6
> >> > ewald_rtol = 1e-06
> >> > epsilon_surface = 0
> >> > optimize_fft = no
> >> > ; Temperature coupling
> >> > ; tcoupl is implicitly handled by the sd integrator
> >> > tc_grps = system
> >> > tau_t = 1.0
> >> > ref_t = 300
> >> > ; Pressure coupling is on for NPT
> >> > Pcoupl = Parrinello-Rahman
> >> > tau_p = 0.5
> >> > compressibility = 4.5e-05
> >> > ref_p = 1.0
> >> > ; Free energy control stuff
> >> > free_energy = yes
> >> > init_lambda = 0.00
> >> > delta_lambda = 0
> >> > foreign_lambda = 0.05
> >> > sc-alpha = 0.5
> >> > sc-power = 1.0
> >> > sc-sigma = 0.3
> >> > couple-moltype = system
> >> > couple-lambda0 = vdw ; only van der Waals interactions
> >> > couple-lambda1 = none ; turn off everything, in this case only vdW
> >> > couple-intramol = no
> >> > nstdhdl = 10
> >> > ; Do not generate velocities
> >> > gen_vel = no
> >> > ; options for bonds
> >> > constraints = h-bonds ; we only have C-H bonds here
> >> > ; Type of constraint algorithm
> >> > constraint-algorithm = lincs
> >> > ; Constrain the starting configuration
> >> > ; since we are continuing from NPT
> >> > continuation = yes
> >> > ; Highest order in the expansion of the constraint coupling matrix
> >> > lincs-order = 12
> >> >
> >> > PLEASE HELP ME... WHY DOES THE PROGRAM STOP (KILLED)? SORRY FOR MY BAD ENGLISH..
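A hedged cross-check of the .mdp above against the mdrun output quoted earlier (the relation is approximate): grompp picks roughly box-length / fourierspacing grid points per dimension, so a 1152-point grid with fourierspacing = 0.12 nm would correspond to a box edge of very roughly:

echo $(( 1152 * 12 / 100 )) nm   # ~138 nm per edge, far larger than a typical solvated-ligand box

So either the simulation box really is enormous or something else is inflating the PME grid, which in turn inflates the memory that mdrun needs.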
> >>
> >> --
> >> ==================================================================
> >> Peter C. Lai                 | University of Alabama-Birmingham
> >> Programmer/Analyst           | KAUL 752A
> >> Genetics, Div. of Research   | 705 South 20th Street
> >> pcl at uab.edu                | Birmingham AL 35294-4461
> >> (205) 690-0808               |
> >> ==================================================================
--
gmx-users mailing list gmx-users at gromacs.org
http://lists.gromacs.org/mailman/listinfo/gmx-users
Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
Please don't post (un)subscribe requests to the list. Use the
www interface or send it to gmx-users-request at gromacs.org.
Can't post? Read http://www.gromacs.org/Support/Mailing_Lists