[gmx-users] help to solve this problem
Justin Lemkul
jalemkul at vt.edu
Thu Nov 20 02:06:06 CET 2014
On 11/19/14 2:15 PM, valiente wrote:
> Dear Justin:
> When I run the simulation on my computer it works fine; the problem appears when I
> try to run it on the cluster. I have attached the log files generated on the cluster
> (PlmII+Inhibitor_free.log) and locally (PlmII+Inhibitor-local.log).
The list does not accept attachments.
If your job runs locally but not on the cluster, there is likely a problem with the
way Gromacs was installed on the cluster or with the way you are executing mdrun.
That is a question for your sysadmin.
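
As a starting point, here is a minimal sketch of checks to run on the cluster (the
module, queue, and file names are taken from your submission script and may need
adjusting for your site; the -nsteps override is assumed to be available in the 4.6
series of mdrun):

# load the same modules used in the submission script
module add openmpi_intel64/1.4.3_ofed
module add gromacs64/4.6.5_ompi

# confirm which binary is actually on the PATH and how it was built
which mdrun_mpi
mdrun_mpi -version

# short single-process test with the same input files, to separate an
# installation problem from a parallel-execution problem (-nsteps assumed)
mpiexec -n 1 mdrun_mpi -deffnm PlmII+Inhibitor_free -nsteps 1000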
-Justin
> With kind regards,
> Pedro
> On 19/11/14 13:15, Justin Lemkul wrote:
>>
>>
>> On 11/19/14 12:37 PM, valiente wrote:
>>> Dear gromacs users:
>>> I'm trying to run an MD simulation using Gromacs 4.6.5 on a cluster, but when I
>>> submit my job using the following script, it crashes with a segmentation fault:
>>>
>>> #!/bin/bash
>>> #Name of your job
>>> #PBS -N PlmII+Inhibitor_free
>>> #number of nodes you are using
>>> #PBS -l nodes=8
>>> #time
>>> #PBS -l walltime=48:00:00
>>> #which queue you are submitting to
>>> #PBS -q qwork@mp2
>>>
>>>
>>> # go to our working directory
>>> cd $PBS_O_WORKDIR
>>>
>>> # add module
>>> module rm mvapich2_intel64/1.6_ofed
>>> module add openmpi_intel64/1.4.3_ofed
>>> module rm gromacs64/4.5.4_ompi
>>> module add gromacs64/4.6.5_ompi
>>>
>>> # choose mpi-tasks per node *** there are 24 cores/node
>>> export ppn=24
>>>
>>> # set your executable file
>>> myExe=`which mdrun_mpi`
>>>
>>> # start the application
>>> export RUN_NAME=PlmII+Inhibitor_free
>>> export P4_GLOBMEMSIZE=200000000
>>> #export OPTIONS=" -cpi PlmII+Inhibitor_free.cpt -dlb yes"
>>> #export OPTIONS=" -cpi prot_md.cpt -dlb yes"
>>>
>>> #actual command line
>>> mpiexec -n $((PBS_NUM_NODES*ppn)) -npernode $ppn $myExe -deffnm $RUN_NAME $OPTIONS
>>>
>>> echo "Job finished at: `date`"
>>>
>>> #########################################################
>>>
>>> 5000000 steps, 50000.0 ps.
>>> [cp1854:24892] *** Process received signal ***
>>> [cp1854:24892] Signal: Segmentation fault (11)
>>> [cp1854:24892] Signal code: (128)
>>> [cp1854:24892] Failing at address: (nil)
>>> [cp1854:24892] [ 0] /lib64/libpthread.so.0() [0x3e6480f710]
>>> [cp1854:24892] [ 1] /opt/gromacs64/4.6.5/bin/../lib/libmd_mpi.so.8(nbnxn_kernel_simd_4xn_tab_comb_lb_energrp+0x4183) [0x2b4214b3b7c3]
>>> [cp1854:24892] [ 2] /opt/gromacs64/4.6.5/bin/../lib/libmd_mpi.so.8(nbnxn_kernel_simd_4xn+0x4b1) [0x2b4214bb9951]
>>> [cp1854:24892] [ 3] /opt/intel/composerxe-2011.5.220/compiler/lib/intel64/libiomp5.so(__kmp_invoke_microtask+0x93) [0x2b4217dfbb83]
>>> [cp1854:24892] *** End of error message ***
>>> --------------------------------------------------------------------------
>>> mpiexec noticed that process rank 0 with PID 24892 on node cp1854 exited on signal 11 (Segmentation fault).
>>>
>>
>> Your simulation is crashing. Check the .log file for any hints, but otherwise
>> run locally or interactively to diagnose.
>>
>> -Justin
>>
>
>
>
>
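
Following the quoted suggestion to run interactively and to check the .log file, a
rough sketch of that workflow (this assumes the site allows interactive jobs on the
qwork@mp2 queue; module and file names are again taken from the submission script):

# request an interactive one-node session
qsub -I -q qwork@mp2 -l nodes=1 -l walltime=01:00:00

# inside the session: recreate the environment and launch by hand on one node
cd $PBS_O_WORKDIR
module add openmpi_intel64/1.4.3_ofed gromacs64/4.6.5_ompi
mpiexec -n 24 -npernode 24 `which mdrun_mpi` -deffnm PlmII+Inhibitor_free

# afterwards, inspect the end of the log for warnings printed before the crash
tail -n 60 PlmII+Inhibitor_free.log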
--
==================================================
Justin A. Lemkul, Ph.D.
Ruth L. Kirschstein NRSA Postdoctoral Fellow
Department of Pharmaceutical Sciences
School of Pharmacy
Health Sciences Facility II, Room 629
University of Maryland, Baltimore
20 Penn St.
Baltimore, MD 21201
jalemkul at outerbanks.umaryland.edu | (410) 706-7441
http://mackerell.umaryland.edu/~jalemkul
==================================================