[gmx-users] orca and Segmentation fault

xi zhao zhaoxiitc2002 at yahoo.com.cn
Mon Nov 14 12:42:58 CET 2011

./configure --with-qmmm-orca --without-qmmm-gaussian --enable-mpi
make 
make install 
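
Note also that the ORCA interface reads two environment variables before mdrun starts (a sketch, assuming the standard qm_orca.c interface of this version; the install path below is illustrative):

export GMX_ORCA_PATH=/opt/orca          # directory holding the orca executable (illustrative path)
export GMX_QM_ORCA_BASENAME=pyp         # basename mdrun uses for pyp.inp / pyp.out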
I installed GROMACS in parallel (MPI) mode, not threaded. Whether I run "mpirun -np 1 mdrun_dd -v -s pyp.tpr &" or "mdrun_dd -nt 1 -v -s pyp.tpr", it still fails with:
Back Off! I just backed up md.log to ./#md.log.20#
Getting Loaded...
Reading file pyp.tpr, VERSION 4.5.1 (single precision)
Loaded with Money
QM/MM calculation requested.
there we go!
Layer 0
nr of QM atoms 22
QMlevel: B3LYP/3-21G
orca initialised...
Back Off! I just backed up traj.trr to ./#traj.trr.1#
Back Off! I just backed up traj.xtc to ./#traj.xtc.1#
Back Off! I just backed up ener.edr to ./#ener.edr.2#
starting mdrun 'PHOTOACTIVE YELLOW PROTEIN in water'
500 steps,      0.5 ps.
Calling 'orca pyp.inp >> pyp.out'
Error : multiplicity (Mult:=2*S+1) is zero
-------------------------------------------------------
Program mdrun_dd, VERSION 4.5.1
Source code file: qm_orca.c, line: 393
Fatal error:
Call to 'orca pyp.inp >> pyp.out' failed
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------
"The Carpenter Goes Bang Bang" (The Breeders)
Halting program mdrun_dd
gcq#129: "The Carpenter Goes Bang Bang" (The Breeders)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD 
with errorcode -1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 18080 on
node localhost.localdomai exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here)."
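
The ORCA message "multiplicity (Mult:=2*S+1) is zero" means a multiplicity of 0 was written into pyp.inp, i.e. QMmult was never set in the .mdp file that produced pyp.tpr. A minimal sketch of the QM/MM block such a run needs (the group name and the charge/multiplicity values here are illustrative and must match the actual QM region):

QMMM       = yes
QMMM-grps  = QMatoms   ; illustrative index-group name
QMmethod   = B3LYP
QMbasis    = 3-21G
QMcharge   = 0         ; illustrative; set to the net charge of the QM region
QMmult     = 1         ; 2*S+1; must be >= 1, or ORCA aborts as above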


--- On Mon, 14 Nov 2011, Christoph Riplinger <cri at thch.uni-bonn.de> wrote:


From: Christoph Riplinger <cri at thch.uni-bonn.de>
Subject: Re: [gmx-users] orca and Segmentation fault
To: "Discussion list for GROMACS users" <gmx-users at gromacs.org>
Date: Mon, 14 Nov 2011, 6:51 PM



Dear xi zhao,

In your case GMX is using threading, so not just a single QM job is requested but eight. Run mdrun with -nt 1; then GMX uses only a single CPU and ORCA is called only once (but can itself run in parallel).
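For example, with -nt 1 the single ORCA call can still use several cores if the ORCA input requests them (a sketch, assuming 4 cores are available on the node; equivalently, put PAL4 on the "!" keyword line):

%pal
  nprocs 4
end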
Hope that helps,
Christoph

On 11/14/2011 10:34 AM, xi zhao wrote:

Dear sir:
I installed gromacs-orca for QM/MM:
./configure --with-qmmm-orca --without-qmmm-gaussian --enable-mpi
make
make install
The MPI library is LAM/MPI.

When I run mdrun, I get this error:
" Back Off! I just backed up md.log to ./#md.log.12#
Getting Loaded...
Reading file pyp.tpr, VERSION 4.5.1 (single precision)
Loaded with Money
QM/MM calculation requested.
QM/MM calculation requested.
QM/MM calculation requested.
QM/MM calculation requested.
QM/MM calculation requested.
QM/MM calculation requested.
QM/MM calculation requested.
QM/MM calculation requested.
there we go!
there we go!
there we go!
there we go!
there we go!
there we go!
there we go!
Layer 0
nr of QM atoms 22
QMlevel: B3LYP/3-21G
orca initialised...
[localhost:17937] *** Process received signal ***
[localhost:17939] *** Process received signal ***
[localhost:17937] Signal: Segmentation fault (11)
[localhost:17937] Signal code: Address not mapped (1)
[localhost:17937] Failing at address: 0x10
[localhost:17940] *** Process received signal ***
[localhost:17939] Signal: Segmentation fault (11)
[localhost:17939] Signal code: Address not mapped (1)
[localhost:17939] Failing at address: 0x10
[localhost:17936] *** Process received signal ***
[localhost:17936] Signal: Segmentation fault (11)
[localhost:17936] Signal code: Address not mapped (1)
[localhost:17936] Failing at address: 0x10
[localhost:17934] *** Process received signal ***
[localhost:17938] *** Process received signal ***
[localhost:17934] Signal: Segmentation fault (11)
[localhost:17934] Signal code: Address not mapped (1)
[localhost:17934] Failing at address: 0x10
[localhost:17940] Signal: Segmentation fault (11)
[localhost:17940] Signal code: Address not mapped (1)
[localhost:17940] Failing at address: 0x10
[localhost:17938] Signal: Segmentation fault (11)
[localhost:17938] Signal code: Address not mapped (1)
[localhost:17938] Failing at address: 0x10
[localhost:17939] [ 0] /lib64/tls/libpthread.so.0 [0x35c5a0c430]
[localhost:17939] [ 1] mdrun_dd [0x534a93]
[localhost:17939] [ 2] mdrun_dd(init_QMMMrec+0x3e1) [0x5353f1]
[localhost:17939] [ 3] mdrun_dd(mdrunner+0x112c) [0x43200c]
[localhost:17939] [ 4] mdrun_dd(main+0xe69) [0x43d619]
[localhost:17939] [ 5] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x35c4f1c3fb]
[localhost:17939] [ 6] mdrun_dd [0x41bcfa]
[localhost:17939] *** End of error message ***
[localhost:17936] [ 0] /lib64/tls/libpthread.so.0 [0x35c5a0c430]
[localhost:17936] [ 1] mdrun_dd [0x534a93]
[localhost:17936] [ 2] mdrun_dd(init_QMMMrec+0x3e1) [0x5353f1]
[localhost:17936] [ 3] mdrun_dd(mdrunner+0x112c) [0x43200c]
[localhost:17936] [ 4] mdrun_dd(main+0xe69) [0x43d619]
[localhost:17936] [ 5] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x35c4f1c3fb]
[localhost:17936] [ 6] mdrun_dd [0x41bcfa]
[localhost:17936] *** End of error message ***
[localhost:17940] [ 0] /lib64/tls/libpthread.so.0 [0x35c5a0c430]
[localhost:17940] [ 1] mdrun_dd [0x534a93]
[localhost:17940] [ 2] mdrun_dd(init_QMMMrec+0x3e1) [0x5353f1]
[localhost:17940] [ 3] mdrun_dd(mdrunner+0x112c) [0x43200c]
[localhost:17940] [ 4] mdrun_dd(main+0xe69) [0x43d619]
[localhost:17940] [ 5] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x35c4f1c3fb]
[localhost:17940] [ 6] mdrun_dd [0x41bcfa]
[localhost:17940] *** End of error message ***
there we go!
[localhost:17935] *** Process received signal ***
[localhost:17935] Signal: Segmentation fault (11)
[localhost:17935] Signal code: Address not mapped (1)
[localhost:17935] Failing at address: 0x10
[localhost:17935] [ 0] /lib64/tls/libpthread.so.0 [0x35c5a0c430]
[localhost:17935] [ 1] mdrun_dd [0x534a93]
[localhost:17935] [ 2] mdrun_dd(init_QMMMrec+0x3e1) [0x5353f1]
[localhost:17935] [ 3] mdrun_dd(mdrunner+0x112c) [0x43200c]
[localhost:17935] [ 4] mdrun_dd(main+0xe69) [0x43d619]
[localhost:17935] [ 5] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x35c4f1c3fb]
[localhost:17935] [ 6] mdrun_dd [0x41bcfa]
[localhost:17935] *** End of error message ***
[localhost:17934] [ 0] /lib64/tls/libpthread.so.0 [0x35c5a0c430]
[localhost:17934] [ 1] mdrun_dd [0x534a93]
[localhost:17934] [ 2] mdrun_dd(init_QMMMrec+0x3e1) [0x5353f1]
[localhost:17934] [ 3] mdrun_dd(mdrunner+0x112c) [0x43200c]
[localhost:17934] [ 4] mdrun_dd(main+0xe69) [0x43d619]
[localhost:17934] [ 5] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x35c4f1c3fb]
[localhost:17934] [ 6] mdrun_dd [0x41bcfa]
[localhost:17934] *** End of error message ***
[localhost:17937] [ 0] /lib64/tls/libpthread.so.0 [0x35c5a0c430]
[localhost:17937] [ 1] mdrun_dd [0x534a93]
[localhost:17937] [ 2] mdrun_dd(init_QMMMrec+0x3e1) [0x5353f1]
[localhost:17937] [ 3] mdrun_dd(mdrunner+0x112c) [0x43200c]
[localhost:17937] [ 4] mdrun_dd(main+0xe69) [0x43d619]
[localhost:17937] [ 5] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x35c4f1c3fb]
[localhost:17937] [ 6] mdrun_dd [0x41bcfa]
[localhost:17937] *** End of error message ***
[localhost:17938] [ 0] /lib64/tls/libpthread.so.0 [0x35c5a0c430]
[localhost:17938] [ 1] mdrun_dd [0x534a93]
[localhost:17938] [ 2] mdrun_dd(init_QMMMrec+0x3e1) [0x5353f1]
[localhost:17938] [ 3] mdrun_dd(mdrunner+0x112c) [0x43200c]
[localhost:17938] [ 4] mdrun_dd(main+0xe69) [0x43d619]
[localhost:17938] [ 5] /lib64/tls/libc.so.6(__libc_start_main+0xdb) [0x35c4f1c3fb]
[localhost:17938] [ 6] mdrun_dd [0x41bcfa]
[localhost:17938] *** End of error message ***
Making 2D domain decomposition 4 x 2 x 1
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 17936 on node localhost.localdomai exited on signal 11 (Segmentation fault).



