[gmx-users] QM/MM Calculation with Orca

Minos Matsoukas minmatsoukas at gmail.com
Fri Jun 8 15:01:58 CEST 2012


Dear GROMACS Users,

I am using GROMACS with ORCA for a QM/MM calculation, but I can only
use 1 processor for mdrun.

If I place the line !PAL4 in topol.ORCAINFO and run mdrun -nt 1,
it works: ORCA parallelizes correctly over 4 processors, while
mdrun itself runs only one thread.
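For reference, a topol.ORCAINFO set up this way would hold just the extra ORCA keywords (GROMACS generates the rest of the ORCA input from the mdp QMmethod/QMbasis settings); a minimal sketch:

```
# topol.ORCAINFO: extra keywords appended to the ORCA input that
# GROMACS generates; method/basis come from the mdp file
!PAL4
```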

If I place the line !PAL4 in topol.ORCAINFO and run mdrun -nt 4,
it does not work: mdrun crashes with a segmentation fault:

Back Off! I just backed up md.log to ./#md.log.35#
Reading file topol.tpr, VERSION 4.5.5 (single precision)
Starting 4 threads
QM/MM calculation requested.
QM/MM calculation requested.
QM/MM calculation requested.
QM/MM calculation requested.
there we go!
there we go!
there we go!
there we go!
Layer 0
nr of QM atoms 73
QMlevel: B3LYP/STO-3G

/opt/soft/orca_2_9_1_linux_x86-64/opt/soft/orca_2_9_1_linux_x86-64...
orca initialised...
Segmentation fault (core dumped)


Is this because mdrun can only run with 1 thread when used with ORCA?

Here is some information:
OS: CentOS 6.2 x86_64 kernel 2.6.32-220
GROMACS: 4.5.5 compiled with options: --with-qmmm-orca --without-qmmm-gaussian
ORCA: 2.9.1 for x86_64
OPENMPI: 1.4.3
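For completeness, the working single-threaded invocation looks roughly like this. The environment variable names below are my assumption based on the GROMACS QM/MM documentation for the ORCA interface, so please correct me if they differ:

```shell
# Point GROMACS at the ORCA installation and the input basename
# (variable names assumed from the GROMACS QM/MM ORCA interface docs)
export GMX_ORCA_PATH=/opt/soft/orca_2_9_1_linux_x86-64
export GMX_QM_ORCA_BASENAME=topol

# ORCA parallelizes over 4 cores via !PAL4 in topol.ORCAINFO;
# mdrun itself stays on a single thread
mdrun -nt 1 -s topol.tpr
```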

My minimization mdp file is:
------------
;
;	Input file
;
title               =  Yo			; a string
cpp                 =  /lib/cpp			; c-preprocessor
integrator          =  cg 			;
;integrator               = l-bfgs

rlist               =  1.0 			; cut-off for ns
rvdw                =  1.0 			; cut-off for vdw
rcoulomb            =  1.5 			; cut-off for coulomb
;       Energy minimizing stuff
;
nsteps              =  200
emtol               =  10
emstep              =  0.1
;define              = -DPOSRES
constraints         = none
QMMM                     = yes
QMMM-grps                = MOL
QMMMscheme               = normal
QMbasis                  = STO-3G
QMmethod                 = b3lyp
QMcharge                 = 1
QMmult                   = 1
-------------

Thanks in advance


