[gmx-users] Re: gromacs QM/MM compilation with gaussian (Txema Mercero)

Txema Mercero jm.mercero at ehu.es
Wed Feb 16 14:39:30 CET 2011


I get the same error, which I attach this time:



-------------------------
Back Off! I just backed up md.log to ./#md.log.8#
Reading file topol.tpr, VERSION 4.5.3 (single precision)
QM/MM calculation requested.
there we go!
Layer 0
nr of QM atoms 24
QMlevel: RHF/6-31G

number of CPUs for gaussian = 1
memory for gaussian = 50000000
accuracy in l510 = 8
NOT using cp-mcscf in l1003
Level of SA at start = 0
/software/g03Gromacs/g03gaussian initialised...

Back Off! I just backed up traj.trr to ./#traj.trr.3#

Back Off! I just backed up ener.edr to ./#ener.edr.3#

Steepest Descents:
   Tolerance (Fmax)   =  1.00000e+02
   Number of steps    =         1000
*** glibc detected ***
/software/gromacs-4.5.3-shared-ifort11_gauss/bin/mdrun: malloc():
memory corruption: 0x00000000077711b0 ***
======= Backtrace: =========
/lib64/libc.so.6[0x3ddcc724ac]
/lib64/libc.so.6(__libc_calloc+0xc0)[0x3ddcc73ce0]
/software/gromacs-4.5.3-shared-ifort11_gauss/lib/libgmx.so.6(save_calloc+0x32)[0x2b0e5ba08462]
/software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6(call_gaussian+0x81)[0x2b0e5b38cfd1]
/software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6(call_QMroutine+0x25)[0x2b0e5b384265]
/software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6(calculate_QMMM+0x665)[0x2b0e5b383bb5]
/software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6(do_force_lowlevel+0xd9)[0x2b0e5b2fcbd9]
/software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6(do_force+0xdaf)[0x2b0e5b35ae6f]
/software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6(do_steep+0x7d6)[0x2b0e5b314a26]
/software/gromacs-4.5.3-shared-ifort11_gauss/bin/mdrun[0x4149ba]
/software/gromacs-4.5.3-shared-ifort11_gauss/bin/mdrun[0x41dc03]
/lib64/libc.so.6(__libc_start_main+0xf4)[0x3ddcc1d974]
/software/gromacs-4.5.3-shared-ifort11_gauss/bin/mdrun(do_nm+0x4f1)[0x407069]
======= Memory map: ========
00400000-0046d000 r-xp 00000000 00:18 20054244
  /software/gromacs-4.5.3-shared-ifort11_gauss/bin/mdrun
0066d000-00672000 rw-p 0006d000 00:18 20054244
  /software/gromacs-4.5.3-shared-ifort11_gauss/bin/mdrun
00672000-00673000 rw-p 00672000 00:00 0
0770a000-077b6000 rw-p 0770a000 00:00 0                                  [heap]
3ddc800000-3ddc81c000 r-xp 00000000 08:02 5751044
  /lib64/ld-2.5.so
3ddca1b000-3ddca1c000 r--p 0001b000 08:02 5751044
  /lib64/ld-2.5.so
3ddca1c000-3ddca1d000 rw-p 0001c000 08:02 5751044
  /lib64/ld-2.5.so
3ddcc00000-3ddcd4c000 r-xp 00000000 08:02 5750905
  /lib64/libc-2.5.so
3ddcd4c000-3ddcf4c000 ---p 0014c000 08:02 5750905
  /lib64/libc-2.5.so
3ddcf4c000-3ddcf50000 r--p 0014c000 08:02 5750905
  /lib64/libc-2.5.so
3ddcf50000-3ddcf51000 rw-p 00150000 08:02 5750905
  /lib64/libc-2.5.so
3ddcf51000-3ddcf56000 rw-p 3ddcf51000 00:00 0
3ddd000000-3ddd082000 r-xp 00000000 08:02 5750931
  /lib64/libm-2.5.so
3ddd082000-3ddd281000 ---p 00082000 08:02 5750931
  /lib64/libm-2.5.so
3ddd281000-3ddd282000 r--p 00081000 08:02 5750931
  /lib64/libm-2.5.so
3ddd282000-3ddd283000 rw-p 00082000 08:02 5750931
  /lib64/libm-2.5.so
3ddd400000-3ddd402000 r-xp 00000000 08:02 5750939
  /lib64/libdl-2.5.so
3ddd402000-3ddd602000 ---p 00002000 08:02 5750939
  /lib64/libdl-2.5.so
3ddd602000-3ddd603000 r--p 00002000 08:02 5750939
  /lib64/libdl-2.5.so
3ddd603000-3ddd604000 rw-p 00003000 08:02 5750939
  /lib64/libdl-2.5.so
3ddd800000-3ddd816000 r-xp 00000000 08:02 5751046
  /lib64/libpthread-2.5.so
3ddd816000-3ddda15000 ---p 00016000 08:02 5751046
  /lib64/libpthread-2.5.so
3ddda15000-3ddda16000 r--p 00015000 08:02 5751046
  /lib64/libpthread-2.5.so
3ddda16000-3ddda17000 rw-p 00016000 08:02 5751046
  /lib64/libpthread-2.5.so
3ddda17000-3ddda1b000 rw-p 3ddda17000 00:00 0
3dddc00000-3dddc14000 r-xp 00000000 08:02 10306767
  /usr/lib64/libz.so.1.2.3
3dddc14000-3ddde13000 ---p 00014000 08:02 10306767
  /usr/lib64/libz.so.1.2.3
3ddde13000-3ddde14000 rw-p 00013000 08:02 10306767
  /usr/lib64/libz.so.1.2.3
3ddfc00000-3ddfc15000 r-xp 00000000 08:02 5751016
  /lib64/libnsl-2.5.so
3ddfc15000-3ddfe14000 ---p 00015000 08:02 5751016
  /lib64/libnsl-2.5.so
3ddfe14000-3ddfe15000 r--p 00014000 08:02 5751016
  /lib64/libnsl-2.5.so
3ddfe15000-3ddfe16000 rw-p 00015000 08:02 5751016
  /lib64/libnsl-2.5.so
3ddfe16000-3ddfe18000 rw-p 3ddfe16000 00:00 0
3de4400000-3de4533000 r-xp 00000000 08:02 10310609
  /usr/lib64/libxml2.so.2.6.26
3de4533000-3de4733000 ---p 00133000 08:02 10310609
  /usr/lib64/libxml2.so.2.6.26
3de4733000-3de473c000 rw-p 00133000 08:02 10310609
  /usr/lib64/libxml2.so.2.6.26
3de473c000-3de473d000 rw-p 3de473c000 00:00 0
3dea200000-3dea20d000 r-xp 00000000 08:02 5751072
  /lib64/libgcc_s-4.1.2-20080825.so.1
3dea20d000-3dea40d000 ---p 0000d000 08:02 5751072
  /lib64/libgcc_s-4.1.2-20080825.so.1
3dea40d000-3dea40e000 rw-p 0000d000 08:02 5751072
  /lib64/libgcc_s-4.1.2-20080825.so.1
2b0e5affe000-2b0e5b000000 rw-p 2b0e5affe000 00:00 0
2b0e5b000000-2b0e5b07e000 r-xp 00000000 00:18 20054237
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libgmxpreprocess.so.6.0.0
2b0e5b07e000-2b0e5b27d000 ---p 0007e000 00:18 20054237
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libgmxpreprocess.so.6.0.0
2b0e5b27d000-2b0e5b280000 rw-p 0007d000 00:18 20054237
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libgmxpreprocess.so.6.0.0
2b0e5b280000-2b0e5b2ae000 rw-p 2b0e5b280000 00:00 0
2b0e5b2ae000-2b0e5b3b9000 r-xp 00000000 00:18 20054231
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6.0.0
2b0e5b3b9000-2b0e5b5b9000 ---p 0010b000 00:18 20054231
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6.0.0
2b0e5b5b9000-2b0e5b5bc000 rw-p 0010b000 00:18 20054231
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libmd.so.6.0.0
2b0e5b5bc000-2b0e5b5bd000 rw-p 2b0e5b5bc000 00:00 0
2b0e5b5bd000-2b0e5b74c000 r-xp 00000000 00:18 49184804
  /software/fftw-3.2.2_intel_11.0.069/lib/libfftw3f.so.3.2.4
2b0e5b74c000-2b0e5b94b000 ---p 0018f000 00:18 49184804
  /software/fftw-3.2.2_intel_11.0.069/lib/libfftw3f.so.3.2.4
2b0e5b94b000-2b0e5b958000 rw-p 0018e000 00:18 49184804
  /software/fftw-3.2.2_intel_11.0.069/lib/libfftw3f.so.3.2.4
2b0e5b958000-2b0e5b959000 rw-p 2b0e5b958000 00:00 0
2b0e5b96d000-2b0e5b96e000 rw-p 2b0e5b96d000 00:00 0
2b0e5b96e000-2b0e5bbbb000 r-xp 00000000 00:18 20054224
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libgmx.so.6.0.0
2b0e5bbbb000-2b0e5bdbb000 ---p 0024d000 00:18 20054224
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libgmx.so.6.0.0
2b0e5bdbb000-2b0e5bdcd000 rw-p 0024d000 00:18 20054224
  /software/gromacs-4.5.3-shared-ifort11_gauss/lib/libgmx.so.6.0.0
2b0e5bdcd000-2b0e5bdce000 rw-p 2b0e5bdcd000 00:00 0
2b0e5bdce000-2b0e5c0b7000 r-xp 00000000 00:17 6031605
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_lp64.so
2b0e5c0b7000-2b0e5c1b6000 ---p 002e9000 00:17 6031605
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_lp64.so
2b0e5c1b6000-2b0e5c1c1000 rw-p 002e8000 00:17 6031605
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_intel_lp64.so
2b0e5c1c1000-2b0e5c1c9000 rw-p 2b0e5c1c1000 00:00 0
2b0e5c1c9000-2b0e5c8c6000 r-xp 00000000 00:17 6031615
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_sequential.so
2b0e5c8c6000-2b0e5c9c6000 ---p 006fd000 00:17 6031615
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_sequential.so
2b0e5c9c6000-2b0e5c9d2000 rw-p 006fd000 00:17 6031615
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_sequential.so
2b0e5c9d2000-2b0e5c9dd000 rw-p 2b0e5c9d2000 00:00 0
2b0e5c9dd000-2b0e5cc7a000 r-xp 00000000 00:17 6031599
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_core.so
2b0e5cc7a000-2b0e5cd7a000 ---p 0029d000 00:17 6031599
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_core.so
2b0e5cd7a000-2b0e5cd7f000 rw-p 0029d000 00:17 6031599
  /opt/intel/Compiler/11.1/072/mkl/lib/em64t/libmkl_core.so
2b0e5cd7f000-2b0e5cd92000 rw-p 2b0e5cd7f000 00:00 0
2b0e5cd92000-2b0e5cfe2000 r-xp 00000000 00:17 6783119
  /opt/intel/Compiler/11.1/073/lib/intel64/libimf.so
2b0e5cfe2000-2b0e5d0e1000 ---p 00250000 00:17 6783119
  /opt/intel/Compiler/11.1/073/lib/intel64/libimf.so
2b0e5d0e1000-2b0e5d126000 rw-p 0024f000 00:17 6783119
  /opt/intel/Compiler/11.1/073/lib/intel64/libimf.so
2b0e5d126000-2b0e5d23c000 r-xp 00000000 00:17 6783139
  /opt/intel/Compiler/11.1/073/lib/intel64/libsvml.so
2b0e5d23c000-2b0e5d33b000 ---p 00116000 00:17 6783139
  /opt/intel/Compiler/11.1/073/lib/intel64/libsvml.so
2b0e5d33b000-2b0e5d33c000 rw-p 00115000 00:17 6783139
  /opt/intel/Compiler/11.1/073/lib/intel64/libsvml.so
2b0e5d33c000-2b0e5d377000 r-xp 00000000 00:17 6783121
  /opt/intel/Compiler/11.1/073/lib/intel64/libintlc.so.5
2b0e5d377000-2b0e5d476000 ---p 0003b000 00:17 6783121
  /opt/intel/Compiler/11.1/073/lib/intel64/libintlc.so.5
2b0e5d476000-2b0e5d479000 rw-p 0003a000 00:17 6783121
  /opt/intel/Compiler/11.1/073/lib/intel64/libintlc.so.5
2b0e5d479000-2b0e5d8b9000 rw-p 2b0e5d479000 00:00 0
2b0e60000000-2b0e60021000 rw-p 2b0e60000000 00:00 0
2b0e60021000-2b0e64000000 ---p 2b0e60021000 00:00 0
7fff4fa96000-7fff4faaa000 rwxp 7ffffffe9000 00:00 0                      [stack]
7fff4faaa000-7fff4faac000 rw-p 7fffffffd000 00:00 0
ffffffffff600000-ffffffffffe00000 ---p 00000000 00:00 0                  [vdso]
Aborted
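
For reference, the environment is set along these lines before mdrun is
called (the first two values appear to follow from the "initialised" line in
the log above; DEVEL_DIR is only a placeholder here):

   export GAUSS_DIR=/software/g03Gromacs     # directory holding the modified Gaussian
   export GAUSS_EXE=g03gaussian              # executable mdrun calls for the QM step
   export DEVEL_DIR=<path>                   # placeholder; value per the roadmap setup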


On Wed, Feb 16, 2011 at 2:08 PM, Gerrit Groenhof <ggroenh at gwdg.de> wrote:
>
> Are you trying to run with more than one thread?
> If so, try mdrun -nt 1
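>
> For example, keeping the rest of the command line unchanged (a minimal
> sketch; in 4.5, mdrun otherwise starts one thread per core):
>
>   mdrun -nt 1 -s topol.tpr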
>
> Gerrit
>
>>
>>   1. Re: gromacs QM/MM compilation with gaussian (Txema Mercero)
>>   2. Re: Periodic Boundary Conditions g_mindist -pi (ifat shub)
>>   3. Re: Re: Periodic Boundary Conditions g_mindist -pi (Mark Abraham)
>>
>>
>> ----------------------------------------------------------------------
>>
>> Message: 1
>> Date: Wed, 16 Feb 2011 12:23:49 +0100
>> From: Txema Mercero <jm.mercero at ehu.es>
>> Subject: [gmx-users] Re: gromacs QM/MM compilation with gaussian
>> To: gmx-users at gromacs.org
>> Cc: Edu Ogando <edu.ogando at ehu.es>, Jon Iñaki Mujika
>>       <joni.mujika at ehu.es>
>> Message-ID:
>>       <AANLkTi=O7w8CBEz90u-VzK+q+46EhH+RrTrtHgVUQ8Tz at mail.gmail.com>
>> Content-Type: text/plain; charset=ISO-8859-1
>>
>> Hi there!
>>
>> We are trying to compile gromacs with Gaussian 03 rev D.02 (we also
>> have g09). We followed the instructions in
>> http://wwwuser.gwdg.de/~ggroenh/roadmap.pdf even though they do not
>> fit the g03 rev D.02 version exactly; for instance, FrcNCN is not
>> in l710 but in utilam.F.
>>
>> Despite that, we compiled gromacs and apparently everything was
>> fine, but we get a segmentation fault when we run it. We have the
>> following questions:
>>
>> 1. Is it possible to get more detailed or specific instructions?
>> 2. I think the three variables GAUSS_EXE, GAUSS_DIR and DEVEL_DIR
>> should be defined. Where exactly should GAUSS_EXE and GAUSS_DIR point?
>>
>> Thanks for your attention; any help would be appreciated.
>>
>> Regards,
>>
>> Txema Mercero
>> IZO/SGI
>> UPV/EHU
>>
>>
>> ------------------------------
>>
>> Message: 2
>> Date: Wed, 16 Feb 2011 13:30:29 +0200
>> From: ifat shub <shubifat at gmail.com>
>> Subject: [gmx-users] Re: Periodic Boundary Conditions g_mindist -pi
>> To: gmx-users at gromacs.org
>> Message-ID:
>>       <AANLkTi=71Moj0kH4=TR3_gDh54b1WsnEi0HUNDx++MQq at mail.gmail.com>
>> Content-Type: text/plain; charset="iso-8859-1"
>>
>> Hi Tsjerk,
>> Thank you for your reply.
>> I am aware of the trjconv option, but I wanted to know whether there is a
>> way to avoid this kind of jump over the periodic boundaries during the
>> mdrun itself, rather than in post-processing.
>> Thanks,
>> Ifat
>>
>> Message: 4
>>> Date: Wed, 16 Feb 2011 11:19:14 +0200
>>> From: ifat shub <shubifat at gmail.com>
>>> Subject: [gmx-users] Periodic Boundary Conditions g_mindist -pi
>>> To: gmx-users at gromacs.org
>>> Message-ID:
>>>       <AANLkTi=sgjfTmrf-0nVmZZOgfs+xhyxv5u6GGFNHR6hp at mail.gmail.com>
>>> Content-Type: text/plain; charset="iso-8859-1"
>>>
>>> Hi,
>>>
>>>
>>>
>>> I am running a simulation of the complex 1aik.pdb at 310 K. I wanted to see
>>> whether the complex sees its next periodic image, so I used the g_mindist
>>> command with the -pi option. My command line was:
>>>
>>> g_mindist -f run.xtc -s run.gro -n index.ndx -od tmp.xvg -pi
>>>
>>> The output (see below) was stable until ~344 ps, when there is a jump in the
>>> max internal distance (third column) from ~6 nm to ~22 nm. After the jump the
>>> numbers drop back to ~6 nm and remain stable until the run completes at 1 ns.
>>>
>>> Does anyone know how to explain this jump? Is this a real problem or just a
>>> visualization artifact? Is there a way to avoid such jumps?
>>>
>>>
>>>
>>> Here is the mdp file I used:
>>>
>>> ------run.mdp------
>>> integrator      = md
>>> nsteps          = 1000000
>>> dt              = 0.001
>>> coulombtype     = pme
>>> vdw-type        = cut-off
>>> tcoupl          = Berendsen
>>> tc-grps         = protein non-protein
>>> tau-t           = 0.1 0.1
>>> ref-t           = 310 310
>>> nstxout         = 100
>>> nstvout         = 0
>>> nstxtcout       = 100
>>> nstenergy       = 100
>>> comm_mode       = Linear ; Angular
>>> comm_grps       = Protein
>>> xtc_grps        = Protein
>>> energygrps      = Protein
>>> ------------------
>>>
>>>
>>>
>>> Thanks,
>>>
>>> Ifat
>>>
>>>
>>>
>>> The output:
>>>
>>> 343.7   10.813   5.924  16.445 16.445 16.445
>>> 343.8   10.809   5.949  16.445 16.445 16.445
>>> 343.9   10.804   5.959  16.445 16.445 16.445
>>> 344     10.808   5.974  16.445 16.445 16.445
>>> 344.1    0.18   21.982  16.445 16.445 16.445
>>> 344.2   10.778   5.977  16.445 16.445 16.445
>>> 344.3   10.768   5.996  16.445 16.445 16.445
>>> 344.4   10.764   6.016  16.445 16.445 16.445
>>> 344.5   10.722   6.029  16.445 16.445 16.445
>>> 344.6   10.774   6.01   16.445 16.445 16.445
>>> 344.7    0.174  21.984  16.445 16.445 16.445
>>> 344.8    0.176  21.98   16.445 16.445 16.445
>>> 344.9    0.17   22.002  16.445 16.445 16.445
>>> 345      0.173  21.981  16.445 16.445 16.445
>>> 345.1    0.191  21.954  16.445 16.445 16.445
>>> 345.2    0.183  21.958  16.445 16.445 16.445
>>> 345.3    0.181  22.012  16.445 16.445 16.445
>>> 345.4    0.17   22.054  16.445 16.445 16.445
>>> 345.5    0.168  22.054  16.445 16.445 16.445
>>> 345.6    0.189  22.039  16.445 16.445 16.445
>>> 345.7    0.171  22.007  16.445 16.445 16.445
>>> 345.8    0.186  22.031  16.445 16.445 16.445
>>> 345.9    0.171  22.077  16.445 16.445 16.445
>>> 346      0.187  21.99   16.445 16.445 16.445
>>> 346.1    0.173  21.984  16.445 16.445 16.445
>>> 346.2    0.181  22.02   16.445 16.445 16.445
>>> 346.3   10.82    5.984  16.445 16.445 16.445
>>> 346.4   10.81    6.002  16.445 16.445 16.445
>>> 346.5   10.819   6.008  16.445 16.445 16.445
>>> 346.6   10.813   5.996  16.445 16.445 16.445
>>> 346.7   10.781   6.006  16.445 16.445 16.445
>>> 346.8   10.793   6.026  16.445 16.445 16.445
>>> 346.9   10.745   5.985  16.445 16.445 16.445
>>> 347     10.762   5.999  16.445 16.445 16.445
>>> 347.1   10.781   5.984  16.445 16.445 16.445
>>> 347.2   10.784   6.002  16.445 16.445 16.445
>>>
>>> ------------------------------
>>>
>>> Message: 5
>>> Date: Wed, 16 Feb 2011 10:43:56 +0100
>>> From: Tsjerk Wassenaar <tsjerkw at gmail.com>
>>> Subject: Re: [gmx-users] Periodic Boundary Conditions g_mindist -pi
>>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>> Message-ID:
>>>       <AANLkTinr05YyKV7rqzCSwEch=aBXFur+adKeNMxEMLEa at mail.gmail.com>
>>> Content-Type: text/plain; charset=ISO-8859-1
>>>
>>> Hi Ifat,
>>>
>>> I guess this is a jump over the periodic boundaries. You should remove
>>> jumps from the trajectory (-pbc nojump) before running g_mindist -pi.
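>>>
>>> For example (an untested sketch; the intermediate file name is arbitrary):
>>>
>>>   trjconv -f run.xtc -s run.gro -pbc nojump -o nojump.xtc
>>>   g_mindist -f nojump.xtc -s run.gro -n index.ndx -od tmp.xvg -pi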
>>>
>>> Cheers,
>>>
>>> Tsjerk
>>>
>>> On Wed, Feb 16, 2011 at 10:19 AM, ifat shub <shubifat at gmail.com> wrote:
>>>> [snip -- message quoted in full above]
>>>
>>> --
>>> Tsjerk A. Wassenaar, Ph.D.
>>>
>>> post-doctoral researcher
>>> Molecular Dynamics Group
>>> * Groningen Institute for Biomolecular Research and Biotechnology
>>> * Zernike Institute for Advanced Materials
>>> University of Groningen
>>> The Netherlands
>>>
>>
>> ------------------------------
>>
>> Message: 3
>> Date: Wed, 16 Feb 2011 23:00:32 +1100
>> From: Mark Abraham <Mark.Abraham at anu.edu.au>
>> Subject: Re: [gmx-users] Re: Periodic Boundary Conditions g_mindist
>>       -pi
>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>> Message-ID: <4D5BBC60.5080600 at anu.edu.au>
>> Content-Type: text/plain; charset="iso-8859-1"
>>
>> On 16/02/2011 10:30 PM, ifat shub wrote:
>>>
>>> Hi Tsjerk,
>>> Thank you for your reply.
>>> I am aware of the trjconv option, but I wanted to know whether there is a
>>> way to avoid this kind of jump over the periodic boundaries during the
>>> mdrun itself, rather than in post-processing.
>>
>> No. mdrun does not know in advance what your visualization requirements
>> are, and frankly there are better things to do with expensive compute
>> cluster time. Post-processing a small number of frames elsewhere is a much
>> better use of resources.
>>
>> Mark
>>
>>> [snip -- earlier messages quoted in full above]
>
> --
> gmx-users mailing list    gmx-users at gromacs.org
> http://lists.gromacs.org/mailman/listinfo/gmx-users
> Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
> Please don't post (un)subscribe requests to the list. Use the
> www interface or send it to gmx-users-request at gromacs.org.
> Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
>


