[gmx-users] Re: gromacs QM/MM compilation with gaussian (Txema Mercero)

Gerrit Groenhof ggroenh at gwdg.de
Wed Feb 16 14:08:34 CET 2011


Are you trying to run with more than one thread?
If so, try mdrun -nt 1.
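
For example, something along these lines (the paths are placeholders only; as
far as I recall, GAUSS_DIR is the Gaussian installation directory, GAUSS_EXE
the name of the Gaussian executable, and DEVEL_DIR the directory holding the
modified Gaussian routines):

export GAUSS_DIR=/path/to/gaussian03   # Gaussian installation directory (placeholder)
export GAUSS_EXE=g03                   # Gaussian executable name (placeholder)
export DEVEL_DIR=/path/to/devel        # directory with the modified routines (placeholder)
mdrun -nt 1                            # run mdrun on a single thread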

Gerrit

> 
>   1. Re: gromacs QM/MM compilation with gaussian (Txema Mercero)
>   2. Re: Periodic Boundary Conditions g_mindist -pi (ifat shub)
>   3. Re: Re: Periodic Boundary Conditions g_mindist -pi (Mark Abraham)
> 
> 
> ----------------------------------------------------------------------
> 
> Message: 1
> Date: Wed, 16 Feb 2011 12:23:49 +0100
> From: Txema Mercero <jm.mercero at ehu.es>
> Subject: [gmx-users] Re: gromacs QM/MM compilation with gaussian
> To: gmx-users at gromacs.org
> Cc: Edu Ogando <edu.ogando at ehu.es>, Jon Iñaki Mujika
> 	<joni.mujika at ehu.es>
> Message-ID:
> 	<AANLkTi=O7w8CBEz90u-VzK+q+46EhH+RrTrtHgVUQ8Tz at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
> 
> Hi there!
> 
> We are trying to compile GROMACS with Gaussian 03 rev D.02 (we also
> have g09). We followed the instructions in
> http://wwwuser.gwdg.de/~ggroenh/roadmap.pdf even though they do not fit
> our g03 revision exactly; for instance, FrcNCN is not in l710 but in
> utilam.F.
> 
> Despite that, we compiled GROMACS and apparently everything was
> fine, but we get a segmentation fault when we run it. We have the
> following questions:
> 
> 1.- Is it possible to get more detailed or more specific instructions?
> 2.- I think three variables, GAUSS_EXE, GAUSS_DIR and DEVEL_DIR, should
> be defined. Where exactly should GAUSS_EXE and GAUSS_DIR point?
> 
> Thanks for your attention; any help will be appreciated.
> 
> Regards,
> 
> Txema Mercero
> IZO/SGI
> UPV/EHU
> 
> 
> ------------------------------
> 
> Message: 2
> Date: Wed, 16 Feb 2011 13:30:29 +0200
> From: ifat shub <shubifat at gmail.com>
> Subject: [gmx-users] Re: Periodic Boundary Conditions g_mindist -pi
> To: gmx-users at gromacs.org
> Message-ID:
> 	<AANLkTi=71Moj0kH4=TR3_gDh54b1WsnEi0HUNDx++MQq at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
> 
> Hi Tsjerk,
> Thank you for your reply.
> I am aware of the trjconv option, but I wanted to know whether there is a way to
> avoid these kinds of jumps over the periodic boundaries during the mdrun itself,
> rather than by post-processing.
> Thanks,
> Ifat
> 
>> Message: 4
>> Date: Wed, 16 Feb 2011 11:19:14 +0200
>> From: ifat shub <shubifat at gmail.com>
>> Subject: [gmx-users] Periodic Boundary Conditions g_mindist -pi
>> To: gmx-users at gromacs.org
>> Message-ID:
>>       <AANLkTi=sgjfTmrf-0nVmZZOgfs+xhyxv5u6GGFNHR6hp at mail.gmail.com>
>> Content-Type: text/plain; charset="iso-8859-1"
>> 
>> Hi,
>> 
>> 
>> 
>> I am running a simulation of the complex 1aik.pdb at 310 K. I wanted to see
>> whether the complex sees its next periodic image, so I used the g_mindist
>> command with the -pi option. My command line was:
>> 
>> g_mindist -f run.xtc -s run.gro -n index.ndx -od tmp.xvg -pi
>> 
>> The output (see below) was stable until ~344 ps, when there is a jump in the
>> max internal distance (third column) from ~6 nm to ~22 nm. After the jump the
>> numbers drop back to ~6 nm and remain stable until the run completes
>> at 1 ns.
>> 
>> Does anyone know how to explain this jump? Is this a real problem or just a
>> visualization artifact? Is there a way to avoid such jumps?
>> 
>> 
>> 
>> Here is the mdp file I used:
>> 
>> ------run.mdp------
>> integrator   = md
>> nsteps       = 1000000
>> dt           = 0.001
>> coulombtype  = pme
>> vdw-type     = cut-off
>> tcoupl       = Berendsen
>> tc-grps      = protein non-protein
>> tau-t        = 0.1 0.1
>> ref-t        = 310 310
>> nstxout      = 100
>> nstvout      = 0
>> nstxtcout    = 100
>> nstenergy    = 100
>> comm_mode    = Linear ; Angular
>> comm_grps    = Protein
>> xtc_grps     = Protein
>> energygrps   = Protein
>> ------------------
>> 
>> 
>> 
>> Thanks,
>> 
>> Ifat
>> 
>> 
>> 
>> The output:
>> 
>> 343.7   10.813   5.924   16.445 16.445 16.445
>> 343.8   10.809   5.949   16.445 16.445 16.445
>> 343.9   10.804   5.959   16.445 16.445 16.445
>> 344     10.808   5.974   16.445 16.445 16.445
>> 344.1    0.18    21.982  16.445 16.445 16.445
>> 344.2   10.778   5.977   16.445 16.445 16.445
>> 344.3   10.768   5.996   16.445 16.445 16.445
>> 344.4   10.764   6.016   16.445 16.445 16.445
>> 344.5   10.722   6.029   16.445 16.445 16.445
>> 344.6   10.774   6.01    16.445 16.445 16.445
>> 344.7    0.174   21.984  16.445 16.445 16.445
>> 344.8    0.176   21.98   16.445 16.445 16.445
>> 344.9    0.17    22.002  16.445 16.445 16.445
>> 345      0.173   21.981  16.445 16.445 16.445
>> 345.1    0.191   21.954  16.445 16.445 16.445
>> 345.2    0.183   21.958  16.445 16.445 16.445
>> 345.3    0.181   22.012  16.445 16.445 16.445
>> 345.4    0.17    22.054  16.445 16.445 16.445
>> 345.5    0.168   22.054  16.445 16.445 16.445
>> 345.6    0.189   22.039  16.445 16.445 16.445
>> 345.7    0.171   22.007  16.445 16.445 16.445
>> 345.8    0.186   22.031  16.445 16.445 16.445
>> 345.9    0.171   22.077  16.445 16.445 16.445
>> 346      0.187   21.99   16.445 16.445 16.445
>> 346.1    0.173   21.984  16.445 16.445 16.445
>> 346.2    0.181   22.02   16.445 16.445 16.445
>> 346.3   10.82    5.984   16.445 16.445 16.445
>> 346.4   10.81    6.002   16.445 16.445 16.445
>> 346.5   10.819   6.008   16.445 16.445 16.445
>> 346.6   10.813   5.996   16.445 16.445 16.445
>> 346.7   10.781   6.006   16.445 16.445 16.445
>> 346.8   10.793   6.026   16.445 16.445 16.445
>> 346.9   10.745   5.985   16.445 16.445 16.445
>> 347     10.762   5.999   16.445 16.445 16.445
>> 347.1   10.781   5.984   16.445 16.445 16.445
>> 347.2   10.784   6.002   16.445 16.445 16.445
>> 
>> ------------------------------
>> 
>> Message: 5
>> Date: Wed, 16 Feb 2011 10:43:56 +0100
>> From: Tsjerk Wassenaar <tsjerkw at gmail.com>
>> Subject: Re: [gmx-users] Periodic Boundary Conditions g_mindist -pi
>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>> Message-ID:
>>       <AANLkTinr05YyKV7rqzCSwEch=aBXFur+adKeNMxEMLEa at mail.gmail.com>
>> Content-Type: text/plain; charset=ISO-8859-1
>> 
>> Hi Ifat,
>> 
>> I guess this is a jump over the periodic boundaries. You should remove
>> jumps from the trajectory (trjconv -pbc nojump) before running g_mindist -pi.
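>> 
>> For example, reusing the file names from your command line (nojump.xtc is
>> just a name for the corrected trajectory):
>> 
>> trjconv -f run.xtc -s run.gro -pbc nojump -o nojump.xtc
>> g_mindist -f nojump.xtc -s run.gro -n index.ndx -od tmp.xvg -pi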
>> 
>> Cheers,
>> 
>> Tsjerk
>> 
>> 
>> --
>> Tsjerk A. Wassenaar, Ph.D.
>> 
>> post-doctoral researcher
>> Molecular Dynamics Group
>> * Groningen Institute for Biomolecular Research and Biotechnology
>> * Zernike Institute for Advanced Materials
>> University of Groningen
>> The Netherlands
>> 
>> 
> 
> ------------------------------
> 
> Message: 3
> Date: Wed, 16 Feb 2011 23:00:32 +1100
> From: Mark Abraham <Mark.Abraham at anu.edu.au>
> Subject: Re: [gmx-users] Re: Periodic Boundary Conditions g_mindist
> 	-pi
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Message-ID: <4D5BBC60.5080600 at anu.edu.au>
> Content-Type: text/plain; charset="iso-8859-1"
> 
> On 16/02/2011 10:30 PM, ifat shub wrote:
>> 
>> Hi Tsjerk,
>> Thank you for your reply.
>> I am aware of the trjconv option, but I wanted to know whether there is a 
>> way to avoid these kinds of jumps over the periodic boundaries during 
>> the mdrun itself, rather than by post-processing.
> 
> No. mdrun does not know in advance what your visualization requirements 
> are, and frankly there are better things to do with expensive compute 
> cluster time. Post-processing a small number of frames elsewhere is a much 
> better use of resources.
> 
> Mark
> 
> 
> ------------------------------
> 
> -- 
> gmx-users mailing list
> gmx-users at gromacs.org
> http://lists.gromacs.org/mailman/listinfo/gmx-users
> Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
> 
> End of gmx-users Digest, Vol 82, Issue 127
> ******************************************



