[gmx-users] Re: rename of 1POPC to 1LIG during solvation of system, though coordinates and atoms are the same in 1LIG and 1POPC

Sangita Kachhap sangita at imtech.res.in
Wed Jun 6 14:52:29 CEST 2012


> Today's Topics:
>
>    1. Re: Question regarding genion (Justin A. Lemkul)
>    2. Re: rename of 1POPC to 1LIG during solvation of system, though
>       coordinates and atoms are the same in 1LIG and 1POPC
>       (Justin A. Lemkul)
>    3. Re: Scaling/performance on Gromacs 4 (Manu Vajpai)
>    4. Atomtype OW_tip4p not found (Amir Abbasi)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Wed, 06 Jun 2012 06:04:43 -0400
> From: "Justin A. Lemkul" <jalemkul at vt.edu>
> Subject: Re: [gmx-users] Question regarding genion
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Message-ID: <4FCF2B3B.30304 at vt.edu>
> Content-Type: text/plain; charset=ISO-8859-15; format=flowed
>
>
>
> On 6/6/12 5:38 AM, Matthias Ernst wrote:
>> Hi,
>>
>> I have two questions regarding genion.
>>
>> 1) Is there a way to tell genion in advance which group of molecules to
>> replace by ions? For me, solvent is always the choice, so I want to
>> script it, but I did not find any parameter for this.
>>
>
> http://www.gromacs.org/Documentation/How-tos/Using_Commands_in_Scripts
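>
> For example, the group can be supplied on standard input so that no
> interactive prompt appears (a sketch; the file names and the SOL group
> are placeholders for your own setup):
>
> echo "SOL" | genion -s topol.tpr -o solv_ions.gro -p topol.top \
>     -pname NA -nname CL -conc 0.15 -neutral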
>
>> 2) I want to neutralize a charged system. As I found out, I can use
>> the -neutral option for this, but it seems that this option does not
>> work if I do not also specify a concentration (the system has a charge
>> of -52):
>>
>> genion -s system_in_solvent.tpr -o solventions.gro -p topol_water.top -neutral
>> [snip]
>> Reading file system_in_solvent.tpr, VERSION 4.5.4 (single precision)
>> Using a coulomb cut-off of 1 nm
>> No ions to add and no potential to calculate.
>>
>> With -conc 0.0 it also won't add ions, but with e.g. -conc 0.0001 it
>> will add 52 NA and 0 CL ions, which corresponds to a neutral system
>> (with -conc 0.001 it will add 53 NA and 1 CL ions, meaning the
>> resulting salt concentration is > 0). I use the amber99sb force field.
>> Is this behaviour intended, or am I missing the point of the -neutral
>> option not working without a specified concentration?
>>
>
> I have also found that -neutral must always be used in conjunction with -conc.
> It would be nice if this were not the case.
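>
> Until then, a workaround for a system with a charge of -52 is to
> request the counterions explicitly (a sketch based on the numbers
> above):
>
> echo "SOL" | genion -s system_in_solvent.tpr -o solventions.gro \
>     -p topol_water.top -pname NA -np 52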
>
> -Justin
>
> --
> ========================================
>
> Justin A. Lemkul, Ph.D.
> Research Scientist
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================
>
>
> ------------------------------
>
> Message: 2
> Date: Wed, 06 Jun 2012 06:08:55 -0400
> From: "Justin A. Lemkul" <jalemkul at vt.edu>
> Subject: Re: [gmx-users] rename of 1POPC to 1LIG during solvation of
> 	system, though coordinates and atoms are the same in 1LIG and 1POPC
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Message-ID: <4FCF2C37.4040404 at vt.edu>
> Content-Type: text/plain; charset=UTF-8; format=flowed
>
>
>
> On 6/6/12 3:09 AM, Sangita Kachhap wrote:
>>
>> Hello all,
>> I have to run an MD simulation of a membrane protein with a docked
>> ligand in a POPC lipid bilayer. I am getting this error during
>> solvation of the system:
>> Resname of 1POPC in system_shrink1.gro converted into 1LIG
>>
>>
>> I have done following:
>>
>> GROMACS COMMAND
>>
>> 1) Generate topol.top using GROMOS96 53A6 parameter set
>> pdb2gmx -f 3gd8-mod.pdb -o 3gd8-mod-processed.gro -water spc
>>
>>
>> at the prompt, select 14
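>>
>> (The prompt can also be skipped; a sketch assuming GROMACS 4.5, where
>> pdb2gmx accepts the force-field name directly:)
>>
>> pdb2gmx -f 3gd8-mod.pdb -o 3gd8-mod-processed.gro -water spc -ff gromos53a6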
>>
>> 2) Download:
>>      * popc128.pdb - the structure of a 128-lipid POPC bilayer
>>      * popc.itp - the moleculetype definition for POPC
>>      * lipid.itp - Berger lipid parameters
>>
>> from http://moose.bio.ucalgary.ca/index.php?page=Structures_and_Topologies
>>
>> 3) In topol.top, change:
>> #include "gromos53a6.ff/forcefield.itp"
>>
>> to:
>>
>> #include "gromos53a6_lipid.ff/forcefield.itp"
>>
>>
>>                  and
>>
>> ; Include Position restraint file
>> #ifdef POSRES
>> #include "posre.itp"
>> #endif
>> ; Include ligand topology
>> #include "ligand-full.itp"
>>
>> ; Include POPC chain topology
>> #include "popc.itp"
>>
>> ; Include water topology
>> #include "gromos53a6_lipid.ff/spc.itp"
>>
>> and at the end, add "LIG 1" under [ molecules ].
>>
>> 4) cp files
>> aminoacids.rtp
>> aminoacids.hdb
>> aminoacids.c.tdb
>> aminoacids.n.tdb
>> aminoacids.r2b
>> aminoacids.vsd
>> ff_dum.itp
>> ffnonbonded.itp
>> ffbonded.itp
>> forcefield.itp
>> ions.itp
>> spc.itp
>> watermodels.dat
>>
>> from the GROMACS top directory to a directory named gromos53a6_lipid.ff
>> in the working directory.
>> Append the parameters ([ atomtypes ], [ nonbond_params ], and
>> [ pairtypes ]) from lipid.itp to ffnonbonded.itp & ffbonded.itp, and
>> create a forcefield.doc file containing the description "GROMOS96 53A6
>> force field, extended to include Berger lipid parameters".
>> Delete the line ";; parameters for lipid-GROMOS interactions." and the
>> line after it, and change HW to H in [ nonbond_params ].
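>>
>> A sketch of this step in shell (assuming $GMXLIB points at the GROMACS
>> share/top directory; adjust the path to your installation):
>>
>> mkdir gromos53a6_lipid.ff
>> for f in aminoacids.rtp aminoacids.hdb aminoacids.c.tdb aminoacids.n.tdb \
>>          aminoacids.r2b aminoacids.vsd ff_dum.itp ffnonbonded.itp \
>>          ffbonded.itp forcefield.itp ions.itp spc.itp watermodels.dat; do
>>     cp "$GMXLIB/gromos53a6.ff/$f" gromos53a6_lipid.ff/
>> done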
>>
>>
>> 5) Generate a .tpr for the POPC bilayer
>> grompp -f minim.mdp -c popc128a.pdb -p topol_popc.top -o em.tpr -maxwarn 1
>> (first change OW1, HW2, HW3 to OW, HW1 and HW2, respectively)
>>
>>
>> 6) Remove periodicity
>> trjconv -s em.tpr -f popc128a.pdb -o popc128a_whole.gro -pbc mol -ur compact
>> (at the prompt, select group 0, "System")
>>
>>
>> 7) Orient the protein within the same box dimensions as given at the
>> end of popc128a_whole.gro
>> editconf -f 3gd8-mod-processed.gro -o 3gd8-mod-processe_newbox.gro -c -box
>> 6.23910 6.17970 6.91950
>>
>>
>> 8) Pack lipid around protein
>> cat 3gd8-mod-processe_newbox.gro popc128a_whole.gro>  system.gro
>>
>> Remove the unnecessary lines (the box vectors from the protein
>> structure and the header lines from the POPC structure) and update the
>> second line of the combined coordinate file (total number of atoms)
>> accordingly, as sketched below.
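>>
>> A rough sketch of that bookkeeping in shell (file names as above; the
>> atom count is the total line count minus the title, count and box
>> lines):
>>
>> head -n -1 3gd8-mod-processe_newbox.gro  > system.gro  # protein, box line dropped
>> tail -n +3 popc128a_whole.gro           >> system.gro  # lipids, header dropped
>> natoms=$(( $(wc -l < system.gro) - 3 ))
>> sed -i "2s/.*/$natoms/" system.gro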
>>
>> 9) Modify topol.top to add position restraints on the protein
>>
>> ; Include Position restraint file
>> #ifdef POSRES
>> #include "posre.itp"
>> #endif
>>
>> ; Strong position restraints for InflateGRO
>> #ifdef STRONG_POSRES
>> #include "strong_posre.itp"
>> #endif
>>
>> ; Include POPC chain topology
>> #include "popc.itp"
>>
>> ; Include water topology
>> #include "gromos53a6_lipid.ff/spc.itp"
>>
>>               and
>>
>> Generate a new position restraint file for the system (protein + ligand):
>> genrestr -f 3gd8-mod-processe_newbox.gro -o strong_posre.itp -fc 100000 100000 100000
>> and add the line "define = -DSTRONG_POSRES" to the .mdp file.
>>
>>
>> 10) Add "POPC 128" to [ molecules ] in topol.top
>>
>>
>> 11) Scale down the lipids
>> perl inflategro.pl system.gro 0.95 POPC 0 system_shrink1.gro 5
>> area_shrink1.dat
>>
>>
>>
>> 12) Solvate with water
>>
>> Copy vdwradii.dat from the GROMACS top directory to the working
>> directory and change the value for C from 0.15 to 0.375 (to avoid
>> placing water in the lipid hydrophobic core).
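>>
>> The changed line in the local copy of vdwradii.dat looks like this
>> ("???" matches any residue name; the enlarged radius keeps genbox from
>> placing waters between the lipid tails):
>>
>> ???  C     0.375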
>>
>> genbox -cp system_shrink1.gro -cs spc216.gro -o system_shrink1_solv.gro -p
>> topol.top
>>
>>
>> Up to step 11 the .gro file is fine: it contains protein residues
>> 32-254, the ligand 1LIG, POPC residues 1-128, and solvent.
>>
>> After step 12, the protein (residues 32-254) and the ligand (1LIG) are
>> still in the .gro file, but POPC runs only from residue 2 to 128,
>> because residue 1 of POPC has been converted to 1LIG, even though all
>> coordinates and atom names of 1POPC are unchanged in 1LIG.
>>
>>
>>
>> Can anybody suggest why this renaming is occurring?
>>
>
> Based on the description, you say in step (3) that you add "LIG 1" to the
> end of [molecules], but then in (12) you give the order as protein,
> ligand, then POPC. The order of the coordinate file and [molecules] must
> match, otherwise funny things happen. If you have protein, ligand, and
> POPC, you must list the moleculetype names in that order in [molecules].
>





Thanks for the reply.
In step 3 I added "LIG 1" to the end of [molecules] because the
topol.top generated by "pdb2gmx -f 3gd8-mod.pdb -o 3gd8-mod-processed.gro
-water spc" already contained "Protein_chain_A 1" under [molecules], so
I only added "LIG 1".


This is the end of topol.top after solvation:

[ molecules ]
; Compound        #mols
Protein_chain_A     1
LIG                 1
POPC             128
SOL              1829
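
To check that the coordinate file really follows this order, the residue
blocks can be listed in file order (a rough sketch; it drops the title,
atom count and box lines, then counts consecutive lines per residue
name, columns 6-10 of a .gro line):

tail -n +3 system_shrink1_solv.gro | head -n -1 | cut -c6-10 | uniq -c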





> -Justin
>
> --
> ========================================
>
> Justin A. Lemkul, Ph.D.
> Research Scientist
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================
>
>
> ------------------------------
>
> Message: 3
> Date: Wed, 6 Jun 2012 16:49:08 +0530
> From: Manu Vajpai <manuvajpai at gmail.com>
> Subject: Re: [gmx-users] Scaling/performance on Gromacs 4
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Message-ID:
> 	<CAHv4Ge1b-7pMQ5vy+FFrAwY8VY7jYVf4Z9okSnvRUXA2_Um=Fw at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Apologies for reviving such an old thread. For clarification: Interlagos
> CPUs are based on the modular Bulldozer architecture, as mentioned
> earlier. Each Bulldozer module has 2 integer cores and one floating-point
> unit shared between the two. So although the OS reports 64 cores
> (counting integer cores), the number of floating-point units is still
> 32. Moreover, each FP unit can process two threads when possible, but
> since GROMACS is so compute-intensive, I am guessing it is saturated by
> just one. Hence you are not observing a scale-up when moving from 32 to
> 64 threads.
>
> Regards,
> Manu Vajpai
> IIT Kanpur
>
> On Fri, Mar 16, 2012 at 4:24 PM, Szilárd Páll <szilard.pall at cbr.su.se> wrote:
>
>> Hi Sara,
>>
>> The bad performance you are seeing is most probably caused by the
>> combination of the new AMD "Interlagos" CPUs, the compiler and the
>> operating system, and it is very likely that the old Gromacs version
>> also contributes.
>>
>> In practice these new CPUs don't perform as well as expected, but that
>> is partly due to compilers and operating systems not having full
>> support for the new architecture. However, based on the quite
>> extensive benchmarking I've done, the performance with such a large
>> system should be considerably better than what your numbers show.
>>
>> This is what you should try:
>> - compile Gromacs with gcc 4.6 using the "-march=bdver1" optimization
>>   flag (a sketch follows below);
>> - use at least a 3.0, or preferably newer, Linux kernel;
>> - if you're not required to use 4.0.x, use 4.5.
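>>
>> For instance (a sketch against the GROMACS 4.5 autoconf build; the
>> compiler name and flags are assumptions to adapt to your machine):
>>
>> ./configure CC=gcc-4.6 CFLAGS="-O3 -march=bdver1"
>> make && make install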
>>
>> Note that you have to be careful when drawing conclusions from
>> benchmarks run with large systems on a small number of cores; you will
>> get artifacts from caching effects.
>>
>>
>> And now a bit of fairly technical explanation; for more details, ask
>> Google ;)
>>
>> The machine you are using has AMD Interlagos CPUs based on the
>> Bulldozer micro-architecture. This is a new architecture, a departure
>> from previous AMD processors and in fact quite different from most
>> current CPUs. "Bulldozer cores" are not traditional physical cores:
>> the hardware unit is the "module", which consists of two "half cores"
>> (at least when it comes to floating-point units) and enables a special
>> type of multithreading called "clustered multithreading". This is
>> somewhat similar to Intel cores with Hyper-Threading.
>>
>>
>> Cheers,
>> --
>> Szilárd
>>
>>
>>
>> On Mon, Feb 20, 2012 at 5:12 PM, Sara Campos <srrcampos at gmail.com> wrote:
>> > Dear GROMACS users
>> >
>> > My group has had access to a quad-processor, 64-core machine (4 x
>> > Opteron 6274 @ 2.2 GHz, 16 cores each)
>> > and I made some performance tests, using the following specifications:
>> >
>> > System size: 299787 atoms
>> > Number of MD steps: 1500
>> > Electrostatics treatment: PME
>> > Gromacs version: 4.0.4
>> > MPI: LAM
>> > Command ran: mpirun -ssi rpi tcp C mdrun_mpi ...
>> >
>> > #CPUS          Time (s)   Steps/s
>> > 64             195.000     7.69
>> > 32             192.000     7.81
>> > 16             275.000     5.45
>> > 8              381.000     3.94
>> > 4              751.000     2.00
>> > 2             1001.000     1.50
>> > 1             2352.000     0.64
>> >
>> > The scaling is not good, but the weirdest part is that 64 processors
>> > perform the same as 32. I have seen the plots from Dr. Hess in the
>> > GROMACS 4 paper in JCTC and I do not understand why this is
>> > happening. Can anyone help?
>> >
>> > Thanks in advance,
>> > Sara
>> >
>
> ------------------------------
>
> Message: 4
> Date: Wed, 6 Jun 2012 04:19:05 -0700 (PDT)
> From: Amir Abbasi <amir.abbasi69 at yahoo.com>
> Subject: [gmx-users] Atomtype OW_tip4p not found
> To: "gmx-users at gromacs.org" <gmx-users at gromacs.org>
> Message-ID: <1338981545.609.YahooMailNeo at web113012.mail.gq1.yahoo.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Hi!
> I used tleap to generate a topology file for an RNA molecule with the
> parmbsc0 force field, then generated .gro and .top files from it with
> amb2gmx.pl. I manually added these lines to the .top file:
>
> ; Include water topology
> #include "amber99sb.ff/tip4p.itp"
>
> ; Include topology for ions
> #include "amber99sb.ff/ions.itp"
>
> but after running this command:
>
> grompp -f ions.mdp -c ribozyme_solv.gro -p ribozyme.top -o ions.tpr
>
> I get this error message:
>
> Program grompp, VERSION 4.5.5
> Source code file: /build/buildd/gromacs-4.5.5/src/kernel/toppush.c, line: 1166
>
> Fatal error:
> Atomtype OW_tip4p not found
>
>
> What should I do?
>
> Regards,
> Amir
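>
> The likely background: the topology written by amb2gmx.pl is
> self-contained, so its [ atomtypes ] section never defines the atom
> types referenced by amber99sb.ff/tip4p.itp. A minimal sketch of the
> kind of entry that is missing (the values shown are the standard TIP4P
> oxygen parameters; verify them against amber99sb.ff/ffnonbonded.itp,
> and give any further undefined types that grompp reports the same
> treatment):
>
> [ atomtypes ]
> ; name     at.num    mass     charge  ptype  sigma        epsilon
> OW_tip4p        8  16.00000   0.0000  A      3.15365e-01  6.48520e-01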
>
> ------------------------------
>
> End of gmx-users Digest, Vol 98, Issue 35
> *****************************************
>


Sangita Kachhap
SRF
BIC,IMTECH
CHANDIGARH

______________________________________________________________________
Institute of Microbial Technology (A CONSTITUENT ESTABLISHMENT OF CSIR)
Sector 39-A, Chandigarh
PIN CODE: 160036
EPABX: 0172 6665 201-202


