[gmx-users] Re: gmx-users Digest, Vol 58, Issue 31

Prof. Ettore Bismuto ettore.bismuto at unina2.it
Thu Feb 5 11:22:52 CET 2009


----- Original Message ----- 
From: <gmx-users-request at gromacs.org>
To: <gmx-users at gromacs.org>
Sent: Thursday, February 05, 2009 10:44 AM
Subject: gmx-users Digest, Vol 58, Issue 31


> Send gmx-users mailing list submissions to
> gmx-users at gromacs.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://www.gromacs.org/mailman/listinfo/gmx-users
> or, via email, send a message with subject or body 'help' to
> gmx-users-request at gromacs.org
>
> You can reach the person managing the list at
> gmx-users-owner at gromacs.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of gmx-users digest..."
>
>
> Today's Topics:
>
>   1. Rtp for polymer. (varsha gautham)
>   2. Re: Rtp for polymer. (Mark Abraham)
>   3. Re: Rtp for polymer. (Andrea Muntean)
>   4. Gromacs and ssh problem (Bernhard Knapp)
>   5. RE: micelle disaggregated in serial, but not parallel, runs
>      using sd integrator (Berk Hess)
>   6. Re: micelle disaggregated in serial, but not parallel, runs
>      using sd integrator (Ran Friedman)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Thu, 5 Feb 2009 11:02:30 +0530
> From: varsha gautham <varsha.gautham88 at gmail.com>
> Subject: [gmx-users] Rtp for polymer.
> To: gmx-users at gromacs.org
> Message-ID:
> <756f9c230902042132r773b5180t8ca395651d35cc7b at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Hello Justin,
>
> I'm sorry to say that it's not useful. What I mean is that the rtp I
> constructed manually is building up the polymer and generating the
> topology and gro files. But when I look at the gro file with VMD, the
> connectivity between the monomers is not built. That is, my polymer
> consists of 10 monomer units, with a ben and a primary amine unit as a
> block polymer.
>
> How can I include the connectivity information between these two in an
> rtp file? I can include that in the section called bonds, but how do I
> do that? I would also like some help with the workflow: how does
> GROMACS interpret each of the rtp, atp, ffnb.itp and ffbon.itp files,
> and in what order? Are there any materials available other than the
> manual? If so, please let me know.
>
> Thanks in advance.
>
> -krithika
>
> ------------------------------
>
> Message: 2
> Date: Thu, 05 Feb 2009 17:06:04 +1100
> From: Mark Abraham <Mark.Abraham at anu.edu.au>
> Subject: Re: [gmx-users] Rtp for polymer.
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Message-ID: <498A81CC.50406 at anu.edu.au>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> varsha gautham wrote:
>> Hello Justin,
>>
>> I'm sorry to say that it's not useful. What I mean is that the rtp I
>> constructed manually is building up the polymer and generating the
>> topology and gro files. But when I look at the gro file with VMD, the
>> connectivity between the monomers is not built.
>
> VMD merely guesses the connectivity - there's none specified in a .gro
> file, which is purely a set of coordinates. A .top file defines a
> topology. If your input structure doesn't conform to VMD's heuristics,
> then it won't show a bond.
>
> You can't judge the success of your .rtp construct without looking at
> the *topology* pdb2gmx produced.
>
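> For illustration, a minimal sketch of that check (the file names are
> placeholders):
>
>   pdb2gmx -f polymer.pdb -o polymer.gro -p polymer.top
>   grep -A 15 "\[ bonds \]" polymer.top
>
> If the inter-monomer bonds show up under [ bonds ] in the .top file,
> the .rtp entry worked, regardless of what VMD draws.
>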
>> That is, my polymer consists of 10 monomer units, with a ben and a
>> primary amine unit as a block polymer.
>>
>> How can I include the connectivity information between these two in an
>> rtp file? I can include that in the section called bonds, but how do I
>> do that? I would also like some help with the workflow: how does
>> GROMACS interpret each of the rtp, atp, ffnb.itp and ffbon.itp files,
>> and in what order? Are there any materials available other than the
>> manual? If so, please let me know.
>
> Try http://wiki.gromacs.org/index.php/.top_file and links thereon.
>
> Mark
>
>
> ------------------------------
>
> Message: 3
> Date: Thu, 5 Feb 2009 09:39:38 +0100
> From: Andrea Muntean <andreamuntean at gmail.com>
> Subject: Re: [gmx-users] Rtp for polymer.
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Message-ID:
> <b7d3859e0902050039r516bcce9w7adf5769236c1167 at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> I connect the monomers by adding a bond to the rtp entry in which the
> first atom of the next residue is given with a + in front (you can
> likewise connect to the previous residue by putting a - in front of
> its last atom). Sorry for explaining this in a confusing way; better
> to look at the example below (for polystyrene):
>
> [ PS ]
>  [ atoms ]
> ; name  type    charge  charge group
> CH2     CH2     0       0
> CH      CH1     0       0
> CB      C       0       0
> CG1     CR1     0       0
> CG2     CR1     0       0
> CG3     CR1     0       0
> CG4     CR1     0       0
> CG5     CR1     0       0
>
>  [ bonds ]
> ;-CH    CH2     gb_     (put the appropriate bond from the ff*_bon.itp file)
> CH2     CH      gb_
> CH      CB      gb_
> CB      CG1     gb_
> CG1     CG2     gb_
> CG2     CG3     gb_
> CG3     CG4     gb_
> CG4     CG5     gb_
> CG5     CB      gb_
> CH      +CH2    gb_     ; bond to the CH2 of the next residue
>
>  [ angles ]
> ...
>  [ impropers ]
> ...
>  [ dihedrals ]
> ...
> (angles, impropers and dihedrals were defined accordingly, but are not
> listed here)
>
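> As a sketch of what pdb2gmx makes of that "CH  +CH2" line (the atom
> numbers below are hypothetical), the bond simply crosses the residue
> boundary in the generated topology:
>
> [ bonds ]
> ;  ai    aj  funct
>     2     9      2    gb_   ; CH of residue 1 - CH2 of residue 2
>
> where gb_ again stands for the appropriate bond-type macro.
>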
> I hope that will help.
>
> Andrea
> 2009/2/5 varsha gautham <varsha.gautham88 at gmail.com>:
>> [full quote of Message 1 trimmed]
>
>
>
> ------------------------------
>
> Message: 4
> Date: Thu, 05 Feb 2009 10:07:07 +0100
> From: Bernhard Knapp <bernhard.knapp at meduniwien.ac.at>
> Subject: [gmx-users] Gromacs and ssh problem
> To: gmx-users at gromacs.org
> Message-ID: <498AAC3B.3050604 at meduniwien.ac.at>
> Content-Type: text/plain; charset=us-ascii; format=flowed
>
> Hi
>
> I installed Gromacs successfully on Fedora 8 nodes and then ran a
> small simulation successfully. Thereafter I moved the node to our
> server room and did the following:
> - set the IP address, subnet mask and gateway
> - changed the ssh port in /etc/ssh/sshd_config (we use port forwarding
> on our router) and ran /usr/sbin/semanage port -a -t
> inetd_child_port_t -p tcp 5101
> - changed the firewall settings to additionally allow the new port
> - changed the hostname via the hostname command
>
> Then I started exactly the same simulation (same command, same data)
> as before the network configuration, and Gromacs came up with:
>
> ssh: quoVadis01: Name or service not known
> --------------------------------------------------------------------------
> A daemon (pid 5039) died unexpectedly with status 255 while attempting
> to launch so we are aborting.
>
> There may be more information reported by the environment (see above).
>
> This may be because the daemon was unable to find all the needed shared
> libraries on the remote node. You may set your LD_LIBRARY_PATH to have the
> location of the shared libraries on the remote nodes and this will
> automatically be forwarded to the remote nodes.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> mpirun noticed that the job aborted, but has no info as to the process
> that caused that situation.
> --------------------------------------------------------------------------
> mpirun: clean termination accomplished
>
>
> How is it possible that some network configuration screws up Gromacs?
> The simulation runs in parallel on the 4 local cores of the machine;
> no network interaction should be necessary at all. Can anybody tell me
> where the problem is? The error message above is obviously misleading,
> since it was already working and the library paths are ok ...
>
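> A likely cause, for reference: after the hostname change,
> "quoVadis01" no longer resolves, and Open MPI's mpirun resolves the
> node name even for purely local runs. A minimal sketch of a fix,
> assuming the new hostname is quoVadis01 (the IP address below is a
> placeholder for the machine's real one):
>
>   # /etc/hosts
>   127.0.0.1      localhost
>   192.168.0.10   quoVadis01
>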
>
> cheers
> Bernhard
>
>
>
> ------------------------------
>
> Message: 5
> Date: Thu, 5 Feb 2009 10:25:09 +0100
> From: Berk Hess <gmx3 at hotmail.com>
> Subject: RE: [gmx-users] micelle disaggregated in serial, but not
> parallel, runs using sd integrator
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Message-ID: <BLU134-W513D4AD5766AECD691F1AC8EC00 at phx.gbl>
> Content-Type: text/plain; charset="iso-8859-1"
>
>
> Hi,
>
> I don't know why I did not add checks for ld-seed before.
> Now grompp gives a note when continuation=yes and ld-seed!=-1.
>
> tpbconv will now generate a new ld-seed when reading a trajectory
> (but you should not use tpbconv, use a checkpoint file instead).
>
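> A minimal sketch of that recommended setup (the file names are
> placeholders):
>
>   ; .mdp for every segment after the first
>   continuation = yes
>   gen_vel      = no
>   ld_seed      = -1   ; a fresh seed is picked for each segment
>
>   # restart from the checkpoint rather than through tpbconv
>   mdrun -s topol.tpr -cpi state.cpt
>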
> But yesterday I forgot to mention that there is a bug in the checkpointing
> of mdrun in 4.0 - 4.0.3. Without domain decomposition the initial box
> size would always be stored in the checkpoint file, which causes problems
> with NPT simulations. NPT simulations with domain decomposition and
> all NVT simulations were fine.
>
> Gromacs 4.0.4, with all bugs fixed and extra checks, should be
> released today.
>
> Berk
>
>> Date: Wed, 4 Feb 2009 19:30:47 -0500
>> From: chris.neale at utoronto.ca
>> To: gmx-users at gromacs.org
>> Subject: [gmx-users] micelle disaggregated in serial, but not parallel, 
>> runs using sd integrator
>>
>> It appears as if you were correct, Berk. I will report the results of
>> my 24 h test tomorrow, but I also set up another system that used
>> ld_seed=1993 and ran in 20 ps segments instead of the 200 ps segments
>> that I was previously using. This system shows signs of disaggregation
>> on the 200 ps time-scale, as opposed to the 2 ns time-scale that I
>> observed for the 200 ps segments.
>>
>> I don't know how you figured that one out, but I am very grateful.
>>
>> Now that I see the trajectories, it does make sense that any net
>> movement applied to an individual molecule by the noise will lead to
>> directed movement over many separate segments.
>>
>> I think this is probably worth a note in the grompp output for sd runs
>> when a user sets ld_seed to something other than -1 and uses the -t
>> option (or some other indication that this is intended as a
>> continuation).
>>
>> Chris.
>>
>> -- original message --
>>
>> Thank you Berk,
>>
>> I will repeat my runs using the checkpoint file and report my findings 
>> back to this list. Thank you for this advice.
>>
>> Chris.
>>
>> -- original message --
>>
>> Hi,
>>
>> In this manner you use the same random seed, and thus the same noise,
>> for all parts. In most cases this will not lead to serious artifacts
>> with SD, but you can never be sure.
>> When checkpoints are used, you do not repeat random numbers. This also
>> gives a difference between serial and parallel in 4.0: with serial you
>> get exactly the same noise per atom, in parallel not, since atoms
>> migrate from one node to another (with domain decomposition).
>>
>> If you do not use checkpoints, use ld_seed=-1 and do not use tpbconv.
>>
>> Berk
>>
>>
>
> ------------------------------
>
> Message: 6
> Date: Thu, 05 Feb 2009 10:43:22 +0100
> From: Ran Friedman <r.friedman at bioc.uzh.ch>
> Subject: Re: [gmx-users] micelle disaggregated in serial, but not
> parallel, runs using sd integrator
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Message-ID: <498AB4BA.2080202 at bioc.uzh.ch>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Hi,
> Maybe it's a good idea to have ld-seed=-1 as the default, if that's
> not already the case.
> Ran.
>
> Berk Hess wrote:
>> [full quote of Message 5 trimmed]
>
> ------------------------------
>
> _______________________________________________
> gmx-users mailing list
> gmx-users at gromacs.org
> http://www.gromacs.org/mailman/listinfo/gmx-users
> Please search the archive at http://www.gromacs.org/search before posting!
>
> End of gmx-users Digest, Vol 58, Issue 31
> *****************************************

