[gmx-users] Extending Simulation

mohsen ramezanpour ramezanpour.mohsen at gmail.com
Sun Apr 10 10:27:16 CEST 2011


Dear Dr. Mark,
Thank you very much; you are right.
I understand your meaning exactly.

On Sun, Apr 10, 2011 at 12:44 PM, Mark Abraham <Mark.Abraham at anu.edu.au> wrote:

>  On 10/04/2011 6:04 PM, mohsen ramezanpour wrote:
>
> Dear Dr.Mark
>
> On Sun, Apr 10, 2011 at 12:20 PM, Mark Abraham <Mark.Abraham at anu.edu.au> wrote:
>
>> On 10/04/2011 5:40 PM, mohsen ramezanpour wrote:
>>
>>> Dear All
>>> I used the following commands, according to the Extending Simulation
>>> how-to in gromacs/Documentation/how-tos/Extending Simulation, to extend
>>> my simulation. I entered:
>>>
>>> tpbconv -s npt-1.tpr -extend 100 -o npt-1-extend.tpr
>>>
>>> nohup mpirun -np 4 mdrun -s npt-1-extend.tpr -cpi npt-1.cpt
>>>
>>> I ran this command on a node with 4 CPUs.
>>> The result was these files:
>>> confout.gro, state.cpt, md.log, traj.trr, state_prev.cpt, ener.edr,
>>> #confout.gro.1#, #confout.gro.2#, #confout.gro.3#, #ener.edr.1#,
>>> #ener.edr.2#, #md.log.1#, #traj.trr.1#, #traj.trr.2#
>>>
>>> I extended 4 files on 4 distinct nodes!
>>>
>>
>>  That's not the behaviour you were looking for. You've run the same .tpr
>> four times, once on each CPU. This is because mdrun was not compiled with
>> MPI. See the installation instructions.
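
[For reference, a minimal sketch of the intended workflow with an
MPI-enabled binary. The name mdrun_mpi is an assumption here; the actual
binary name depends on how GROMACS was configured at install time:

tpbconv -s npt-1.tpr -extend 100 -o npt-1-extend.tpr
mpirun -np 4 mdrun_mpi -s npt-1-extend.tpr -cpi npt-1.cpt

This launches one MPI-aware mdrun spread across 4 ranks continuing the
checkpointed run, rather than 4 independent serial copies that each write
their own backup (#...#) files.]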
>>
> I think I explained it badly. Let me explain more:
> Suppose I entered the above commands on just one node(with 4 cpu) for
> npt-1.tpr (just one time I entered these commands)
>
>
> You seem to have used mpirun to run four different mdrun processes, each
> using the same .tpr, one on each CPU. You should be trying to run one
> mdrun_mpi using that .tpr, with all four CPUs working on the same run.
>
> Look at the top few lines of your .log. That will tell you how many nodes
> mdrun thought it was running on. I think you will see "NNODES=1". If so,
> don't use this mdrun with mpirun.
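
[A quick way to check this, sketched; md.log here is the log file from the
run in question:

head -n 50 md.log | grep NNODES

If this reports NNODES=1 even though the job was launched with
mpirun -np 4, the mdrun binary was built without MPI support.]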
>
>
>  Besides, if you were right, it would produce the same outputs when I ran
> it on different nodes with the same commands; please see below!
>
>
> Not strictly true. See
> http://www.gromacs.org/Documentation/Terminology/Reproducibility
>
>
>
>
>>> It is excellent, because the number of outputs was different; for
>>> example, one node produced 3 md.log and 4 confout.gro files!
>>>
>>> The main question is: which one of the resulting outputs is the main
>>> result? On which one must I do my analysis?
>>>
>>
>>  They should all be equivalent, but not necessarily binary identical.
>>
> Actually I checked all of them; they are different, with different
> averages of quantities, and all of them end with the text that means the
> program finished successfully!
>
>
> The averages being the same or different merely reflects whether your run
> was reproducible. Assuming there were no output files in your working
> directory to start with, you've run the same irreproducible run four
> times, because you're using a non-MPI mdrun.
>
> Mark
>
> --
> gmx-users mailing list    gmx-users at gromacs.org
> http://lists.gromacs.org/mailman/listinfo/gmx-users
> Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/Search before posting!
> Please don't post (un)subscribe requests to the list. Use the
> www interface or send it to gmx-users-request at gromacs.org.
> Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>