[gmx-users] restarting a sorted shuffled trajectory
chris.neale at utoronto.ca
Tue Feb 27 18:31:49 CET 2007
> My problem was that I used the wrong .top and .gro files (the sorted
> and shuffled ones) with the sorted, shuffled .trr. The discrepancy
> came from the pre-processed .tpr (since one needs to provide the
> .top and .gro that were linked into the processed .tpr).
> Problem solved, run done (still needs a lot of work, though ;-).
This sounds strange to me. If you are using tpbconv then everything
should be straightforward (and there should be no need for a .gro file
for restarts). If you are using grompp to do the restarts then you
cannot use your originally shuffled and sorted .trr (by "originally
shuffled and sorted" I mean the one from the very first run). I would
strongly recommend that you run g_msd and select the waters: if you
see discontinuities in the MSD curve at the restart positions then
something is wrong. You should also load the trajectory (at least a
part spanning a restart) into VMD, select a few water molecules to
watch, and ensure that they don't "hop" at the restart.
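
For the record, here is roughly what I mean (a sketch assuming
GROMACS 3.x tool names; the file names and the 500 ps extension are
just examples):

  # extend the run using only the .tpr, .trr and .edr (no .gro needed)
  tpbconv -s run1.tpr -f run1.trr -e run1.edr -extend 500 -o run2.tpr
  mdrun -s run2.tpr -deffnm run2

  # concatenate the parts and check water diffusion across the restart
  trjcat -f run1.trr run2.trr -o whole.trr
  g_msd -f whole.trr -s run1.tpr -o msd.xvg   (select the SOL group)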
If you just want to use shuffle without sort then you don't need the
complicated procedure that I outlined. However, if you are in explicit
solvent and want to use sort as well, then I imagine that the benefit
of sorting is entirely lost within a couple of ns, since by then the
waters are no longer in their "sorted" section of coordinate space.
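
For anyone following along, the initial shuffled and sorted
preprocessing happens at the grompp stage, something like this (again
a GROMACS 3.x sketch; the 4-node count and file names are examples):

  # preprocess for 4 nodes, shuffling molecules over the nodes and
  # sorting them by coordinate; keep the deshuffle index for later
  grompp -f md.mdp -c conf.gro -p topol.top -np 4 \
         -shuffle -sort -o run1.tpr -deshuf deshuf.ndx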
> However, your remark (see below) about the benefit of the initial
> sort diminishing over time is very interesting, and I'll probably
> give it a try to improve scaling (on a single machine with 4
> processors, the relative performance is reported to be 78%; I would
> expect something close to 100%).
On a single machine with 4 processors (where communication is not
rate-limiting), using shuffle and sort you should get 100% scaling
even with PME. In fact I get 100% with LAM/MPI and something like 110%
with Open MPI when resorting every 200 ps. How it is possible to get
>100% is unclear to me.
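
In case it helps, the resort-at-restart step amounts to deshuffling
back to the original atom order and then preprocessing afresh. A rough
sketch, assuming GROMACS 3.x and the deshuf.ndx written above (file
names and the ~200 ps interval are examples; the .mdp needs the usual
continuation settings, e.g. gen_vel = no):

  # map the final coordinates/velocities back to the original order
  trjconv -f run1.trr -s run1.tpr -n deshuf.ndx -o run1_deshuf.trr

  # shuffle and sort again, continuing from the deshuffled frame
  grompp -f md.mdp -c conf.gro -t run1_deshuf.trr -p topol.top \
         -np 4 -shuffle -sort -o run2.tpr -deshuf deshuf.ndx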