[gmx-users] about parallel work
Carsten Kutzner
ckutzne at gwdg.de
Tue Apr 21 19:10:51 CEST 2009
Hi,
On Apr 21, 2009, at 5:53 PM, sheerychen wrote:
> yes, both versions are compiled as MPI versions. However, the MPI
> startup messages are different. With MPICH, it reports a 1D domain
> decomposition like 3*1*1, and only one set of files is produced.
> With MPICH2, no such information appears and it produces many
> files (8 nodes):
> complex_em.edr      complex_em.log      #complex_em.trr.7#   #complex_pr.log.5#
> #complex_em.edr.1#  #complex_em.log.1#  complex_pr.cpt       #complex_pr.log.6#
> #complex_em.edr.2#  #complex_em.log.2#  #complex_pr.cpt.1#   #complex_pr.log.7#
> #complex_em.edr.3#  #complex_em.log.3#  complex_pr.edr       complex_pr_prev.cpt
> #complex_em.edr.4#  #complex_em.log.4#  #complex_pr.edr.1#   complex_pr.tpr
> #complex_em.edr.5#  #complex_em.log.5#  #complex_pr.edr.2#   complex_pr.trr
> #complex_em.edr.6#  #complex_em.log.6#  #complex_pr.edr.3#   #complex_pr.trr.1#
> #complex_em.edr.7#  #complex_em.log.7#  #complex_pr.edr.4#
That means that in this case eight serial instances are running
instead of one parallel run (each instance writes its own output,
and GROMACS renames the files it would overwrite to backups of the
form #name.N#).
Either GROMACS was not compiled properly against the appropriate MPI
libraries, or something went wrong during startup. You should check
with mpdtrace whether the MPICH2 ring of mpd daemons is running.
If it is not, look up mpdboot.
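For reference, the checks suggested above might look like this on an
MPICH2 system (the mdrun_mpi binary name and the host count of 8 are
assumptions; adjust them to your installation):

```shell
# Check whether the MPICH2 ring of mpd daemons is up;
# mpdtrace lists the hosts in the ring, one per line.
mpdtrace

# If no ring is running, start one. mpdboot reads the host list
# from an mpd.hosts file in the current directory by default.
mpdboot -n 8

# Then launch a single parallel mdrun through MPICH2's mpiexec,
# rather than starting eight independent serial instances.
mpiexec -n 8 mdrun_mpi -deffnm complex_pr
```

A correctly launched parallel run prints one domain decomposition
message and writes one set of output files, instead of the per-process
backup files (#name.N#) shown above.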
Carsten