Re: Re: [gmx-users] “Fatal error in PMPI_Bcast: Other MPI error, …..” occurs when using the ‘particle decomposition’ option.

xhomes at sohu.com xhomes at sohu.com
Wed Jun 2 06:14:19 CEST 2010


Hi, Mark,


I’ve noticed the restriction on the minimum cell diameter, but I still had no
idea how to adjust the related parameters after reading the manual section
mentioned by the error message. I don’t understand the algorithm very well,
so I fell back on the ‘-pd’ option :)
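
For reference, this is roughly what I understand the error message and the
manual to be suggesting; the values and file names below are only placeholders
from my tests, so please correct me if I’ve misread the options:

    # Use fewer processes, so that each DD cell stays above the minimum diameter
    mpirun -np 2 mdrun -deffnm md

    # Override the distance mdrun estimates for communicating P-LINCS constraints
    # (-rcon, in nm; 0 means "estimate"). A smaller value lowers the minimum cell
    # size, at the risk of constraint errors if set too small.
    mpirun -np 6 mdrun -deffnm md -rcon 1.5

    # (the error message also mentions -dds, the minimum allowed dynamic load
    #  balancing scaling of the DD cell size, default 0.8, which I have not tried)

    # Avoid domain decomposition entirely (what I was attempting), at the cost
    # of poorer scaling
    mpirun -np 6 mdrun -pd -deffnm md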


About choosing double precision: I noticed that normal mode analysis needs the
double-precision version of some programs, and I don’t know whether I should
also use the double-precision mdrun for covariance analysis, so I just chose
the double one!
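
For what it’s worth, this is how I currently plan to run the analysis; I’m
assuming the double-precision binaries carry the usual _d suffix, and all file
names are just placeholders:

    # Normal mode analysis: this is where double precision really matters
    mdrun_d -s nm.tpr -mtx nm.mtx     # .mdp uses integrator = nm; writes the Hessian
    g_nmeig_d -f nm.mtx -s nm.tpr     # diagonalise the Hessian

    # Covariance analysis (PCA): g_covar reads a single-precision .xtc just fine,
    # so the trajectory itself can come from an ordinary single-precision mdrun
    g_covar -f md.xtc -s md.tpr -o eigenval.xvg -v eigenvec.trr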

I also don’t have much idea about which ensemble to use for the covariance
analysis. I’ve noticed that temperature coupling would ‘correct’ the motion of
the atoms, and I think a more ‘natural’ trajectory with the fewest artifacts
should be generated for covariance analysis. Any comments on this? (My current
coupling settings are pasted at the end of this mail.)

----- Original Message -----
From: xhomes at sohu.com
Date: Tuesday, June 1, 2010 21:59
Subject: Re: [gmx-users] “Fatal error in PMPI_Bcast: Other MPI error, …..” occurs when using the ‘particle decomposition’ option.
To: Discussion list for GROMACS users <gmx-users at gromacs.org>

> Hi, Mark,
> Thanks for the reply!
> It seemed that I got something messed up. At the beginning, I used
> ‘constraints = all-bonds’ and ‘domain decomposition’. When the simulation
> scaled to more than 2 processes, an error like the one below occurred:

The "domain_decomposition" .mdp flag is an artefact of pre-GROMACS-4
development of DD. It does nothing. Forget about it. DD is enabled by
default unless you use mdrun -pd.

> ####################
> Fatal error: There is no domain decomposition for 6 nodes that is
> compatible with the given box and a minimum cell size of 2.06375 nm
> Change the number of nodes or mdrun option -rcon or -dds or your LINCS
> settings
> Look in the log file for details on the domain decomposition
> ####################

With DD and all-bonds, the coupled constraints create a minimum cell
diameter that must be satisfied on all processors. Your system is too
small for this to be true. The manual sections on DD mention this, though
perhaps you wouldn't pick that up on a first reading.

> I referred to the manual and found no answer. Then I turned to ‘particle
> decomposition’ and tried all kinds of methods, including changing mpich
> to lammpi, changing GROMACS from V4.0.5 to V4.0.7, and adjusting the mdp
> file (e.g. ‘constraints = hbonds’ or no PME), and none of these took
> effect!

I would have tried ‘constraints = hbonds’ with ‘domain decomposition’, at
least with lammpi. PD might fail for a similar reason, I suppose.

> However, when I tried ‘constraints = hbonds’ and ‘domain decomposition’
> under mpich today, it scaled to more than 2 processes well! And now it
> also scales well under lammpi using ‘constraints = hbonds’ and ‘domain
> decomposition’!

Yep. Your constraints are not so tightly coupled now.

> So, it seems the key is ‘constraints = hbonds’ for ‘domain decomposition’.

Knowing how your tools work is key :-) The problem with complex tools like
GROMACS is knowing what's worth knowing :-)

> Of course, the simulation still crashed when using ‘particle
> decomposition’ with ‘constraints = hbonds or all-bonds’, and I don’t know
> why.

Again, your system is probably too small to be bothered with parallelising
with constraints.

> I use the double-precision version and the NPT ensemble to perform a PCA!

I doubt that you need to collect data in double precision. Any supposed
extra accuracy of integration is probably getting swamped by noise from
temperature coupling. I suppose you may wish to run the analysis tool in
double, but it'll read a single-precision trajectory just fine. Using
single precision will make things more than a factor of two faster.

Mark
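
Coming back to my question about the ensemble: in case it helps the
discussion, the coupling settings I’m currently using look roughly like this
(group names and values are just from my test system, not a recommendation):

    ; temperature and pressure coupling for the trajectory I feed to g_covar
    integrator   = md
    dt           = 0.002
    tcoupl       = nose-hoover          ; or berendsen / v-rescale
    tc-grps      = Protein Non-Protein
    tau_t        = 0.5   0.5
    ref_t        = 300   300
    pcoupl       = parrinello-rahman
    tau_p        = 1.0
    ref_p        = 1.0
    constraints  = h-bonds              ; the setting that let DD scale for me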

