[gmx-developers] Configure warning for MPI version, was: [gmx-users] New beta release: 4.5-beta3
Rossen Apostolov
rossen.apostolov at cbr.su.se
Thu Aug 12 11:16:07 CEST 2010
Hi,
I added warnings about these versions to both the CMake and autoconf
builds, although no actual version checks are performed yet. We should
probably fix that for the final release.
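
For reference, below is a rough, untested sketch of what an actual check
could look like on the CMake side. It only relies on version macros from
the MPI headers (OPEN_MPI and OMPI_*_VERSION for Open MPI, MVAPICH2_VERSION
for MVAPICH2) and on FindMPI having set MPI_INCLUDE_PATH; the GMX_* result
variables are just placeholders, and the autoconf side would need an
equivalent test.

# Rough sketch (untested): configure-time MPI version checks in CMake.
# Assumes FindMPI has already set MPI_INCLUDE_PATH, and that mpi.h defines
# OPEN_MPI plus OMPI_{MAJOR,MINOR,RELEASE}_VERSION (Open MPI) and
# MVAPICH2_VERSION (MVAPICH2); the GMX_* result variables are placeholders.
include(CheckCSourceCompiles)
set(CMAKE_REQUIRED_INCLUDES ${MPI_INCLUDE_PATH})

# Compiles only when the header belongs to Open MPI 1.4.0 exactly,
# the release reported to hang with GCC 4.4.x.
check_c_source_compiles("
#include <mpi.h>
#if !(defined(OPEN_MPI) && OMPI_MAJOR_VERSION == 1 && OMPI_MINOR_VERSION == 4 && OMPI_RELEASE_VERSION == 0)
#error not the affected Open MPI release
#endif
int main(void) { return 0; }
" GMX_HAVE_OPENMPI_1_4_0)
if(GMX_HAVE_OPENMPI_1_4_0)
    message(FATAL_ERROR "Open MPI 1.4.0 is reported to hang with GCC 4.4.x; please use 1.4.1 or later.")
endif()

# Compiles only for Open MPI releases older than the 1.4 series (reported slow).
check_c_source_compiles("
#include <mpi.h>
#if !(defined(OPEN_MPI) && (OMPI_MAJOR_VERSION * 100 + OMPI_MINOR_VERSION) < 104)
#error not an old Open MPI release
#endif
int main(void) { return 0; }
" GMX_HAVE_OLD_OPENMPI)
if(GMX_HAVE_OLD_OPENMPI)
    message(WARNING "Open MPI releases before 1.4.x are reported to be slow; consider upgrading.")
endif()

# MVAPICH2 only exposes a version string, so run a tiny program that prints
# it and compare in CMake (simplified: assumes a single MPI include directory
# and that configure-time test programs can be run on this machine).
file(WRITE ${CMAKE_BINARY_DIR}/print_mvapich2_version.c "
#include <stdio.h>
#include <mpi.h>
int main(void) {
#ifdef MVAPICH2_VERSION
    printf(\"%s\", MVAPICH2_VERSION);
#endif
    return 0;
}
")
try_run(MVAPICH2_RUN_RESULT MVAPICH2_COMPILE_RESULT
        ${CMAKE_BINARY_DIR} ${CMAKE_BINARY_DIR}/print_mvapich2_version.c
        CMAKE_FLAGS "-DINCLUDE_DIRECTORIES=${MPI_INCLUDE_PATH}"
        RUN_OUTPUT_VARIABLE MVAPICH2_DETECTED_VERSION)
if(MVAPICH2_COMPILE_RESULT AND MVAPICH2_DETECTED_VERSION AND
   MVAPICH2_DETECTED_VERSION VERSION_LESS "1.4.1")
    message(FATAL_ERROR "MVAPICH2 ${MVAPICH2_DETECTED_VERSION} is known to crash; please use 1.4.1 or later.")
endif()
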
Rossen
On 8/11/10 11:58 PM, Roland Schulz wrote:
> Hi,
>
> Should we add configure errors/warnings for MPI versions?
>
> Problems known to me:
> MVAPICH2 <1.4.1 crashes
> OpenMPI <1.4.x is slow
> OpenMPI 1.4.0 hangs with GCC 4.4.x on x86/x64
>
> Thus I would suggest we give an error for MVAPICH2 <1.4.1 and OpenMPI
> 1.4.0, and a warning for OpenMPI <1.4.x.
>
> Roland
>
> ---------- Forwarded message ----------
> From: *Da-Wei Li* <lidawei at gmail.com>
> Date: Wed, Aug 11, 2010 at 4:34 PM
> Subject: Re: [gmx-users] New beta release: 4.5-beta3
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>
>
> According to our HPC manager, our MVAPICH2 version is 1.0.3.
>
> Thanks for the information.
>
> dawei
>
> On Wed, Aug 11, 2010 at 2:52 PM, Roland Schulz <roland at utk.edu> wrote:
> > MVAPICH2 >=1.4.1 should be fine. Did you use a version <1.4.1?
> > On Wed, Aug 11, 2010 at 2:45 PM, Da-Wei Li <lidawei at gmail.com> wrote:
> >>
> >> Yes, it is. Now everything is fine with OpenMPI instead of MVAPICH2.
> >>
> >> dawei
> >>
> >> On Wed, Aug 11, 2010 at 12:59 AM, Mark Abraham
> >> <mark.abraham at anu.edu.au> wrote:
> >> >
> >> >
> >> > ----- Original Message -----
> >> > From: Da-Wei Li <lidawei at gmail.com>
> >> > Date: Wednesday, August 11, 2010 6:23
> >> > Subject: Re: [gmx-users] New beta release: 4.5-beta3
> >> > To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> >> >
> >> >> Hi, all
> >> >>
> >> >> Unfortunately, it still crashes. Here is the output of mdrun. Both
> >> >> GROMACS 4.0.7 and non-parallel 4.5-beta3 work fine on this system. I
> >> >> use MVAPICH2 and the Intel compiler.
> >> >
> >> > MPICH variants are known to cause problems. Try OpenMPI.
> >> >
> >> > Mark
> >> >
> >> >> ***************************
> >> >> Getting Loaded...
> >> >> Reading file em.tpr, VERSION 4.5-beta3 (single precision)
> >> >> Loaded with Money
> >> >>
> >> >>
> >> >> Will use 9 particle-particle and 7 PME only nodes
> >> >> This is a guess, check the performance at the end of the log file
> >> >> Making 1D domain decomposition 9 x 1 x 1
> >> >>
> >> >> Back Off! I just backed up em.trr to ./#em.trr.4#
> >> >>
> >> >> Back Off! I just backed up em.edr to ./#em.edr.4#
> >> >>
> >> >> Steepest Descents:
> >> >> Tolerance (Fmax) = 1.00000e+03
> >> >> Number of steps = 50000
> >> >> rank 9 in job 1 hpc-8-6.local_58777 caused collective abort of all ranks
> >> >> exit status of rank 9: killed by signal 9
> >> >>
> >> >> *******************************
> >> >>
> >> >> dawei
> >> >>