[gmx-users] Seeking advice on how to build Gromacs on Teragrid resources

J. Nathan Scott scottjn at chemistry.montana.edu
Fri Dec 10 18:56:46 CET 2010


On Thu, Dec 9, 2010 at 3:38 PM, Mark Abraham <Mark.Abraham at anu.edu.au> wrote:
> On 10/12/2010 9:14 AM, J. Nathan Scott wrote:
>>
>> Hello gmx users! I realize this may be a touch off topic, but I am
>> hoping that someone out there can offer some advice on how to build
>> Gromacs for parallel use on a Teragrid site. Our group is currently
>> using Abe on Teragrid, and unfortunately the latest version of Gromacs
>> compiled for public use on Abe is 4.0.2. Apparently installation of
>> 4.5.3 is at least on the to-do list for Abe, but we would very much
>> like to use 4.5.3 now if we can get this issue figured out.
>>
>> I have built a parallel version of mdrun using the Abe-installed versions
>> of fftw3 and mvapich2, with the following commands:
>
> Use of MPICH is certainly discouraged, as GROMACS seems to expose some bugs
> in it. I'm not sure about MVAPICH. In any case, make sure you are using the
> latest version, and compare with OpenMPI if you can.

Mark, thank you very much for the tip. It turns out the problem was my
use of the MVAPICH2 1.2 libraries/includes for compilation. Switching
to Open MPI 1.3.2 appears to have solved everything, and mdrun is now
running quite well across multiple nodes on Abe. Thank you again for
the suggestion!
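
In case it helps anyone else, below is a rough sketch of the same
configure/make steps quoted further down, pointed at Open MPI rather
than MVAPICH2. The Open MPI install path is only a placeholder (use
whatever your site provides for Open MPI 1.3.2); setting CC to the
mpicc wrapper is one way to make sure configure picks up the right MPI.

# csh, as in the build quoted below; the Open MPI prefix is a placeholder
setenv CPPFLAGS "-I/usr/apps/math/fftw/fftw-3.1.2/gcc/include"
setenv LDFLAGS "-L/usr/apps/math/fftw/fftw-3.1.2/gcc/lib"
# point CC at the Open MPI wrapper compiler (placeholder path)
setenv CC /path/to/openmpi-1.3.2/bin/mpicc
./configure --enable-mpi --enable-float --prefix=/u/ac/jnscott/gromacs \
    --program-suffix=_mpi
make -j 8 mdrun && make install-mdrun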

Best Wishes,
-Nathan

>> setenv CPPFLAGS "-I/usr/apps/math/fftw/fftw-3.1.2/gcc/include/
>> -I/usr/apps/mpi/marmot_mvapich2_intel/include"
>> setenv LDFLAGS "-L/usr/apps/math/fftw/fftw-3.1.2/gcc/lib
>> -L/usr/apps/mpi/marmot_mvapich2_intel/lib"
>> ./configure --enable-mpi --enable-float --prefix=/u/ac/jnscott/gromacs
>> --program-suffix=_mpi
>> make -j 8 mdrun && make install-mdrun
>>
>> My PBS script file looks like the following:
>>
>> -------------------------------
>> #!/bin/csh
>> #PBS -l nodes=2:ppn=8
>
> Simplify the conditions when trying to diagnose a problem - try to run on
> one 8-processor node, or even 1 processor. Your crash is consistent with
> some MPI problem, because (off the top of my head) it seems to happen when
> GROMACS starts to do communication to pass around the input data.
>
> Mark
>
<snip>
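
And for completeness, here is a minimal sketch of a PBS script along the
lines of the one quoted above, adjusted for the Open MPI build. The
walltime, job name, and "topol" run name are placeholders, and mpirun
should be the one belonging to the Open MPI that mdrun_mpi was built
with. As Mark suggests, dropping back to nodes=1 (or even ppn=1) is a
good first test if anything misbehaves.

-------------------------------
#!/bin/csh
#PBS -l nodes=2:ppn=8
#PBS -l walltime=12:00:00
#PBS -N mdrun_test

cd $PBS_O_WORKDIR

# use the mpirun that matches the Open MPI used for the build
mpirun -np 16 /u/ac/jnscott/gromacs/bin/mdrun_mpi -deffnm topol
-------------------------------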



-- 
----------
J. Nathan Scott, Ph.D.
Postdoctoral Fellow
Department of Chemistry and Biochemistry
Montana State University


