Fwd: [its-cluster-admin] Fwd: [gmx-users] gmx_blast error when attempting to run in parallel
Sashank Karri
sashank.karri at case.edu
Thu Jun 11 20:54:19 CEST 2009
Hi,
Our cluster admin argues that GROMACS was in fact compiled with MPI.
Below is the script he used to compile it. Do you see any errors in
the script? Are there any other possible reasons why I'm getting this
gmx_bcast error?
-Sashank
---------- Forwarded message ----------
From: Tula Paudel <trp6 at case.edu>
Date: Thu, Jun 11, 2009 at 10:39 AM
Subject: Re: [its-cluster-admin] Fwd: [gmx-users] gmx_blast error when
attempting to run in parallel
To: Sashank Karri <srk18 at case.edu>
Dear Sashank,
I don't agree that it's not compiled with MPI. Here is the script I used
to build GROMACS. As you can see, I used the --enable-mpi option and
compiled with mpicc and mpif77.
#!/bin/sh
# Build GROMACS 4.0.3 with MPI, using the MPICH compiler wrappers.
export version=4.0.3par
export CC=mpicc
export CXX=mpicc
export CXXCPP=mpicc
export CFLAGS="-O2"
export F77=mpif77
export FFLAGS="-O2"
# Point the preprocessor and linker at the FFTW 3 installation.
export CPPFLAGS="-I/usr/local/fftw-3.1.2/include"
export LDFLAGS="-L/usr/local/fftw-3.1.2/lib -L/usr/lib/gcc-lib/x86_64-redhat-linux/3.3.4 -L/usr/lib64"
export LIBS="-lmpich -lpthread -lrt -lfftw -lm"
cd /usr/local/src/gromacs/gromacs-${version}
make clean
make distclean
./configure --prefix=/usr/local/gromacs/gromacs-${version} --with-fft=fftw3 --enable-mpi
make > make.out 2>&1
make install
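As a quick check that the installed binary really has MPI compiled in, something
along these lines should work (the path below just follows from the --prefix in
the script, so adjust it if the module points elsewhere; since this was a static
build, ldd will likely report "not a dynamic executable", in which case the
strings check is only a rough fallback):
MDRUN=/usr/local/gromacs/gromacs-4.0.3par/bin/mdrun
# Dynamically linked build: an MPI library should show up here.
ldd "$MDRUN" | grep -i mpi
# Static build: look for MPI symbols embedded in the binary.
strings "$MDRUN" | grep MPI_Init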
The name of the executable should not really matter. Anyway, I will try to build
a new version later today or tomorrow. You can give it a try yourself. If you
succeed or I make progress, let me know. Also, could you please send me some
of your simple test files?
Thanks.
Tula
On Thu, Jun 11, 2009 at 10:29 AM, Sashank Karri <srk18 at case.edu> wrote:
> Hi Tula,
>
> I received two e-mails informing me that gromacs was not compiled with
> MPI. Here is the second, more informative e-mail. Could you possibly try
> compiling gromacs-4.0.5 this time? I'm free on Fridays now, if you want
> someone with a bit of GROMACS experience around while you're working on it.
>
> Thanks,
>
> Sashank
>
> ---------- Forwarded message ----------
> From: Justin A. Lemkul <jalemkul at vt.edu>
> Date: Thu, Jun 11, 2009 at 8:24 AM
> Subject: Re: [gmx-users] gmx_blast error when attempting to run in parallel
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>
>
>
>
> Sashank Karri wrote:
>
>> Hi, I'm testing GROMACS 4.0.3 on a cluster. We are currently getting this
>> error when I run on 4 nodes, with one dedicated to PME calculations.
>> Back Off! I just backed up md.log to ./#md.log.5#
>> Reading file ionsol_minim96-1.tpr, VERSION 4.0.3 (single precision)
>>
>> -------------------------------------------------------
>> Program mdrun, VERSION 4.0.3
>> Source code file: network.c, line: 357
>>
>> Routine should not have been called:
>> gmx_bcast
>> -------------------------------------------------------
>>
>> "I'll Match Your DNA" (Red Hot Chili Peppers).
>>
>> Here is the job script I'm submitting to the cluster.
>>
>> #PBS -N ionsol_karri
>> #PBS -l walltime=24:00:00
>> #PBS -l nodes=4:ppn=4:quad
>> #PBS -j oe
>>
>> module load mpich
>> module load gromacs-4.0.3
>> cd /home/srk18/newplcre/
>> mpirun -nodes 4 /usr/local/gromacs/gromacs-4.0.3/bin/mdrun -npme 1 \
>>     -s ionsol_minim96-1.tpr -o finminim96_traj.trr -x finminim96_traj.xtc \
>>     -c final_minim.g96 -e enermin_fin.edr -cpo state96.cpt
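>>
>> (For comparison, a hypothetical variant of that mpirun line. It assumes an
>> MPI-enabled binary installed as /usr/local/gromacs/gromacs-4.0.3par/bin/mdrun_mpi
>> (the path and the _mpi suffix are assumptions) and one MPI process per
>> requested core, i.e. 16 for nodes=4:ppn=4; -np is the standard MPICH flag
>> for the process count.)
>>
>> mpirun -np 16 /usr/local/gromacs/gromacs-4.0.3par/bin/mdrun_mpi -npme 1 \
>>     -s ionsol_minim96-1.tpr -o finminim96_traj.trr -x finminim96_traj.xtc \
>>     -c final_minim.g96 -e enermin_fin.edr -cpo state96.cpt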
>>
>>
> Is your mdrun truly MPI-enabled? Typically the configure script advises
> you to append _mpi to the parallel version.
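>
> (For example, one way to end up with an mdrun_mpi binary from the autoconf
> build is something like the following; the prefix path is just an
> illustration, --program-suffix is a standard autoconf option rather than
> anything GROMACS-specific, and the mdrun-only make targets are what I
> remember from the 4.0.x build instructions:)
>
> ./configure --prefix=/usr/local/gromacs/gromacs-4.0.5 --with-fft=fftw3 \
>             --enable-mpi --program-suffix=_mpi
> make mdrun
> make install-mdrun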
>
> Also, upgrade to the newest version of Gromacs, 4.0.5, and try again. That
> way, if it is a bug, it can be fixed in the newest version rather than in
> one that is five months old.
>
> -Justin
>
>> When discussing the installation of GROMACS earlier with the cluster
>> admin, he sent the following message:
>>
>> >I remember seeing your error (not the present error) while I tried to
>> make the dynamic version of GROMACS, so I used --enable-static while
>> building the parallel version. That built the executable, and most of the
>> built-in tests were successful. If you think we need a dynamic build, let
>> me know; I might have to struggle a bit. Also, it would be good if you
>> share any information you have regarding building.
>>
>> Do you folks have any idea as to how to fix the above error?
>>
>> Thanks,
>> Sashank Karri
>>
>>
>
> --
> ========================================
>
> Justin A. Lemkul
> Ph.D. Candidate
> ICTAS Doctoral Scholar
> Department of Biochemistry
> Virginia Tech
> Blacksburg, VA
> jalemkul[at]vt.edu | (540) 231-9080
> http://www.bevanlab.biochem.vt.edu/Pages/Personal/justin
>
> ========================================
>
> _______________________________________________
> gmx-users mailing list gmx-users at gromacs.org
> http://lists.gromacs.org/mailman/listinfo/gmx-users
> Please search the archive at http://www.gromacs.org/search before posting!
> Please don't post (un)subscribe requests to the list. Use the www interface
> or send it to gmx-users-request at gromacs.org.
> Can't post? Read http://www.gromacs.org/mailing_lists/users.php
>
>
--
Tula R Paudel Ph.D
10900 Euclid Ave
Rockefeller 104A / Crawford 403
Cleveland OH-44106
@216 368 4035, 216 368 0395