[gmx-users] mdrun with mpich
Pradeep Kota
kotanmd at gmail.com
Sun Feb 5 09:46:04 CET 2006
Thanks, Dr. David. I figured it out sometime after I mailed the group.
Thanks again.
regards,
kota.
>
> On 2/5/06, David van der Spoel <spoel at xray.bmc.uu.se> wrote:
> >
> > Pradeep Kota wrote:
> > > Thanks, Dr. David. I could compile gromacs successfully. But when I run
> > > an md simulation on four processors, it returns the following error.
> > >
> > >
> > > [0] MPI Abort by user Aborting program !
> > > [0] Aborting program!
> > > p4_error: latest msg from perror: No such file or directory
> > > p0_2057: p4_error: : -1
> > > -------------------------------------------------------
> > > Program mdrun_mpi, VERSION 3.3
> > > Source code file: futil.c, line: 308
> > >
> > > File input/output error:
> > > md.log
> > > -------------------------------------------------------
> > >
> > > Thanx for Using GROMACS - Have a Nice Day
> > >
> > > Halting program mdrun_mpi
> > >
> > > gcq#1768121632: Thanx for Using GROMACS - Have a Nice Day
> > >
> > > [0] MPI Abort by user Aborting program !
> > > [0] Aborting program!
> > > p4_error: latest msg from perror: No such file or directory
> > >
> > -----------------------------------------------------------------------------
> > > It seems that [at least] one of the processes that was started with
> > > mpirun did not invoke MPI_INIT before quitting (it is possible that
> > > more than one process did not invoke MPI_INIT -- mpirun was only
> > > notified of the first one, which was on node n8952)
> > > p0_058: p4_error: : -1
> > > mpirun can *only* be used with MPI programs (i.e., programs that
> > > invoke MPI_INIT and MPI_FINALIZE). You can use the "lamexec" program
> > > to run non-MPI programs over the lambooted nodes.
> > >
> > -----------------------------------------------------------------------------
> >
> > >
> > >
> > > I had lambooted all the nodes properly and there were no problems at
> > > that stage. As the message suggested, I tried using lamexec too;
> > > still no luck. I tried other switches with mpirun as well, but
> > > couldn't quite figure out what the error could be.
> >
> > you are still running an MPICH executable here. lamboot is used for LAM
> > only. Are you inadvertently mixing LAM and MPICH?
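One way to check for exactly this mix (a sketch; the `./mdrun_mpi` path is an assumption, and `otool` is used here because the cluster runs Mac OS X -- exact output varies by MPI version):

```shell
# List the dynamic libraries mdrun_mpi was linked against: an MPICH
# build references MPICH libraries, a LAM build references LAM ones.
# (For a statically linked binary, inspect symbols with `nm` instead.)
otool -L ./mdrun_mpi | grep -i -e mpich -e lam

# Check which mpirun is first in PATH. Launching an MPICH-linked mdrun
# with LAM's mpirun (or vice versa) produces this kind of abort, since
# the process never reaches a matching MPI_INIT.
which mpirun
```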
> >
> >
> >
> > >
> > > thanks in anticipation,
> > > kota.
> > >
> > > On 2/2/06, *David van der Spoel* <spoel at xray.bmc.uu.se> wrote:
> > >
> > > Pradeep Kota wrote:
> > > > Thanks for the info, Mr. David, but when I try to compile gromacs,
> > > > it returns an error. The following is an excerpt from the output of
> > > > 'make'.
> > > >
> > > seems like you have two different versions of fftw3 installed, or
> > > mixed single and double precision. Otherwise I don't know.
> > >
> > >
> > >
> > > > /usr/local/lib/libfftw3f.a(the-planner.o) definition of
> > > > _fftwf_the_planner in section (__TEXT,__text)
> > > > /usr/local/lib/libfftw3f.a(version.o) definition of _fftwf_cc in
> > > > section (__TEXT,__cstring)
> > > > /usr/local/lib/libfftw3f.a(version.o) definition of
> > > > _fftwf_codelet_optim in section (__TEXT,__cstring)
> > > > /usr/local/lib/libfftw3f.a(version.o) definition of _fftwf_version
> > > > in section (__TEXT,__cstring)
> > > > /usr/bin/libtool: internal link edit command failed
> > > > make[4]: *** [libgmx.la] Error 1
> > > > make[3]: *** [all-recursive] Error 1
> > > > make[2]: *** [all-recursive] Error 1
> > > > make[1]: *** [all] Error 2
> > > > make: *** [all-recursive] Error 1
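A quick way to look for the duplicate or mixed-precision FFTW install David suspects (a sketch; the /usr/local paths are assumptions taken from the linker output above):

```shell
# List every fftw3 library on the default search path; stray duplicate
# copies, or both an old and a new version, point to a mixed install.
ls -l /usr/local/lib/libfftw3*

# The single-precision library (libfftw3f) should define only _fftwf_*
# symbols; a double-precision build defines _fftw_* instead. Seeing
# both prefixes across the libraries you link confirms mixed precision.
nm /usr/local/lib/libfftw3f.a | grep ' T ' | head
```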
> > > >
> > > > I think Mr. Jack Howarth had already pointed the same thing out
> > > > some time back. Any suggestions?
> > > >
> > > > regards,
> > > > kota.
> > > >
> > > > On 2/2/06, *Pradeep Kota* <kotanmd at gmail.com> wrote:
> > > >
> > > > Thank you for your response, Mr. David. I compiled lam successfully
> > > > without fortran support. All I wanted to know was whether it would
> > > > make any difference to the running time of gromacs. I am curious
> > > > because it is well-known that fortran loops are faster than loops
> > > > in other languages, so I wanted to clarify. Moreover, I would like
> > > > to know how different this is from mpich.
> > > > thanks for the support.
> > > > regards,
> > > > kota.
> > > >
> > > >
> > > > On 2/2/06, *David van der Spoel* <spoel at xray.bmc.uu.se> wrote:
> > > >
> > > > Pradeep Kota wrote:
> > > >> Thank you for your response, Itamar, but the cluster is isolated
> > > >> from the internet for security reasons. I don't think there is any
> > > >> chance to use fink on the head node either. Any other alternatives?
> > > >> regards,
> > > >
> > > > compile LAM without fortran, there's a flag for it
> > > >
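For the archive, a minimal sketch of the flag David means, building LAM/MPI without its Fortran bindings (the install prefix and hostfile name are assumptions; LAM 7.x's configure accepts `--without-fc` to skip the Fortran compiler check):

```shell
# Configure LAM/MPI without Fortran support, then build and install.
# --without-fc tells configure not to look for a Fortran compiler,
# which is only needed for the Fortran MPI bindings -- GROMACS is C.
./configure --prefix=/usr/local/lam --without-fc
make
make install

# Put this LAM's tools first in PATH before lambooting the nodes.
export PATH=/usr/local/lam/bin:$PATH
lamboot hostfile
```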
> > > >> kota.
> > > >>
> > > >> On 2/2/06, *Itamar Kass* <ikass at cc.huji.ac.il> wrote:
> > > >>
> > > >> Why not use fortran?
> > > >> Install it using Fink and let lam have it.
> > > >>
> > > >> Itamar.
> > > >>
> > > >> ===========================================
> > > >> | Itamar Kass
> > > >> | The Alexander Silberman
> > > >> | Institute of Life Sciences
> > > >> | Department of Biological Chemistry
> > > >> | The Hebrew University, Givat-Ram
> > > >> | Jerusalem, 91904, Israel
> > > >> | Tel: +972-(0)2-6585194
> > > >> | Fax: +972-(0)2-6584329
> > > >> | Email: ikass at cc.huji.ac.il
> > > >> | Net: http://www.ls.huji.ac.il/~membranelab/itamar/itamar_homepage.html
> > > >> ============================================
> > > >>
> > > >> ----- Original Message -----
> > > >> *From:* Pradeep Kota <kotanmd at gmail.com>
> > > >> *To:* Discussion list for GROMACS users <gmx-users at gromacs.org>
> > > >> *Sent:* Thursday, February 02, 2006 4:43 AM
> > > >> *Subject:* [gmx-users] mdrun with mpich
> > > >>
> > > >> Dear users,
> > > >> I have compiled gromacs on our dual-core Mac OS X G5 cluster and
> > > >> tried running a 1 ns simulation of a 540-residue protein on 8
> > > >> processors. I used mpich as the MPI environment for parallelising
> > > >> gromacs. It worked fine, and the job was split properly and
> > > >> assigned to nodes. However, the CPU usage is not more than 50% on
> > > >> any of the processors, and the total running time was 13 hrs.
> > > >> Though output is written properly to the specified output file,
> > > >> mdrun does not terminate even after running through all the
> > > >> steps; it still shows two mdrun_mpi processes running on the head
> > > >> node with 0% CPU usage. Going through the gmx-users mailing list,
> > > >> I gathered that mpich is not a good idea for running gromacs, so
> > > >> I wanted to switch over to lam. Now when I compile gromacs using
> > > >> lam, it is not able to link the libraries properly, so I tried
> > > >> reinstalling lam on my cluster. Now lam keeps complaining about
> > > >> not being able to find a fortran compiler. I should not need a
> > > >> fortran compiler unless I'm using SUN or SGI for this purpose (am
> > > >> I going wrong here?). What flags do I need to compile lam with in
> > > >> order to compile gromacs successfully? Any help is very much
> > > >> appreciated.
> > > >> thanks in advance,
> > > >> regards,
> > > >> kota.
> > > >>
> > > >>
> > > >
> > >
> > ------------------------------------------------------------------------
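For readers hitting the same linking problem: a sketch of rebuilding the parallel mdrun in GROMACS 3.3 against LAM once LAM's tools are first in PATH (the paths and flag values are assumptions; `--enable-mpi` and the `mdrun`/`install-mdrun` targets are the GROMACS 3.x conventions for a parallel build):

```shell
# Point configure at LAM's compiler wrapper and enable MPI support.
export PATH=/usr/local/lam/bin:$PATH
export CC=mpicc

# Single precision (the GROMACS default) links against the
# single-precision FFTW library, libfftw3f; tell configure where to
# find it if it lives outside the default search paths.
export CPPFLAGS=-I/usr/local/include
export LDFLAGS=-L/usr/local/lib

./configure --enable-mpi --program-suffix=_mpi
make mdrun
make install-mdrun
```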
> >
>
>
More information about the gromacs.org_gmx-users mailing list