[gmx-users] FW: Gromacs version 3.3

Wong, RYM (Richard) R.Y.M.Wong at rl.ac.uk
Thu Dec 1 11:23:56 CET 2005



-----Original Message-----
From: gmx-users-bounces at gromacs.org
[mailto:gmx-users-bounces at gromacs.org]On Behalf Of Florian Haberl
Sent: 01 December 2005 10:12
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] FW: Gromacs version 3.3


hi,

On Thursday 01 December 2005 10:19, Wong, RYM (Richard) wrote:
> -----Original Message-----
> From: gmx-users-bounces at gromacs.org
> [mailto:gmx-users-bounces at gromacs.org]On Behalf Of David van der Spoel
> Sent: 30 November 2005 20:01
> To: Discussion list for GROMACS users
> Subject: Re: [gmx-users] FW: Gromacs version 3.3
>
> Wong, RYM (Richard) wrote:
> >>Hi,
> >>
> >>I have installed Gromacs version 3.3 on a machine running Linux
> >> 2.4.21-37.ELsmp.
> >
> >The configure command I used to build the MPI version of
> > Gromacs-3.3:
>
> What kind of machine?
>
> The system contains 20 nodes, each with dual Intel Xeon processors.
> The MPI version is MPICH-1.2.5..10 built with the GNU compilers (i.e. mpicc
> invokes gcc-3.2.3). The nodes are connected via Myrinet, GM
> version 2.0.8. Do you need any other information?
>
> >>  ./configure --prefix=/usr/local/applications/bioinformatics/gromacs-3.3
> >> --exec-prefix=/usr/local/applications/bioinformatics/gromacs-3.3
> >> --enable-mpi --enable-shared --with-fft=fftw2 --program-suffix="_mpi"
> >> CPPFLAGS=-I/usr/local/applications/libraries/numerical/fft/include
> >> LDFLAGS=-L/usr/local/applications/libraries/numerical/fft/lib CC=mpicc
> >>
> >>Then executed "make"
> >>Then executed "make install"
> >>Then I used the command "grompp -np 2" to create the input files for
> >> the "water" example provided by the GROMACS development team.
> >>
> >>When I run the command "mdrun_mpi", I get a 'segmentation fault' error.
> >>
> >>  Warning: main: task 0 died with signal 11 (Segmentation fault)
> >>  Warning: main: task 1 died with signal 11 (Segmentation fault)
>
> Did you use mpirun?
> For security reasons, we use mpiexec version 0.75 instead of calling
> mpirun directly. We are using MPICH-1.2.5..10 and Myrinet GM-2.0.8.
> Do you need any other information?

Did you use a command line like "mpiexec -np 2 /pathtomdrun/mdrun_mpi -np 2"?
You have to invoke the MPI processes first with mpirun/mpiexec in order to
start mdrun_mpi.
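
For reference, a typical GROMACS 3.3 parallel invocation looks roughly like
the sketch below. In the 3.x series grompp pre-partitions the system, so the
-np given to grompp is generally expected to match the process count used
when mdrun_mpi is launched (file names and the mdrun path here are
illustrative placeholders, not taken from this thread):

    # preprocess for 4 MPI processes (input file names are placeholders)
    grompp -np 4 -f grompp.mdp -c conf.gro -p topol.top -o topol.tpr
    # start mdrun_mpi through the MPI launcher with the same process count
    mpiexec -n 4 /path/to/mdrun_mpi -np 4 -s topol.tpr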

 The command I use to execute mdrun_mpi:
    "/usr/local/bin/mpiexec -n 4 -verbose /usr/local/applications/bioinformatics/gromacs-3.3/bin/mdrun_mpi -np 4". 

Error messages recorded in the log file:
********************************************
[1] Malloc hook test failed: the malloc provided by MPICH-GM is not used, it may have been overloaded by a malloc provided by another library.

[0] Malloc hook test failed: the malloc provided by MPICH-GM is not used, it may have been overloaded by a malloc provided by another library.

[2] Malloc hook test failed: the malloc provided by MPICH-GM is not used, it may have been overloaded by a malloc provided by another library.

[3] Malloc hook test failed: the malloc provided by MPICH-GM is not used, it may have been overloaded by a malloc provided by another library.

mpiexec: Warning: accept_abort_conn: MPI_Abort from IP 192.168.100.5, killing all.
mpiexec: Warning: main: task 0 exited with status 255.
mpiexec: Warning: main: task 1 exited with status 255.
mpiexec: Warning: main: task 2 exited with status 255.
**********************************************
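
As the warning itself says, the malloc provided by MPICH-GM does not appear
to be the one in effect at run time; it may have been displaced by a malloc
from another library. One way to see which shared libraries the parallel
binary actually pulls in, and whether malloc is defined in the binary or
imported from elsewhere, is the sketch below (the binary path is the one
quoted above; the interpretation of the output is only a rough guide):

    # list the shared libraries resolved for the parallel binary
    ldd /usr/local/applications/bioinformatics/gromacs-3.3/bin/mdrun_mpi
    # check whether the binary defines malloc itself (T/t) or imports it (U)
    nm -D /usr/local/applications/bioinformatics/gromacs-3.3/bin/mdrun_mpi | grep ' malloc'

If malloc turns out to be satisfied by one of the shared libraries rather
than by the MPICH-GM build, one hedged thing to try is rebuilding without
--enable-shared; whether that applies here is an assumption, not something
confirmed in this thread.
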
Does it work without MPI, e.g. on only one CPU?

Yes. The serial version of mdrun works OK.

Are you using a scheduler/queuing system like Maui/Torque?
No. I create a script containing the above command and submit it to PBS (Portable Batch System).
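
For context, such a wrapper script might look roughly like the sketch below;
the PBS directives (job name, node/processor request) and the script name are
illustrative assumptions rather than the actual script from this thread:

    #!/bin/sh
    # illustrative PBS directives; job name and resource request are assumptions
    #PBS -N gmx_water
    #PBS -l nodes=2:ppn=2
    # run from the directory the job was submitted from
    cd $PBS_O_WORKDIR
    # launch the parallel mdrun exactly as in the command quoted above
    /usr/local/bin/mpiexec -n 4 -verbose /usr/local/applications/bioinformatics/gromacs-3.3/bin/mdrun_mpi -np 4

It would then be submitted with something like "qsub run_water.sh" (the
script name is, again, a placeholder).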


>
> Thank you
> Richard


Greetings,

Florian
-- 
---------------------------------------------------------------------------------------------------
 Florian Haberl                             Universitaet Erlangen/Nuernberg
 Computer-Chemie-Centrum     Naegelsbachstr. 25, D-91052 Erlangen
 
 Mailto: florian.haberl AT chemie.uni-erlangen.de
 
---------------------------------------------------------------------------------------------------

_______________________________________________
gmx-users mailing list
gmx-users at gromacs.org
http://www.gromacs.org/mailman/listinfo/gmx-users
Please don't post (un)subscribe requests to the list. Use the 
www interface or send it to gmx-users-request at gromacs.org.


