[gmx-users] parallel run problems--please help

Chng Choon Peng cpchng at bii.a-star.edu.sg
Wed Feb 11 01:48:00 CET 2004


Hi Jee,

   Since you can't even get just 1 process to run, I suspect there might be
something wrong with the installation. (If 1 process works, then it's likely a
runtime-library problem across the nodes.)

But for a start, just try putting the full path to mdrun_mpi in your
mpirun calls.
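For example, something along these lines (using /usr/local/gromacs as a
stand-in for whatever install prefix you actually used):

  mpirun -np 1 /usr/local/gromacs/bin/mdrun_mpi -np 1 -nice 4 -s ${MOL}_em -o ${MOL}_em -c ${MOL}_b4pr -v >& ! output.mdrun_em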
And I presume you compiled with "--enable-mpi" for both FFTW and Gromacs.
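If in doubt, a rebuild roughly like the following should do it (the prefixes
are only placeholders, and the _mpi suffix just mirrors your mdrun_mpi binary
name):

  # FFTW, built with MPI support
  ./configure --enable-mpi --prefix=/usr/local/fftw
  make && make install

  # Gromacs, built against the MPI-enabled FFTW above
  ./configure --enable-mpi --program-suffix=_mpi --prefix=/usr/local/gromacs
  make && make install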

cheers,
Choon-Peng
-- 
Mr. Choon-Peng CHNG
Research Associate
BioInformatics Institute, BMSI, A*STAR
30 Biopolis Street
#07-01 Matrix Building
Singapore 138671
Tel (O): +65 64788301 Fax (O): +65 64789047
www.bii.a-star.edu.sg/~cpchng


On 2/10/04 9:39 AM, "Jee Eun Rim" <jrim at stanford.edu> wrote:

> Hello,
>  
> I have installed gromacs on an SGI Origin 3800, and tried running the tutorial
> gmxdemo. 
> I have no problem running the serial version, but when I modify the script and
> try to run the parallel version with mpirun, I get the following errors,
>  
> input:
> grompp_d -np 1 -f em -c ${MOL}_b4em -p ${MOL} -o ${MOL}_em >& ! output.grompp_em
> mpirun -np 1 mdrun_mpi -np 1 -nice 4 -s ${MOL}_em -o ${MOL}_em -c ${MOL}_b4pr -v >& ! output.mdrun_em
>  
> output:
> MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
> MPI: aborting job
>  
> Also, if I increase the number of processes to 4, I get the following error
>  
> input:
> grompp_d -np 4 -f pr -c ${MOL}_b4pr -r ${MOL}_b4pr -p ${MOL} -o ${MOL}_pr >& ! output.grompp_pr
> mpirun -np 4 mdrun_mpi -np 4 -nice 4 -s ${MOL}_pr -o ${MOL}_pr -c ${MOL}_b4md -v >& ! output.mdrun_pr
>  
> output:
> Fatal error: run input file cpeptide_pr.tpr was made for 4 nodes,
>                  while mdrun_mpi expected it to be for 1 nodes.
> Fatal error: run input file cpeptide_pr.tpr was made for 4 nodes,
>                  while mdrun_mpi expected it to be for 1 nodes.
> Fatal error: run input file cpeptide_pr.tpr was made for 4 nodes,
>                  while mdrun_mpi expected it to be for 1 nodes.
> Fatal error: run input file cpeptide_pr.tpr was made for 4 nodes,
>                  while mdrun_mpi expected it to be for 1 nodes.
>  
> even though grompp_d was run with the option -np 4.
>  
> Does this mean that gromacs was not compiled/installed correctly? Any help
> would be greatly appreciated.
> By the way, gromacs was compiled with MPICH, with --enable-shared.
>  
> Thanks a lot.
>  
> Jee.
>  
> 

