[gmx-users] MPI: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
Wei Fu
wfu at mail.shcnc.ac.cn
Thu Nov 7 10:44:09 CET 2002
Hi Tivadar Orban, hi everyone,
I am a new GROMACS user. I have browsed the gromacs mailing list and found that I run into the same problem as you: when I run "mpirun -np 4 mdrun_mpi ...", the following information appears and then mpirun aborts (the command sequence I use is sketched just after the error output below).
__________________________________________________________________
Back Off! I just backed up ener.edr to ./#ener.edr.1#
Steepest Descents:
Tolerance = 2.00000e+03
Number of steps = 100
MPI: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
MPI: Received signal 11
___________________________________________________________________________
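(For context, the sequence of commands I use is roughly the standard one sketched below. The structure and topology file names are only placeholders, and I am assuming here that the -np value given to grompp has to match the process count given to mpirun.)

    # prepare a run input file for 4 processes (file names are placeholders)
    grompp -np 4 -f em.mdp -c conf.gro -p topol.top -o run.tpr

    # start the parallel energy minimization
    mpirun -np 4 mdrun_mpi -s run.tpr -e ener.edr -g md.log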
To my surprise, when my system is only protein+sol, mpirun passes successfully. When my system is protein+sol+lipid and I do not use coulombtype=PME in em.mdp, mpirun also passes, but as soon as I use coulombtype=PME, the information above appears. Even after I go back to running without coulombtype=PME in em.mdp, the error still appears. Finally, I tried running "mpirun -np 4 mdrun_mpi -s run.tpr ..." on the protein+sol system (without the lipid), both with and without coulombtype=PME in em.mdp, and mpirun passes in both cases.
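(For reference, the electrostatics-related part of my em.mdp looks roughly like the lines below; the tolerance and step count match the log above, but the cut-off and PME grid values are only illustrative, not necessarily my exact settings.)

    ; steepest-descent minimization with PME electrostatics
    integrator      = steep
    emtol           = 2000       ; Tolerance = 2.00000e+03 in the log above
    nsteps          = 100
    coulombtype     = PME        ; the option that triggers the crash
    rcoulomb        = 0.9        ; illustrative cut-off
    rvdw            = 0.9        ; illustrative cut-off
    fourierspacing  = 0.12       ; illustrative PME grid spacing
    pme_order       = 4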
Hi Orban, can you tell me how to solve this problem? Can anyone help me? Any suggestions are welcome.
Thanks in advance!
Best regards,
Wei Fu
2002-11-07
******************************************
Ms. Wei Fu, Ph.D.
State Key Laboratory of Drug Research
Shanghai Institute of Materia Medica
Chinese Academy of Sciences
294 Taiyuan Road
Shanghai, 200031
P.R. China
wfu at mail.shcnc.ac.cn or fuweifw at yahoo.com
wfu at iris3.simm.ac.cn
********************************************************
Tivadar Orban wrote:

Back Off! I just backed up ener.edr to ./#ener.edr.4#
Steepest Descents:
Tolerance = 1.00000e+03
Number of steps = 100
MPI: MPI_COMM_WORLD rank 1 has terminated without calling MPI_Finalize()
MPI: aborting job
Can someone tell me what the problem is here?
Regards,
Tivadar Orban