[gmx-users] MPI problem

Ruben Martinez Buey ruben at akilonia.cib.csic.es
Fri Mar 21 23:50:57 CET 2003


Hi all,
I still have a problem with MPI: I can run mdrun on a single
processor (np 1), but not on more than one (np > 1). This is the
error:

Reading file prueba.tpr, VERSION 3.1.4 (double precision)
Reading file prueba.tpr, VERSION 3.1.4 (double precision)
Steepest Descents:
   Tolerance         =  1.00000e-02
   Number of steps   =       100000
MPI: MPI_COMM_WORLD rank 2 has terminated without calling MPI_Finalize()
MPI: Received signal 10
____________________________________________________________________________
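
As far as I can tell, the "terminated without calling MPI_Finalize()"
message only says that rank 2 died before the normal MPI shutdown, not
why it died (and if this is an SGI machine, as the "MPI:" prefix
suggests, signal 10 would be SIGBUS). For comparison, a minimal MPI
program that does shut down cleanly would be something like this sketch
(plain C, nothing GROMACS-specific):

    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        int rank, nnodes;

        MPI_Init(&argc, &argv);               /* every rank must reach this */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &nnodes);
        printf("rank %d of %d alive\n", rank, nnodes);
        MPI_Finalize();                       /* ...and this, or the launcher
                                                 prints the error above */
        return 0;
    }

Running something like that under mpirun -np 8 should at least show
whether the MPI installation itself is healthy.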

And the output files are:

core.13520690

ener.edr (empty)

md0.log:
Log file opened: nodeid 0, nnodes = 8, host = jen50, process = 13440144
                         :-)  G  R  O  M  A  C  S  (-:

               Go Rough, Oppose Many Angry Chinese Serial killers

...    ...   ...   ...
CPU=  0, lastcg=  606, targetcg= 2409, myshift=    4
CPU=  1, lastcg= 1243, targetcg= 3046, myshift=    5
CPU=  2, lastcg= 1859, targetcg=   57, myshift=    6
CPU=  3, lastcg= 2219, targetcg=  417, myshift=    5
CPU=  4, lastcg= 2551, targetcg=  749, myshift=    5
CPU=  5, lastcg= 2883, targetcg= 1081, myshift=    4
CPU=  6, lastcg= 3215, targetcg= 1413, myshift=    4
CPU=  7, lastcg= 3604, targetcg= 1802, myshift=    3
nsb->shift =   6, nsb->bshift=  0
Listing Scalars
nsb->nodeid:         0
nsb->nnodes:      8
nsb->cgtotal:  3605
nsb->natoms:   7963
nsb->shift:       6
nsb->bshift:      0
Nodeid   index  homenr  cgload  workload
     0       0     996     607       607
     1     996     995    1244      1244
     2    1991     996    1860      1860
     3    2987     995    2220      2220
     4    3982     996    2552      2552
     5    4978     996    2884      2884
     6    5974     996    3216      3216
     7    6970     993    3605      3605

parameters of the run (nodeid=0):
... ...   ...   ...
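
(Sanity check on the table above: 7963 atoms over 8 nodes is about 995
per node, and the homenr column indeed shows 993-996 everywhere, so the
decomposition itself looks even; the crash apparently happens after
this point.)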

and md1.log:
Log file opened: nodeid 1, nnodes = 8, host = jen50, process = 13525852


and the same for md2.log, md3.log, ... md7.log.

Thanks again for your kind attention,
with best regards,
Ruben


