[gmx-users] problems with non pbc simulations in parallel
Gavin Melaugh
gmelaugh01 at qub.ac.uk
Wed Mar 10 15:16:38 CET 2010
Hi all
I have installed gromacs-4.0.7-parallel with Open MPI. I have
successfully run a few short simulations on 2, 3 and 4 nodes using pbc. I
am now interested in simulating a cluster of 32 molecules with no pbc in
parallel, but the simulation does not proceed. I have set my box vectors
to 0 0 0 in the conf.gro file, set pbc = no in the mdp file, and used
particle decomposition (the relevant input lines are sketched below the
error output). The feedback I get from the following command
nohup mpirun -np 2 /local1/gromacs-4.0.7-parallel/bin/mdrun -pd -s &
is
Back Off! I just backed up md.log to ./#md.log.1#
Reading file topol.tpr, VERSION 4.0.7 (single precision)
starting mdrun 'test of 32 hexylcage molecules'
1000 steps, 0.0 ps.
[emerald:22662] *** Process received signal ***
[emerald:22662] Signal: Segmentation fault (11)
[emerald:22662] Signal code: Address not mapped (1)
[emerald:22662] Failing at address: (nil)
[emerald:22662] [ 0] /lib64/libpthread.so.0 [0x7fbc17eefa90]
[emerald:22662] [ 1]
/local1/gromacs-4.0.7-parallel/bin/mdrun(nosehoover_tcoupl+0x74) [0x436874]
[emerald:22662] [ 2]
/local1/gromacs-4.0.7-parallel/bin/mdrun(update+0x171) [0x4b2311]
[emerald:22662] [ 3]
/local1/gromacs-4.0.7-parallel/bin/mdrun(do_md+0x2608) [0x42dd38]
[emerald:22662] [ 4]
/local1/gromacs-4.0.7-parallel/bin/mdrun(mdrunner+0xe33) [0x430973]
[emerald:22662] [ 5]
/local1/gromacs-4.0.7-parallel/bin/mdrun(main+0x5b8) [0x431128]
[emerald:22662] [ 6] /lib64/libc.so.6(__libc_start_main+0xe6)
[0x7fbc17ba6586]
[emerald:22662] [ 7] /local1/gromacs-4.0.7-parallel/bin/mdrun [0x41e1e9]
[emerald:22662] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 22662 on node emerald exited
on signal 11 (Segmentation fault).
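For reference, the relevant .mdp lines look roughly like this (a sketch,
not the full file; other settings are omitted, and I am using the
Nose-Hoover thermostat, which is also where the trace above shows the
crash):

pbc     = no
tcoupl  = nose-hoover
nsteps  = 1000

and the last line of conf.gro (the box vectors) is

   0.00000   0.00000   0.00000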
P.S. I have run several of these non-pbc simulations with the same system
in serial and have never experienced a problem. Has anyone ever come
across this sort of problem before? If so, could you please provide
some advice?
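(By "in serial" here I mean running mdrun as a single process, roughly

mdrun -s topol.tpr

with no mpirun and no -pd.)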
Many Thanks
Gavin