[gmx-users] gmx_mpi mdrun with multidir and tabulated bond potentials

Andrei Gasic aggasic at central.uh.edu
Thu Jul 27 16:35:54 CEST 2017


Hello everyone,

I'm currently trying to run an equilibration for replica exchange MD using
tabulated bond potentials. I receive an error when I use more than ~900
tables; however, my simulation needs 1014 tables. Here is my input:
"mpirun -np 40 gmx_mpi mdrun -multidir equil{1..40} -tableb table_b{0..1013}.xvg".

The program runs fine with all 1014 tables when run in serial (plain gmx
mdrun, no mpirun). But with gmx_mpi, the run aborts once I pass more than
about 900 tables: glibc detects heap corruption in mpiexec.hydra, so it
seems the launcher, rather than mdrun itself, is running out of memory or
corrupting it. I'm not sure how to correct this issue. The full error
message from a run with more than 900 tables is below.
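
In case it helps, this is the kind of loop I can use to narrow down the
exact table count where the crash starts (a sketch; it assumes the
equil{1..40} directories and table files are already in place):

    #!/bin/bash
    # Sketch: grow the -tableb file list until mpirun aborts, to pin
    # down the exact threshold. Assumes equil{1..40}/ and
    # table_b0.xvg .. table_b1013.xvg already exist.
    for n in 896 928 960 992 1013; do
        tables=$(printf 'table_b%d.xvg ' $(seq 0 "$n"))
        echo "=== trying $((n + 1)) tables ==="
        mpirun -np 40 gmx_mpi mdrun -multidir equil{1..40} -tableb $tables \
            || { echo "aborted at $((n + 1)) tables"; break; }
    done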

If anyone has a solution, please let me know.

Thank you,

Andrei

*** glibc detected *** mpiexec.hydra: free(): invalid next size (fast): 0x00000000012dc070 ***

======= Backtrace: =========
/lib64/libc.so.6[0x3fbd675e66]
/lib64/libc.so.6[0x3fbd6789b3]
mpiexec.hydra[0x41d2ae]
mpiexec.hydra[0x4059c1]
/lib64/libc.so.6(__libc_start_main+0xfd)[0x3fbd61ed5d]
mpiexec.hydra[0x4054d9]

======= Memory map: ========
00400000-004b7000 r-xp 00000000 00:16 57423226 /share/apps/intel/compilers_and_libraries_2016.0.109/linux/mpi/intel64/bin/mpiexec.hydra
006b7000-006bb000 rw-p 000b7000 00:16 57423226 /share/apps/intel/compilers_and_libraries_2016.0.109/linux/mpi/intel64/bin/mpiexec.hydra
006bb000-006be000 rw-p 00000000 00:00 0
012ce000-012ef000 rw-p 00000000 00:00 0 [heap]
3fbd200000-3fbd220000 r-xp 00000000 08:03 408442 /lib64/ld-2.12.so
3fbd41f000-3fbd420000 r--p 0001f000 08:03 408442 /lib64/ld-2.12.so
3fbd420000-3fbd421000 rw-p 00020000 08:03 408442 /lib64/ld-2.12.so
3fbd421000-3fbd422000 rw-p 00000000 00:00 0
3fbd600000-3fbd78a000 r-xp 00000000 08:03 408443 /lib64/libc-2.12.so
3fbd78a000-3fbd98a000 ---p 0018a000 08:03 408443 /lib64/libc-2.12.so
3fbd98a000-3fbd98e000 r--p 0018a000 08:03 408443 /lib64/libc-2.12.so
3fbd98e000-3fbd98f000 rw-p 0018e000 08:03 408443 /lib64/libc-2.12.so
3fbd98f000-3fbd994000 rw-p 00000000 00:00 0
3fbda00000-3fbda02000 r-xp 00000000 08:03 393336 /lib64/libdl-2.12.so
3fbda02000-3fbdc02000 ---p 00002000 08:03 393336 /lib64/libdl-2.12.so
3fbdc02000-3fbdc03000 r--p 00002000 08:03 393336 /lib64/libdl-2.12.so
3fbdc03000-3fbdc04000 rw-p 00003000 08:03 393336 /lib64/libdl-2.12.so
3fbde00000-3fbde17000 r-xp 00000000 08:03 408444 /lib64/libpthread-2.12.so
3fbde17000-3fbe017000 ---p 00017000 08:03 408444 /lib64/libpthread-2.12.so
3fbe017000-3fbe018000 r--p 00017000 08:03 408444 /lib64/libpthread-2.12.so
3fbe018000-3fbe019000 rw-p 00018000 08:03 408444 /lib64/libpthread-2.12.so
3fbe019000-3fbe01d000 rw-p 00000000 00:00 0
3fbe200000-3fbe283000 r-xp 00000000 08:03 408446 /lib64/libm-2.12.so
3fbe283000-3fbe482000 ---p 00083000 08:03 408446 /lib64/libm-2.12.so
3fbe482000-3fbe483000 r--p 00082000 08:03 408446 /lib64/libm-2.12.so
3fbe483000-3fbe484000 rw-p 00083000 08:03 408446 /lib64/libm-2.12.so
2b71a3e1d000-2b71a3e1f000 rw-p 00000000 00:00 0
2b71a3e2d000-2b71a3e2e000 rw-p 00000000 00:00 0
2b71a3e2e000-2b71a3e44000 r-xp 00000000 00:16 61341698 /share/apps/gcc-4.9.2/lib64/libgcc_s.so.1
2b71a3e44000-2b71a4043000 ---p 00016000 00:16 61341698 /share/apps/gcc-4.9.2/lib64/libgcc_s.so.1
2b71a4043000-2b71a4044000 rw-p 00015000 00:16 61341698 /share/apps/gcc-4.9.2/lib64/libgcc_s.so.1
2b71a4044000-2b71a4047000 rw-p 00000000 00:00 0
2b71a4054000-2b71a4060000 r-xp 00000000 08:03 393268 /lib64/libnss_files-2.12.so
2b71a4060000-2b71a4260000 ---p 0000c000 08:03 393268 /lib64/libnss_files-2.12.so
2b71a4260000-2b71a4261000 r--p 0000c000 08:03 393268 /lib64/libnss_files-2.12.so
2b71a4261000-2b71a4262000 rw-p 0000d000 08:03 393268 /lib64/libnss_files-2.12.so
7ffffc395000-7ffffc3b0000 rw-p 00000000 00:00 0 [stack]
7ffffc3ff000-7ffffc400000 r-xp 00000000 00:00 0 [vdso]
ffffffffff600000-ffffffffff601000 r-xp 00000000 00:00 0 [vsyscall]

/share/apps/intel/impi/5.1.1.109/intel64/bin/mpirun: line 241: 62150 Aborted (core dumped) mpiexec.hydra "$@" 0<&0

