[gmx-users] Gromacs 2016.3 FLOATING-POINT EXCEPTION/DIVIDE-BY-ZERO errors

David van der Spoel spoel at xray.bmc.uu.se
Tue Sep 5 23:09:10 CEST 2017


On 05/09/17 11:54, Rainer Rutka wrote:
> Hi!
> My name is Rainer. I am one of the module/software maintainers
> for the bwHPC-C5 project: http://www.bwhpc-c5.de/en/index.php
> 
> For some time now we have not been able to run Gromacs 2016.x on
> our clusters. Here is one of the errors we received from
> one of the users:
> 
> <snip>
> I have been using Gromacs on BwUniCluster for a few months and I performed 
> molecular dynamics simulations without any problem until the end of 
> July. Since August, I cannot prepare the input files, and if I prepare 
> them elsewhere, the simulation crashes even though it is identical to 
> ones that worked perfectly a few days before. I do not seem to have 
> this problem with smaller systems containing fewer atoms.
> 
> [uc1n997:43275] *** Process received signal ***
> [uc1n997:43275] Signal: Floating point exception (8)
> [uc1n997:43275] Signal code: Integer divide-by-zero (1)
> [uc1n997:43275] Failing at address: 0x42fd74
> [uc1n997:43275] [ 0] /usr/lib64/libpthread.so.0(+0xf5e0)[0x2b65e89805e0]
> [uc1n997:43275] [ 1] gmx_mpi_d(__svml_idiv4_h9+0x64)[0x42fd74]
> [uc1n997:43275] [ 2] /pfs/data1/software_uc1/bwhpc/common/chem/gromacs/5.1.2-openmpi-1.8-intel-15.0/bin/../lib64/libgromacs_mpi_d.so.1(count_bonded_distances+0x43c)[0x2b65e6b1f72c]
> [uc1n997:43275] [ 3] /pfs/data1/software_uc1/bwhpc/common/chem/gromacs/5.1.2-openmpi-1.8-intel-15.0/bin/../lib64/libgromacs_mpi_d.so.1(pme_load_estimate+0x20)[0x2b65e6b1ef70]
> [uc1n997:43275] [ 4] /pfs/data1/software_uc1/bwhpc/common/chem/gromacs/5.1.2-openmpi-1.8-intel-15.0/bin/../lib64/libgromacs_mpi_d.so.1(gmx_grompp+0x257a)[0x2b65e639e84a]
> [uc1n997:43275] [ 5] /pfs/data1/software_uc1/bwhpc/common/chem/gromacs/5.1.2-openmpi-1.8-intel-15.0/bin/../lib64/libgromacs_mpi_d.so.1(_ZN3gmx24CommandLineModuleManager3runEiPPc+0x267)[0x2b65e6127e97]
> [uc1n997:43275] [ 6] gmx_mpi_d(main+0xbc)[0x40c13c]
> [uc1n997:43275] [ 7] /usr/lib64/libc.so.6(__libc_start_main+0xf5)[0x2b65e8baec05]
> [uc1n997:43275] [ 8] gmx_mpi_d[0x40bfb9]
> [uc1n997:43275] *** End of error message ***
> </snip>
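> 
> The trace suggests the trap comes from a vectorized integer division
> (__svml_idiv4 appears to be Intel's packed 4-lane integer divide).
> As a purely illustrative sketch, not actual GROMACS code, this is the
> kind of guarded scalar loop where an auto-vectorizer can still fault:
> the packed divide may be evaluated for all four lanes before the
> guard's mask is applied, so a zero in a lane the scalar code would
> skip can still raise "integer divide-by-zero":
> 
> /* sketch.c -- hypothetical example; all names are invented here */
> #include <stdio.h>
> 
> int main(void)
> {
>     /* counts per interaction type; some entries are legitimately zero */
>     int count[8] = {4, 2, 0, 0, 0, 0, 0, 0};
>     int total    = 0;
> 
>     for (int i = 0; i < 8; i++)
>     {
>         /* scalar semantics never divide by zero here, but a compiler
>            that vectorizes this loop with an unmasked packed divide
>            (e.g. __svml_idiv4) can trap on the zero lanes anyway */
>         if (count[i] != 0)
>         {
>             total += 100 / count[i];
>         }
>     }
>     printf("total = %d\n", total);
>     return 0;
> }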
> 
> Gromacs 2016.3 was built as follows (excerpt):
> 
> [...]
> # (3) Load required modules for the build process
> module load compiler/intel/16.0
> module load mpi/openmpi/2.1-intel-16.0
> module load numlib/mkl/11.3.4
> module load devel/cmake/3.3.2
> [...]
> # double precision
> cmake -DCMAKE_VERBOSE_MAKEFILE=ON -DGMX_MPI=ON -DGMX_GPU=OFF \
>       -DGMX_DOUBLE=ON -DGMX_THREAD_MPI=OFF -DGMX_FFT_LIBRARY=mkl \
>       -DMPIEXEC=${MPI_BIN_DIR}/mpirun -DREGRESSIONTEST_DOWNLOAD=OFF \
>       -DCMAKE_INSTALL_PREFIX=${TARGET_DIR}
> make 2>&1 | tee ${LOG_DIR}/make_double.out
> make install 2>&1 | tee ${LOG_DIR}/make-install_double.out
> [...]
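> 
> One thing worth noting: the backtrace above resolves to libraries under
> gromacs/5.1.2-openmpi-1.8-intel-15.0, not the 2016.3 tree built here,
> so a first check might be to confirm which installation the user's job
> actually picks up. A sketch (assuming the binary name gmx_mpi_d from
> the trace):
> 
> which gmx_mpi_d
> gmx_mpi_d --version 2>&1 | head -n 5
> ldd "$(which gmx_mpi_d)" | grep -i gromacs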
> 
> ANY HELP IS MUCH APPRECIATED.
Much more information is needed than this.
If this system used to work but no longer does, then please have 
the user submit a bug report at
http://redmine.gromacs.org
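A useful report would include the exact build/runtime environment and the
inputs that reproduce the crash. A minimal sketch of what to collect
(assuming the environment-modules setup from the excerpt above):

module list 2>&1              # modules loaded when grompp crashed
icc --version | head -n 1     # compiler used for the build
mpirun --version | head -n 1  # MPI runtime
# plus the .mdp, topology and coordinate files on which grompp fails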
> 
> Thanx in advance.
> 
> .-)


-- 
David van der Spoel, Ph.D., Professor of Biology
Head of Department, Cell & Molecular Biology, Uppsala University.
Box 596, SE-75124 Uppsala, Sweden. Phone: +46184714205.
http://www.icm.uu.se

