[gmx-users] Segmentation fault

Mark Abraham mark.j.abraham at gmail.com
Thu Mar 12 07:47:35 CET 2015


Hi,

I would not expect a good result from mixing the MPI wrapper compilers with a
collection of manually-specified include and library paths, because the
chances are good that some detail ends up wrong. Getting those right is a
matter of following the docs for the MVAPICH2 + icc combination. In
particular, you likely want to have followed Intel's advice about sourcing
their compiler setup scripts.
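For illustration, a minimal sketch of what that might look like. The script
location and the decision to rely on the wrappers are assumptions about a
typical Intel + MVAPICH2 install, not this cluster's actual layout:

```shell
# Source Intel's compiler environment script first, so icc and MKL are on the
# relevant paths (the location varies per install; this path is an assumption):
source /opt/intel/composer_xe_2015/bin/compilervars.sh intel64

# Then let the MPI wrapper compilers supply their own include and library
# paths, rather than passing -I/-L/-l flags via CMAKE_INCLUDE_PATH and
# CMAKE_LIBRARY_PATH:
cmake .. \
  -DCMAKE_C_COMPILER=mpicc \
  -DCMAKE_CXX_COMPILER=mpicxx \
  -DGMX_MPI=on \
  -DGMX_DOUBLE=ON \
  -DGMX_FFT_LIBRARY=mkl
```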

Unrelated, but turning off OpenMP support likely gives away opportunities for
improved performance for no real benefit.

Mark

On Thu, Mar 12, 2015 at 12:28 AM, Wim Rm Cardoen <wim.cardoen at utah.edu>
wrote:

> Hello,
>
> I have compiled the latest version of gromacs (5.0.4) on our cluster
> (RHEL6 OS)
> using the intel compiler (2015.1.133) as well as mvapich2 2.0 compiled
> with the same version of the INTEL compiler.
> For the FFTW and the BLAS/LAPACK I used Intel MKL's library.
> When I tested the regression test ALL tests failed due to segmentation
> faults at the MPI level.
> Help would be strongly appreciated.
>
> Thanks ,
>
> Wim
>
> I compiled gromacs 5.0.4 with the following flags (Non-threaded and
> non-gpu version)
> --------------------------------------------------------------------
> cmake .. -DCMAKE_INSTALL_PREFIX=/uufs/chpc.utah.edu/sys/installdir/gromacs/5.0.4-mvapich2-2.0.em.i \
>   -DCMAKE_C_COMPILER=/uufs/ember.arches/sys/pkg/mvapich2/2.0i/bin/mpicc \
>   -DCMAKE_CXX_COMPILER=/uufs/ember.arches/sys/pkg/mvapich2/2.0i/bin/mpicxx \
>   -DCMAKE_INCLUDE_PATH="-I/uufs/ember.arches/sys/pkg/mvapich2/2.0i/include -I/uufs/chpc.utah.edu/sys/pkg/intel/ics/composer_xe_2015/mkl/include/fftw" \
>   -DCMAKE_LIBRARY_PATH="-Wl,-rpath=/uufs/ember.arches/sys/pkg/mvapich2/2.0i/lib -L/uufs/ember.arches/sys/pkg/mvapich2/2.0i/lib -lmpich -Wl,-rpath=/uufs/chpc.utah.edu/sys/pkg/intel/ics/composer_xe_2015/mkl/lib/intel64 -L/uufs/chpc.utah.edu/sys/pkg/intel/ics/composer_xe_2015/mkl/lib/intel64 -lmkl_core -lmkl_intel_lp64 -lmkl_sequential -Wl,-rpath=/uufs/chpc.utah.edu/sys/pkg/intel/ics/composer_xe_2015/lib/intel64 -L/uufs/chpc.utah.edu/sys/pkg/intel/ics/composer_xe_2015/lib/intel64 -limf -lifcore" \
>   -DGMX_MPI=on -DGMX_DOUBLE=ON -DGMX_FFT_LIBRARY=mkl -DGMX_BUILD_MANUAL=on \
>   -DGMX_OPENMP=OFF -DGMX_THREAD_MPI=OFF
> make -j 6
> make check
> # 100% tests passed, 0 tests failed out of 15
> #
> # Label Time Summary:
> # GTest              =   5.77 sec
> # IntegrationTest    =  24.73 sec
> # UnitTest           =   5.77 sec
>
> make install
>
>
> # Regression tests
> wget http://gerrit.gromacs.org/download/regressiontests-5.0.4.tar.gz
> tar -zxvf regressiontests-5.0.4.tar.gz
> cd regressiontests-5.0.4
> module load intel/2015.1.133
> module load mvapich2/2.0.em.i
> source /uufs/chpc.utah.edu/sys/installdir/gromacs/5.0.4-mvapich2-2.0.em.i/bin/GMXRC
> which mdrun_mpi_d
> # /uufs/chpc.utah.edu/sys/installdir/gromacs/5.0.4-mvapich2-2.0.em.i/bin/mdrun_mpi_d
> which mpirun
> # /uufs/ember.arches/sys/pkg/mvapich2/2.0i/bin/mpirun
> ./gmxtest.pl -np 12 -verbose -double \
>   -mdrun /uufs/chpc.utah.edu/sys/installdir/gromacs/5.0.4-mvapich2-2.0.em.i/bin/mdrun_mpi_d \
>   -mpirun /uufs/ember.arches/sys/pkg/mvapich2/2.0i/bin/mpirun simple
>
>
> #
> ---------------------------------------------------------------------------------------------------------------------------------------------
> Abnormal return value for '/uufs/ember.arches/sys/pkg/mvapich2/2.0i/bin/mpirun -np 12 -wdir /uufs/chpc.utah.edu/common/home/u0253283/regressiontests-5.0.4/simple/rb125 /uufs/chpc.utah.edu/sys/installdir/gromacs/5.0.4-mvapich2-2.0.em.i/bin/mdrun_mpi_d -notunepme >mdrun.out 2>&1' was -1
> FAILED. Check mdrun.out, md.log file(s) in rb125 for rb125
> 16 out of 16 simple tests FAILED
>
> GROMACS:      gmx mdrun, VERSION 5.0.4 (double precision)
> Executable:   /uufs/chpc.utah.edu/sys/installdir/gromacs/5.0.4-mvapich2-2.0.em.i/bin/gmx_mpi_d
> Library dir:  /uufs/chpc.utah.edu/sys/installdir/gromacs/5.0.4-mvapich2-2.0.em.i/share/gromacs/top
> Command line:
>   mdrun_mpi_d -notunepme
>
> Reading file topol.tpr, VERSION 5.0.4 (double precision)
> [em110:mpi_rank_0][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_1][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_3][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_4][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_9][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_2][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_5][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_6][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_7][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_8][error_sighandler] Caught error: Segmentation fault (signal 11)
> [em110:mpi_rank_11][error_sighandler] Caught error: Segmentation fault (signal 11)
>
>
> --
> Gromacs Users mailing list
>
> * Please search the archive at
> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
> posting!
>
> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>
> * For (un)subscribe requests visit
> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or
> send a mail to gmx-users-request at gromacs.org.
>
