[gmx-users] MPI GPU job failed

Szilárd Páll pall.szilard at gmail.com
Thu Aug 11 17:55:47 CEST 2016


On Thu, Aug 11, 2016 at 4:22 PM, Albert <mailmd2011 at gmail.com> wrote:
> well, here is the command line I used for compiling:
>
>
> env CC=mpicc CXX=mpicxx F77=mpif90 FC=mpif90 LDF90=mpif90
> CMAKE_PREFIX_PATH=/soft/gromacs/fftw-3.3.4:/soft/intel/impi/5.1.3.223 cmake
> .. -DBUILD_SHARED_LIB=OFF -DBUILD_TESTING=OFF
> -DCMAKE_INSTALL_PREFIX=/soft/gromacs/5.1.3_intel -DGMX_MPI=ON -DGMX_GPU=ON
> -DGMX_PREFER_STATIC_LIBS=ON -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda
>
> here is my cshrc:
>
> source /soft/intel/bin/compilervars.csh intel64
> source /soft/intel/impi/5.1.3.223/bin64/mpivars.csh
> set path=(/soft/intel/impi/5.1.3.223/intel64/bin $path)
> setenv CUDA_HOME /usr/local/cuda
> setenv MKL_HOME /soft/intel/mkl/
> setenv LD_LIBRARY_PATH
> /soft/intel/compilers_and_libraries_2016.3.223/linux/mpi/lib64:/usr/local/cuda/lib64:/soft/intel/lib/intel64:/soft/intel/lib/ia32:/soft/intel/mkl/lib/intel64:/soft/intel/mkl/lib/ia32:{$LD_LIBRARY_PATH}
>
>
> It should build with MPI support with the above settings. Does it?

It should. You can verify this in the header of the log file.
It's always useful to post full logs here.
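For example, something along these lines (assuming the run log is called md.log; adjust the name to your run) shows what the binary was built with:

  # Check the build-information header of the run log
  grep -E 'MPI library|GPU support' md.log

An MPI-enabled build should report "MPI library: MPI", while a thread-MPI build reports "MPI library: thread_mpi"; the same header is printed by gmx_mpi -version. You can also check the build tree: grep GMX_MPI CMakeCache.txt should show GMX_MPI:BOOL=ON if MPI was actually enabled.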

>
>
>
> On 08/11/2016 04:18 PM, Szilárd Páll wrote:
>>
>> PPS: given the double output (e.g. "Reading file 61.tpr, ...") it's
>> even more likely that you're using a non-MPI build.
>>
>> BTW, looks like you had the same issue about two years ago:
>>
>> https://mailman-1.sys.kth.se/pipermail/gromacs.org_gmx-users/2014-September/092046.html
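To illustrate what the double output means: with a real MPI build, a launch roughly like this (the rank count is just a placeholder; the tpr name is taken from your output)

  mpirun -np 2 gmx_mpi mdrun -s 61.tpr

should print "Reading file 61.tpr, ..." only once for the whole job. If that line shows up once per rank, mpirun is starting several independent copies of a non-MPI (thread-MPI) binary instead of one parallel run, which is exactly the symptom above.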