[gmx-users] gromacs 2018 for GPU install on cluster with truly static libraries?

Mark Abraham mark.j.abraham at gmail.com
Mon Jul 23 12:07:06 CEST 2018


Hi,

Most of those libraries are going to present the same problem and solution for
every piece of software installed with CMake on your cluster, so perhaps you
already have some local knowledge to exploit or share?

You can use -DCMAKE_EXE_LINKER_FLAGS="-static" (and probably also
-static-intel; see Intel's docs) in the first call of CMake, and perhaps it
will then find only static libraries, but I've never tried that. Otherwise,
all the find_library calls fill CMake cache variables that you can set
manually to the full path of the static library you need (e.g. for cuFFT see
http://docs.nvidia.com/cuda/cufft/index.html). Most places don't bother
with all this, and instead make a shared folder available on the compute
nodes at run time. That's not optimal for performance, however.
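
For example, something along these lines might work - a sketch only, and
untested: the static cuFFT path is a guess based on the usual CUDA 8 layout
(per NVIDIA's docs, static cuFFT also needs libculibos.a at link time), and
CUDA_cufft_LIBRARY is the cache variable name the stock FindCUDA module
normally fills:

cmake .. -DGMX_BUILD_OWN_FFTW=ON -DGMX_GPU=on \
  -DCMAKE_INSTALL_PREFIX=/mnt/shared/gromacs2018 \
  -DBUILD_SHARED_LIBS=OFF -DGMX_PREFER_STATIC_LIBS=ON \
  -DCMAKE_EXE_LINKER_FLAGS="-static" \
  -DCUDA_TOOLKIT_ROOT_DIR=/mnt/shared/cuda8 \
  -DCUDA_cufft_LIBRARY=/mnt/shared/cuda8/lib64/libcufft_static.a

After the first configure, running cmake -LA . in the build directory lists
the cache variables, so you can look for any that still point at .so files
and override those the same way.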

Mark

On Mon, Jul 23, 2018, 02:51 Shayna Hilburg <shilburg at mit.edu> wrote:

> Thank you!
> I see these:
>
> linux-vdso.so.1 =>  (0x00007ffc4f0df000)
> libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f10b6b51000)
> libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f10b694d000)
> librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f10b6745000)
> libcufft.so.8.0 => /mnt/shared/cuda8/lib64/libcufft.so.8.0 (0x00007f10ad8f6000)
> libmkl_intel_lp64.so => /home/shayna/miniconda3/lib/libmkl_intel_lp64.so (0x00007f10acde3000)
> libmkl_intel_thread.so => /home/shayna/miniconda3/lib/libmkl_intel_thread.so (0x00007f10aaa15000)
> libmkl_core.so => /home/shayna/miniconda3/lib/libmkl_core.so (0x00007f10a69e3000)
> libiomp5.so => /home/shayna/miniconda3/lib/libiomp5.so (0x00007f10a6608000)
> libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f10a62ff000)
> libstdc++.so.6 => /home/shayna/miniconda3/lib/libstdc++.so.6 (0x00007f10a5fc5000)
> libgcc_s.so.1 => /home/shayna/miniconda3/lib/libgcc_s.so.1 (0x00007f10a5db3000)
> libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f10a59e9000)
> /lib64/ld-linux-x86-64.so.2 (0x00007f10b6d6e000)
>
> Is there a simple way to get all of these onto the compute nodes, or do you
> have a suggestion for a better way to run GROMACS on our computing cluster?
>
> Thanks,
> Shayna
>
>
>
> On Sun, Jul 22, 2018 at 8:11 PM Mark Abraham <mark.j.abraham at gmail.com>
> wrote:
>
> > Hi,
> >
> > CMake will link to whatever it is allowed to find. What does ldd on the
> > executable report as the libraries being dynamically linked? Those are the
> > ones that cmake found for which there were apparently no static
> > equivalents.
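> >
> > For example (the path here just assumes the install prefix you used):
> >
> > ldd /mnt/shared/gromacs2018/bin/gmx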
> >
> > Mark
> >
> > On Sun, Jul 22, 2018, 18:16 Shayna Hilburg <shilburg at mit.edu> wrote:
> >
> > > Hi all,
> > >
> > > I'm trying to install GROMACS 2018 for use on GPUs. We typically keep the
> > > software on the master node and just call it through a mounted drive on
> > > the compute nodes. However, despite using static library flags, it appears
> > > there are still dynamic dependencies. It works fine on our master node,
> > > but not yet on the compute nodes.
> > > Does anyone have a method for installing and reliably running GROMACS in
> > > this way (with all libraries in a prescribed location)? Any help would be
> > > appreciated! At the least, a list of which libraries we need to install
> > > manually would be useful.
> > >
> > > Some information:
> > > - We tried these two different install commands:
> > >
> > > cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=on
> > > -DCMAKE_INSTALL_PREFIX=/mnt/shared/gromacs2018 -DBUILD_SHARED_LIBS=OFF
> > > -DGMX_FFT_LIBRARY=fftw3 -DCUDA_TOOLKIT_ROOT_DIR=/mnt/shared/cuda8
> > > -DGMX_PREFER_STATIC_LIBS=ON
> > >
> > > cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=on
> > > -DCMAKE_INSTALL_PREFIX=/mnt/shared/gromacs2018 -DGMX_BUILD_SHARED_EXE=OFF
> > > -DGMX_FFT_LIBRARY=fftw3 -DCUDA_TOOLKIT_ROOT_DIR=/mnt/shared/cuda8
> > >
> > > Thank you!
> > > Shayna
> > >
> > >
> > > --
> > >
> > >
> > > Shayna Hilburg
> > > Doctoral Candidate
> > > Massachusetts Institute of Technology
> > > Department of Materials Science and Engineering
> > > Program in Polymers and Soft Matter
> > > Alexander-Katz <http://soft-materials.scripts.mit.edu/www/> Group

