[gmx-users] Problem with CUDA
Borchert, Christopher B ERDC-RDE-ITL-MS Contractor
Christopher.B.Borchert at erdc.dren.mil
Mon Apr 9 15:57:18 CEST 2018
Thanks. That was already set by the cuda module. It didn't help me.
Chris
-----Original Message-----
From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se [mailto:gromacs.org_gmx-users-bounces at maillist.sys.kth.se] On Behalf Of Szilárd Páll
Sent: Saturday, April 07, 2018 11:28 AM
To: Discussion list for GROMACS users <gmx-users at gromacs.org>
Cc: gromacs.org_gmx-users at maillist.sys.kth.se
Subject: Re: [gmx-users] Problem with CUDA
Sorry, I forgot to mention that for dynamic linking you (might?) need to export CRAYPE_LINK_TYPE=dynamic.
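For context, a minimal sketch of where that variable fits into the build sequence on a Cray XC, assuming PrgEnv-gnu and the cudatoolkit/fftw modules are already loaded (the configure arguments are the shortened ones discussed below):

export CRAYPE_LINK_TYPE=dynamic   # tell the Cray compiler wrappers to link dynamically
cd gromacs-2018.1 && mkdir -p build && cd build
CC=cc CXX=CC cmake ../ -DGMX_MPI=ON -DGMX_GPU=ON -DCMAKE_PREFIX_PATH=${FFTW_DIR}/..
make 2>&1 | tee make.log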
--
Szilárd
On Sat, Apr 7, 2018 at 12:09 AM, Borchert, Christopher B ERDC-RDE-ITL-MS Contractor <Christopher.B.Borchert at erdc.dren.mil>
wrote:
> Unfortunately, using your shortened cmake args I still get fPIC
> errors. The build does complete statically, though, with
> -DGMX_BUILD_SHARED_EXE=OFF added.
>
> CC=cc CXX=CC cmake ../ -DGMX_SIMD=AVX2_256 -DGMX_MPI=ON -DGMX_GPU=ON -DCMAKE_PREFIX_PATH=${FFTW_DIR}/..
>
> /usr/bin/ld: CMakeFiles/libgromacs.dir/mdlib/nbnxn_cuda/libgromacs_generated_nbnxn_cuda.cu.o: relocation R_X86_64_32 against `_Z58nbnxn_kernel_ElecEwQSTabTwinCut_VdwLJEwCombLB_F_prune_cuda11cu_atomdata10cu_nbparam8cu_plistb' can not be used when making a shared object; recompile with -fPIC
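>
> For reference, a sketch of the static configure + build that does complete, assuming the same PrgEnv-gnu environment and loaded cudatoolkit/fftw modules:
>
> CC=cc CXX=CC cmake ../ -DGMX_SIMD=AVX2_256 -DGMX_MPI=ON -DGMX_GPU=ON \
>     -DCMAKE_PREFIX_PATH=${FFTW_DIR}/.. \
>     -DGMX_BUILD_SHARED_EXE=OFF
> make 2>&1 | tee make.log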
>
> Thanks,
> Chris
>
> -----Original Message-----
> From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se
> [mailto:gromacs.org_gmx-users-bounces at maillist.sys.kth.se] On Behalf
> Of Szilárd Páll
> Sent: Friday, April 06, 2018 2:40 PM
> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
> Cc: gromacs.org_gmx-users at maillist.sys.kth.se
> Subject: Re: [gmx-users] Problem with CUDA
>
> On CSCS Piz Daint I use the following command line (assuming PrgEnv-gnu), where everything in "[]" is optional and compilation should work just fine without it.
>
> CC=cc CXX=CC cmake ../ \
>     -DGMX_SIMD=THE_RIGHT_SIMD_FLAVOR \
>     -DGMX_MPI=ON \
>     -DGMX_GPU=ON \
>     -DCMAKE_PREFIX_PATH=${FFTW_DIR}/.. \
>     [ -DGMX_FFT_LIBRARY=fftw3 \
>       -DGMX_CUDA_TARGET_SM=60 \
>       -DGMX_PREFER_STATIC_LIBS=ON \
>       -DBUILD_SHARED_LIBS=OFF \
>       -DGMX_BUILD_MDRUN_ONLY=ON \
>       -DGMX_EXTERNAL_BLAS=OFF -DGMX_EXTERNAL_LAPACK=OFF ]
>
> In fact, other than "-DCMAKE_PREFIX_PATH=${FFTW_DIR}/.." and setting the right compiler wrappers, the rest is usually unnecessary for a "vanilla" build.
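>
> I.e., a minimal sketch of such a vanilla configure; GMX_SIMD is still worth setting explicitly when the login node's CPU differs from the compute nodes':
>
> CC=cc CXX=CC cmake ../ -DGMX_MPI=ON -DGMX_GPU=ON -DCMAKE_PREFIX_PATH=${FFTW_DIR}/..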
>
> Cheers,
> --
> Szilárd
>
>
> On Fri, Apr 6, 2018 at 8:32 PM, Borchert, Christopher B
> ERDC-RDE-ITL-MS Contractor <Christopher.B.Borchert at erdc.dren.mil>
> wrote:
>> You are trying to give me a hint. :) My cmake args are taken from a co-worker, and the option syntax is from the CMakeCache.txt file. On a Cray you force cc/CC, and all the module libraries/headers should be found automatically. Strangely, it didn't find fftw without help. Regardless, I get the fPIC error. But you've given me a path to investigate. Thanks.
>>
>> cmake .. \
>>     -DGMX_GPU=ON \
>>     -DCMAKE_C_COMPILER:FILEPATH=`which cc` \
>>     -DCMAKE_CXX_COMPILER:FILEPATH=`which CC` \
>>     -DGMX_FFT_LIBRARY=fftw3 \
>>     -DFFTWF_LIBRARY:FILEPATH=${FFTW_DIR}/libfftw3f.so \
>>     -DFFTWF_INCLUDE_DIR:PATH=$FFTW_INC
>>
>> /usr/bin/ld: CMakeFiles/libgromacs.dir/mdlib/nbnxn_cuda/libgromacs_generated_nbnxn_cuda.cu.o: relocation R_X86_64_32 against `_Z58nbnxn_kernel_ElecEwQSTabTwinCut_VdwLJEwCombLB_F_prune_cuda11cu_atomdata10cu_nbparam8cu_plistb' can not be used when making a shared object; recompile with -fPIC
>>
>> Chris
>>
>> -----Original Message-----
>> From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se
>> [mailto:gromacs.org_gmx-users-bounces at maillist.sys.kth.se] On Behalf
>> Of Szilárd Páll
>> Sent: Friday, April 06, 2018 12:05 PM
>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>> Cc: gromacs.org_gmx-users at maillist.sys.kth.se
>> Subject: Re: [gmx-users] Problem with CUDA
>>
>> FYI: not even my vanilla (non-Cray) local build, which otherwise works, succeeds with cmake . -DCUDA_NVCC_FLAGS:STRING="-rdc=true", so, as I guessed, that's the culprit.
>>
>> Out of curiosity: what's the reason for the "inventive" use of CMake options, none of which are needed?
>>
>> --
>> Szilárd
>>
>>
>> On Fri, Apr 6, 2018 at 6:57 PM, Szilárd Páll <pall.szilard at gmail.com> wrote:
>>> I think the fPIC errors can't be caused by a missing -rdc=true, because that flag refers to the GPU _device_ code; GROMACS does not need relocatable device code, so it should not be necessary.
>>> --
>>> Szilárd
>>>
>>>
>>> On Fri, Apr 6, 2018 at 6:33 PM, Borchert, Christopher B
>>> ERDC-RDE-ITL-MS Contractor <Christopher.B.Borchert at erdc.dren.mil>
>>> wrote:
>>>> Thanks Szilárd. My understanding is that -rdc is nvcc's equivalent of -fPIC. I get fPIC errors without it. In fact, I get fPIC errors without including -fPIC explicitly in the C/CXX flags.
>>>>
>>>> /usr/bin/ld: CMakeFiles/libgromacs.dir/mdlib/nbnxn_cuda/libgromacs_generated_nbnxn_cuda.cu.o: relocation R_X86_64_32 against `_Z58nbnxn_kernel_ElecEwQSTabTwinCut_VdwLJEwCombLB_F_prune_cuda11cu_atomdata10cu_nbparam8cu_plistb' can not be used when making a shared object; recompile with -fPIC
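>>>>
>>>> (Not tried here, just a sketch: an alternative way to get position-independent host code into those .cu.o objects, without requesting relocatable device code, is to pass -fPIC through nvcc to the host compiler, e.g.
>>>>
>>>> -DCUDA_NVCC_FLAGS:STRING="-Xcompiler=-fPIC"
>>>>
>>>> since -Xcompiler only forwards the flag to the host C++ compiler and does not change how device code is generated.)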
>>>>
>>>> So I removed boost, avx2, mpi, and dynamic but get the same result. What else should I remove?
>>>>
>>>> cmake .. \
>>>>     -DCMAKE_VERBOSE_MAKEFILE:BOOL=TRUE \
>>>>     -DCMAKE_C_COMPILER:FILEPATH=`which cc` \
>>>>     -DCMAKE_C_FLAGS:STRING=-fPIC \
>>>>     -DCMAKE_CXX_COMPILER:FILEPATH=`which CC` \
>>>>     -DCMAKE_CXX_FLAGS:STRING=-fPIC \
>>>>     -DCMAKE_INSTALL_PREFIX:PATH=$PREFIX \
>>>>     -DGMX_FFT_LIBRARY=fftw3 \
>>>>     -DFFTWF_LIBRARY:FILEPATH=${FFTW_DIR}/libfftw3f.so \
>>>>     -DFFTWF_INCLUDE_DIR:PATH=$FFTW_INC \
>>>>     -DGMX_GPU=ON \
>>>>     -DCUDA_NVCC_FLAGS:STRING="-rdc=true"
>>>>
>>>> /opt/cray/pe/craype/2.5.13/bin/CC -march=core-avx2 -fPIC -std=c++11 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast CMakeFiles/template.dir/template.cpp.o -o ../../bin/template -Wl,-rpath,/p/work/borchert/gromacs-2018.1/build/lib ../../lib/libgromacs.so.3.1.0 -fopenmp -lm
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_59_tmpxft_000001e3_00000000_21_pmalloc_cuda_compute_61_cpp1_ii_63d60154'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_57_tmpxft_0000a64f_00000000_21_pme_spread_compute_61_cpp1_ii_d982d3ad'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_71_tmpxft_000003a4_00000000_21_cuda_version_information_compute_61_cpp1_ii_8ab8dc1d'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_58_tmpxft_0000a80b_00000000_21_pme_timings_compute_61_cpp1_ii_75ae0e44'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_0000a10b_00000000_21_pme_3dfft_compute_61_cpp1_ii_79dff388'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_57_tmpxft_00009bd7_00000000_21_nbnxn_cuda_compute_61_cpp1_ii_f147f02c'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_50_tmpxft_0000a9c8_00000000_21_pme_compute_61_cpp1_ii_6dbf966c'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_0000a490_00000000_21_pme_solve_compute_61_cpp1_ii_06051a94'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_0000ab85_00000000_21_cudautils_compute_61_cpp1_ii_25933dd5'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_54_tmpxft_0000aefc_00000000_21_pinning_compute_61_cpp1_ii_5d0f4aae'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_0000ad42_00000000_21_gpu_utils_compute_61_cpp1_ii_70828085'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_57_tmpxft_0000a2d0_00000000_21_pme_gather_compute_61_cpp1_ii_a7a2f9c7'
>>>> ../../lib/libgromacs.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_67_tmpxft_00009f4e_00000000_21_nbnxn_cuda_data_mgmt_compute_61_cpp1_ii_a1eafeba'
>>>>
>>>> Chris
>>>>
>>>> -----Original Message-----
>>>> From: gromacs.org_gmx-users-bounces at maillist.sys.kth.se
>>>> [mailto:gromacs.org_gmx-users-bounces at maillist.sys.kth.se] On
>>>> Behalf Of Szilárd Páll
>>>> Sent: Friday, April 06, 2018 10:17 AM
>>>> To: Discussion list for GROMACS users <gmx-users at gromacs.org>
>>>> Cc: gromacs.org_gmx-users at maillist.sys.kth.se
>>>> Subject: Re: [gmx-users] Problem with CUDA
>>>>
>>>> Hi,
>>>>
>>>> What is the reason for using the custom CMake options? What is the -rdc=true for? I don't think it's needed, and it could very well be causing the issue. Have you tried to do an as-vanilla-as-possible build?
>>>>
>>>> --
>>>> Szilárd
>>>>
>>>>
>>>> On Thu, Apr 5, 2018 at 6:52 PM, Borchert, Christopher B
>>>> ERDC-RDE-ITL-MS Contractor <Christopher.B.Borchert at erdc.dren.mil>
>>>> wrote:
>>>>> Hello. I'm taking a working build from a co-worker and trying to add GPU support on a Cray XC. CMake works but make fails. Both 2016 and 2018 die at the same point: the linker can't find GROMACS's own routines.
>>>>>
>>>>> 2016.5:
>>>>> /opt/cray/pe/craype/2.5.13/bin/CC -march=core-avx2 -O2 -fPIC -dynamic -std=c++0x -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast -dynamic CMakeFiles/template.dir/template.cpp.o -o ../../bin/template -Wl,-rpath,/p/work/cots/gromacs-2016.5/build/lib:/opt/nvidia/cudatoolkit8.0/8.0.54_2.3.12_g180d272-2.2/lib64/stubs -dynamic ../../lib/libgromacs_mpi.so.2.5.0 -fopenmp -lcudart /opt/nvidia/cudatoolkit8.0/8.0.54_2.3.12_g180d272-2.2/lib64/stubs/libnvidia-ml.so -lhwloc -lz -ldl -lrt -lm -lfftw3f
>>>>> ../../lib/libgromacs_mpi.so.2.5.0: undefined reference to `__cudaRegisterLinkedBinary_59_tmpxft_0001bc78_00000000_21_pmalloc_cuda_compute_61_cpp1_ii_63d60154'
>>>>> ../../lib/libgromacs_mpi.so.2.5.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_0001bac2_00000000_21_gpu_utils_compute_61_cpp1_ii_d70ebee0'
>>>>> ../../lib/libgromacs_mpi.so.2.5.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_0001b90b_00000000_21_cudautils_compute_61_cpp1_ii_24d20763'
>>>>> ../../lib/libgromacs_mpi.so.2.5.0: undefined reference to `__cudaRegisterLinkedBinary_71_tmpxft_0001c016_00000000_21_cuda_version_information_compute_61_cpp1_ii_e35285be'
>>>>> ../../lib/libgromacs_mpi.so.2.5.0: undefined reference to `__cudaRegisterLinkedBinary_57_tmpxft_0001b592_00000000_21_nbnxn_cuda_compute_61_cpp1_ii_6e47f057'
>>>>> ../../lib/libgromacs_mpi.so.2.5.0: undefined reference to `__cudaRegisterLinkedBinary_67_tmpxft_0001b754_00000000_21_nbnxn_cuda_data_mgmt_compute_61_cpp1_ii_a1eafeba'
>>>>> collect2: error: ld returned 1 exit status
>>>>>
>>>>> 2018.1:
>>>>> /opt/cray/pe/craype/2.5.13/bin/CC -march=core-avx2 -O2 -fPIC -dynamic -std=c++11 -O3 -DNDEBUG -funroll-all-loops -fexcess-precision=fast -dynamic CMakeFiles/template.dir/template.cpp.o -o ../../bin/template -Wl,-rpath,/p/work/cots/gromacs-2018.1/build/lib ../../lib/libgromacs_mpi.so.3.1.0 -fopenmp -lm
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_000068a5_00000000_21_pme_3dfft_compute_61_cpp1_ii_79dff388'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_67_tmpxft_00006621_00000000_21_nbnxn_cuda_data_mgmt_compute_61_cpp1_ii_a1eafeba'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_57_tmpxft_00006f47_00000000_21_pme_spread_compute_61_cpp1_ii_d982d3ad'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_00006d70_00000000_21_pme_solve_compute_61_cpp1_ii_06051a94'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_59_tmpxft_00008da7_00000000_21_pmalloc_cuda_compute_61_cpp1_ii_63d60154'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_50_tmpxft_00007930_00000000_21_pme_compute_61_cpp1_ii_6dbf966c'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_58_tmpxft_00007382_00000000_21_pme_timings_compute_61_cpp1_ii_75ae0e44'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_57_tmpxft_00006b11_00000000_21_pme_gather_compute_61_cpp1_ii_a7a2f9c7'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_00007f9f_00000000_21_cudautils_compute_61_cpp1_ii_25933dd5'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_54_tmpxft_000088f9_00000000_21_pinning_compute_61_cpp1_ii_5d0f4aae'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_57_tmpxft_000039b7_00000000_21_nbnxn_cuda_compute_61_cpp1_ii_f147f02c'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_71_tmpxft_000091d4_00000000_21_cuda_version_information_compute_61_cpp1_ii_8ab8dc1d'
>>>>> ../../lib/libgromacs_mpi.so.3.1.0: undefined reference to `__cudaRegisterLinkedBinary_56_tmpxft_00008407_00000000_21_gpu_utils_compute_61_cpp1_ii_70828085'
>>>>> collect2: error: ld returned 1 exit status
>>>>>
>>>>> BUILD INSTRUCTIONS:
>>>>> module swap PrgEnv-cray PrgEnv-gnu
>>>>> module swap gcc gcc/5.3.0
>>>>> export CRAYPE_LINK_TYPE=dynamic
>>>>>
>>>>> module load cudatoolkit/8.0.54_2.3.12_g180d272-2.2
>>>>> module load cmake/gcc-6.3.0/3.7.2
>>>>> module load fftw/3.3.4.11
>>>>> export BOOST_DIR=/app/unsupported/boost/1.64.0-gcc-6.3.0
>>>>>
>>>>> export PREFIX=/app/unsupported/gromacs/201x.x
>>>>> mkdir $PREFIX
>>>>>
>>>>> cmake .. \
>>>>> -DCMAKE_VERBOSE_MAKEFILE:BOOL=TRUE \
>>>>> -DCMAKE_C_COMPILER:FILEPATH=`which cc` \
>>>>> -DCMAKE_C_FLAGS:STRING="-O2 -fPIC -dynamic" \
>>>>> -DCMAKE_CXX_COMPILER:FILEPATH=`which CC` \
>>>>> -DCMAKE_CXX_FLAGS:STRING="-O2 -fPIC -dynamic" \
>>>>> -DCMAKE_INSTALL_PREFIX:PATH=$PREFIX \
>>>>> -DGMX_FFT_LIBRARY=fftw3 \
>>>>> -DCMAKE_EXE_LINKER_FLAGS:STRING=-dynamic \
>>>>> -DGMX_SIMD=AVX2_256 \
>>>>> -DFFTWF_LIBRARY:FILEPATH=${FFTW_DIR}/libfftw3f.so \
>>>>> -DFFTWF_INCLUDE_DIR:PATH=$FFTW_INC \
>>>>> -DBoost_DIR:PATH=$BOOST_DIR \
>>>>> -DBoost_INCLUDE_DIR:PATH=${BOOST_DIR}/include \
>>>>> -DBoost_LIBRARY_DIR:PATH=${BOOST_DIR}/lib \
>>>>> -DGMX_MPI=ON \
>>>>> -DGMX_GPU=ON \
>>>>> -DCUDA_NVCC_FLAGS:STRING="-rdc=true" \
>>>>> -DCUDA_USE_STATIC_CUDA_RUNTIME:BOOL=OFF
>>>>>
>>>>> make 2>&1 | tee make.log
>>>>>
>>>>> Thanks,
>>>>> Chris
>>>>>
--
Gromacs Users mailing list

* Please search the archive at http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a mail to gmx-users-request at gromacs.org.