[gmx-users] Failed HardwareTopologyTest on Install (GROMACS 2019.4)

Mark Abraham mark.j.abraham at gmail.com
Wed Mar 18 07:53:07 CET 2020


Hi,

That's benign, and probably indicates some kind of mismatch in how hwloc
was linked (e.g. headers from one version with libraries from another) or
compiled (a different compiler or compiler version). mdrun will probably report
the same thing, but 2019 has fallback code, so you don't need hwloc support
for normal functionality; you could also simply reconfigure with
cmake -DGMX_HWLOC=off, e.g. as sketched below.
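As a rough sketch (assuming your existing build directory and keeping all the
other options from your command below unchanged; GMX_HWLOC is a real CMake
option in the 2019 series), the reconfigure would look something like:

cmake .. \
    -DCMAKE_C_COMPILER=mpicc \
    -DCMAKE_CXX_COMPILER=mpicxx \
    -DGMX_MPI=ON \
    -DGMX_HWLOC=OFF \
    [other options as before]
make && make check

If you first want to confirm a header/library mismatch, grepping the CMake
cache in the build directory (e.g. grep -i hwloc CMakeCache.txt) should show
which hwloc include directory and library were actually detected, so you can
compare their versions with what the cluster modules provide.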

Mark

On Tue, 17 Mar 2020 at 23:16, Adam Antoszewski <antoszewski at uchicago.edu>
wrote:

> Hey all,
>
> I am trying to install GROMACS 2019.4 (patched with PLUMED 2.5.3) on the
> Bridges <https://portal.xsede.org/psc-bridges> cluster, which uses Intel
> Haswell CPUs. When compiling/installing with the GNU compiler (GCC 4.8.4) and
> OpenMPI, the compilation runs fine. The cmake command I used was:
>
> cmake .. \
>     -DCMAKE_C_COMPILER=mpicc \
>     -DCMAKE_CXX_COMPILER=mpicxx \
>     -DGMX_MPI=ON \
>     -DCMAKE_INSTALL_PREFIX="${INSTALL_ROOT}" \
>     -DGMX_BUILD_OWN_FFTW=OFF \
>     -DGMX_FFT_LIBRARY=fftw3 \
>     -DREGRESSIONTEST_PATH="${BUILD_ROOT}/regressiontests-2019.4" \
>     -DGMX_SIMD=AVX2_256
>
> Upon running 'make check', all tests pass except one (45 of 46). The only
> failure is HardwareUnitTests, specifically HardwareTopologyTest.HwlocExecute.
> The error message I receive is reproduced below.
>
> -------
>
> 11/46 Test #11: HardwareUnitTests ...................***Failed    0.65 sec
>
> [==========] Running 5 tests from 2 test cases.
>
> [----------] Global test environment set-up.
>
> [----------] 1 test from CpuInfoTest
>
> [ RUN      ] CpuInfoTest.SupportLevel
>
> [       OK ] CpuInfoTest.SupportLevel (1 ms)
>
> [----------] 1 test from CpuInfoTest (1 ms total)
>
>
> [----------] 4 tests from HardwareTopologyTest
>
> [ RUN      ] HardwareTopologyTest.Execute
>
> [       OK ] HardwareTopologyTest.Execute (57 ms)
>
> [ RUN      ] HardwareTopologyTest.HwlocExecute
>
>
> /pylon5/mc5fphp/anto/gromacs_src/gromacs-2019.4/src/gromacs/hardware/tests/hardwaretopology.cpp:88:
> Failure
>
> Expected: (hwTop.supportLevel()) >=
> (gmx::HardwareTopology::SupportLevel::Basic), actual: 4-byte object <01-00
> 00-00> vs 4-byte object <02-00 00-00>
>
> Cannot determine basic hardware topology from hwloc. GROMACS will still
> work, but it might affect your performance for large nodes.
>
> Please mail gmx-developers at gromacs.org so we can try to fix it.
>
> [  FAILED  ] HardwareTopologyTest.HwlocExecute (51 ms)
>
> [ RUN      ] HardwareTopologyTest.ProcessorSelfconsistency
>
> [       OK ] HardwareTopologyTest.ProcessorSelfconsistency (50 ms)
>
> [ RUN      ] HardwareTopologyTest.NumaCacheSelfconsistency
>
> [       OK ] HardwareTopologyTest.NumaCacheSelfconsistency (48 ms)
>
> [----------] 4 tests from HardwareTopologyTest (206 ms total)
>
>
> [----------] Global test environment tear-down
>
> [==========] 5 tests from 2 test cases ran. (207 ms total)
>
> [  PASSED  ] 4 tests.
>
> [  FAILED  ] 1 test, listed below:
>
> [  FAILED  ] HardwareTopologyTest.HwlocExecute
>
>
>  1 FAILED TEST
>
>
> .
>
> .
>
> .
>
>
> 98% tests passed, 1 tests failed out of 46
>
>
> Label Time Summary:
>
> GTest              = 137.68 sec (40 tests)
>
> IntegrationTest    = 103.21 sec (5 tests)
>
> MpiTest            =   1.36 sec (3 tests)
>
> SlowTest           =  15.28 sec (1 test)
>
> UnitTest           =  19.18 sec (34 tests)
>
>
> Total Test time (real) = 1461.58 sec
>
>
> The following tests FAILED:
>
> 11 - HardwareUnitTests (Failed)
>
> Errors while running CTest
>
> make[3]: *** [CMakeFiles/run-ctest-nophys] Error 8
>
> make[2]: *** [CMakeFiles/run-ctest-nophys.dir/all] Error 2
>
> make[1]: *** [CMakeFiles/check
> --------
>
> The rest of the tests pass without issue. Any ideas? Is this an issue with
> hwloc, and is there anything I can do to fix it, or do I need to contact the
> sysadmins? Is this a big problem, even with the failed test? The nodes only
> ever have 24 CPUs, so I'm not sure if that counts as a 'large node' (or
> whether the number of nodes used per simulation would matter). Thanks in
> advance for the help! By the way, I am emailing this list because the error
> message told me to email gmx-developers, but I don't have permission to
> post there - I apologize if this is the wrong forum.
>
> Thanks so much!
>
> --
> Adam Antoszewski, PhD Candidate
> Dept. of Chemistry, The University of Chicago
> antoszewski at uchicago.edu