[gmx-users] regression test errors
Paul Bauer
paul.bauer.q at gmail.com
Wed Oct 23 13:44:28 CEST 2019
Hello Dave,
I thought it was something like that.
The error is harmless (just telling you that MPI is doing its job), and
the testing script gets confused because of the extra message in the
output file.
So I think you are good to go (and we need to do something about the
testing script).
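For anyone hitting the same thing: the OpenMPI note quoted below names the knob itself. A minimal sketch of silencing it, assuming a bash shell (OpenMPI MCA parameters can also be set as OMPI_MCA_-prefixed environment variables):

```shell
# Silence the harmless OpenMPI "openib not found" warning so its extra
# message no longer ends up in the output files that gmxtest.pl compares.
# Environment-variable form of the MCA parameter named in the warning:
export OMPI_MCA_btl_base_warn_component_unused=0

# Equivalent command-line form (example invocation, adjust to your run):
# mpirun --mca btl_base_warn_component_unused 0 -np 2 gmx_mpi mdrun ...
```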
Happy simulating!
Cheers
Paul
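The 'Finished' check Dave describes further down can be sketched as a grep one-liner (an assumption: it is run from inside build/tests/regressiontests, and each test directory holds a .log file that ends with a "Finished" line on success):

```shell
# List every .log file under the current tree that does NOT contain
# "Finished", i.e. any run that did not complete cleanly.
# grep -L prints only the names of files without a match, so no output
# means all logged runs finished.
grep -rL "Finished" --include="*.log" . || true
```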
On 23/10/2019 13:39, Dave M wrote:
> Hi Paul,
>
> I checked using this command for a specific folder, and I used '-mpirun
> mdrun' rather than '-mpirun mpirun':
>
>
> ./gmxtest.pl -mpirun mdrun -np 2 -noverbose rotation
>
>
>
> I get a lot of these errors:
>
>
> topol.tpr file different from ./reference_s.tpr. Check files in flex for
> flex
>
> FAILED. Check checktpr.out, checktpr.err file(s) in flex for flex
>
> topol.tpr file different from ./reference_s.tpr. Check files in flex-t for
> flex-t
>
> FAILED. Check checktpr.out, checktpr.err file(s) in flex-t for flex-t
>
> topol.tpr file different from ./reference_s.tpr. Check files in flex2 for
> flex2
>
> FAILED. Check checktpr.out, checktpr.err file(s) in flex2 for flex2
>
> topol.tpr file different from ./reference_s.tpr. Check files in flex2-t for
> flex2-t
>
> FAILED. Check checktpr.out, checktpr.err file(s) in flex2-t for flex2-t
>
>
> .... so on
>
>
>
> A) The only suspicious thing I see in checktpr.err is possibly different
> software versions.
>
>
> Command line:
>
> gmx_mpi check -s1 ./reference_s.tpr -s2 topol.tpr -tol 0.0001 -abstol
> 0.001
>
>
> Note: When comparing run input files, default tolerances are reduced.
>
> Reading file ./reference_s.tpr, VERSION 5.0-beta2-dev-20140130-02adca5
> (single precision)
>
> Note: file tpx version 96, software tpx version 116
>
> Reading file topol.tpr, VERSION 2019.4 (single precision)
>
>
>
>
> B) And the only suspicious thing I see in checktpr.out is pasted below (I
> have removed the host IP address). Just to mention, I use Amazon Web
> Services, so the following error is probably related to the instance being
> created, stored as an image, and then re-used with a different IP. Maybe I
> am just talking silly!
>
>
> [[2115,1],0]: A high-performance Open MPI point-to-point messaging module
>
> was unable to find any relevant network interfaces:
>
>
> Module: OpenFabrics (openib)
>
> Host: ip-xxx-xx-xx-xxx
>
>
> Another transport will be used instead, although this may result in
>
> lower performance.
>
>
> NOTE: You can disable this warning by setting the MCA parameter
>
> btl_base_warn_component_unused to 0.
>
>
> On Wed, Oct 23, 2019 at 4:10 AM Paul Bauer <paul.bauer.q at gmail.com> wrote:
>
>> Hello Dave,
>>
>> this is weird, no idea why it didn't work then.
>> You can try running the test suite manually in the folder you found with
>>
>> perl gmxtest.pl -mpirun mpirun -np X -noverbose
>>
>> That will show if the test binary works and should report any failing
>> tests.
>> Don't forget to source the GMXRC file before trying, though!
>>
>> Cheers
>>
>> Paul
>>
>>
>> On 23/10/2019 12:36, Dave M wrote:
>>> Hi Paul,
>>>
>>> Thanks for the 'mpirun -n X gmx_mpi mdrun'. It works now.
>>>
>>> Regarding tests, I found the folder here build/tests/regressiontests
>>> So I checked all the logs using a simple script (searching for the
>>> keyword 'Finished'), and it shows that all the log files finished
>>> properly in their corresponding folders. So the log files do not say
>>> anything here.
>>>
>>> On Wed, Oct 23, 2019 at 3:07 AM Paul Bauer <paul.bauer.q at gmail.com>
>> wrote:
>>>> Hello Dave,
>>>>
>>>> You need to use mpirun -n (number of processes) gmx_mpi mdrun to use an
>>>> MPI-enabled build of GROMACS. This is what the error message tries to
>>>> tell you, but we might need to improve on this.
>>>>
>>>> There should be a regressiontests folder somewhere in your build tree if
>>>> it downloaded the tests correctly.
>>>>
>>>> Cheers
>>>>
>>>> Paul
>>>>
>>>> On Wed, 23 Oct 2019, 12:02 Dave M, <dave.gromax at gmail.com> wrote:
>>>>
>>>>> Hi Paul,
>>>>>
>>>>> Thanks for your reply.
>>>>> a) I just checked; there is no tests/regressiontests, but another
>>>>> folder is there: tests/physicalvalidation.
>>>>> There is no log file.
>>>>> b) Regarding thread-MPI, I think it is not installed, because when I
>>>>> use a command like this:
>>>>>
>>>>>
>>>>> gmx_mpi mdrun -v -deffnm 03-run -rdd 2.0 -nt 2
>>>>>
>>>>> I get an error:
>>>>>
>>>>>
>>>>> Fatal error:
>>>>>
>>>>> Setting the total number of threads is only supported with thread-MPI
>> and
>>>>> GROMACS was compiled without thread-MPI
>>>>>
>>>>> I think (please correct me) gmx_mpi is for external MPI (OpenMPI in my
>>>>> case), so I tried just 'gmx mdrun' (not gmx_mpi), but then it says
>>>>> command not found. I am not sure what I missed in the installation
>>>>> cmake flags.
>>>>>
>>>>> Dave
>>>>>
>>>>> On Wed, Oct 23, 2019 at 2:45 AM Paul Bauer <paul.bauer.q at gmail.com>
>>>> wrote:
>>>>>> Hello Dave,
>>>>>>
>>>>>> Did you have a look into the log files from the regression tests under
>>>>>> tests/regressiontests?
>>>>>> They might give us some insight into what is happening.
>>>>>>
>>>>>> The warning with respect to thread-MPI is harmless; it just tells you
>>>>>> that you are using real MPI instead of thread-MPI.
>>>>>>
>>>>>> Cheers
>>>>>> Paul
>>>>>>
>>>>>> On Wed, 23 Oct 2019, 07:36 Dave M, <dave.gromax at gmail.com> wrote:
>>>>>>
>>>>>>> Hi All,
>>>>>>>
>>>>>>> Any hints/help much appreciated on why I am getting regression test
>>>>>>> failures. Also, to mention, I think thread-MPI was not installed, as
>>>>>>> I got an error saying "MPI is not compatible with thread-MPI.
>>>>>>> Disabling thread-MPI". How can I check the compatibility?
>>>>>>>
>>>>>>> Thanks.
>>>>>>>
>>>>>>> best regards,
>>>>>>> D
>>>>>>>
>>>>>>> On Sun, Oct 20, 2019 at 2:58 AM Dave M <dave.gromax at gmail.com>
>>>> wrote:
>>>>>>>> Hi All,
>>>>>>>>
>>>>>>>> I am trying to install gromacs2019.4 with:
>>>>>>>> cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=on
>>>>>>>> -DGMX_MPI=on -DGMX_GPU=on
>>>>>>>> -DCMAKE_INSTALL_PREFIX=/usr/local/gromacs/gromacs2019_4
>>>>>>>> -DGMX_FFT_LIBRARY=fftw3 -DCMAKE_BUILD_TYPE=Debug
>>>>>>>>
>>>>>>>> But 5 tests (41 to 45) failed, copied below:
>>>>>>>>
>>>>>>>>
>>>>>>>> The following tests FAILED:
>>>>>>>>
>>>>>>>> 41 - regressiontests/simple (Failed)
>>>>>>>>
>>>>>>>> 42 - regressiontests/complex (Failed)
>>>>>>>>
>>>>>>>> 43 - regressiontests/kernel (Failed)
>>>>>>>>
>>>>>>>> 44 - regressiontests/freeenergy (Failed)
>>>>>>>>
>>>>>>>> 45 - regressiontests/rotation (Failed)
>>>>>>>>
>>>>>>>> Errors while running CTest
>>>>>>>>
>>>>>>>> CMakeFiles/run-ctest-nophys.dir/build.make:57: recipe for target
>>>>>>>> 'CMakeFiles/run-ctest-nophys' failed
>>>>>>>>
>>>>>>>> make[3]: *** [CMakeFiles/run-ctest-nophys] Error 8
>>>>>>>>
>>>>>>>> CMakeFiles/Makefile2:1392: recipe for target
>>>>>>>> 'CMakeFiles/run-ctest-nophys.dir/all' failed
>>>>>>>>
>>>>>>>> make[2]: *** [CMakeFiles/run-ctest-nophys.dir/all] Error 2
>>>>>>>>
>>>>>>>> CMakeFiles/Makefile2:1172: recipe for target
>>>>>> 'CMakeFiles/check.dir/rule'
>>>>>>>> failed
>>>>>>>>
>>>>>>>> make[1]: *** [CMakeFiles/check.dir/rule] Error 2
>>>>>>>>
>>>>>>>> Makefile:626: recipe for target 'check' failed
>>>>>>>> make: *** [check] Error 2
>>>>>>>>
>>>>>>>> Not sure what could be wrong. Just to add, I get a warning during
>>>>>>>> installation which says "MPI is not compatible with thread-MPI.
>>>>>>>> Disabling thread-MPI". I am using -DGMX_MPI=on, and to have OpenMPI
>>>>>>>> on Ubuntu 18.04 I used "sudo apt-get install openmpi-bin
>>>>>>>> openmpi-common libopenmpi-dev".
>>>>>>>> Please let me know how I can fix this.
>>>>>>>>
>>>>>>>> best regards,
>>>>>>>> D
>>>>>>>>
>>>>>>>>
>>>>>>> --
>>>>>>> Gromacs Users mailing list
>>>>>>>
>>>>>>> * Please search the archive at
>>>>>>> http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before
>>>>>>> posting!
>>>>>>>
>>>>>>> * Can't post? Read http://www.gromacs.org/Support/Mailing_Lists
>>>>>>>
>>>>>>> * For (un)subscribe requests visit
>>>>>>> https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users
>>>> or
>>>>>>> send a mail to gmx-users-request at gromacs.org.
>>>>>>>
>> --
>> Paul Bauer, PhD
>> GROMACS Release Manager
>> KTH Stockholm, SciLifeLab
>> 0046737308594
>>
--
Paul Bauer, PhD
GROMACS Release Manager
KTH Stockholm, SciLifeLab
0046737308594