[gmx-developers] ICC 14 support

Shirts, Michael R. (mrs5pt) mrs5pt at eservices.virginia.edu
Tue Sep 30 17:07:20 CEST 2014


Is there a plan (long term) to do (essentially) automated performance tests, so that we can run consistent(ish) checks on new code changes and then post the results in a way that's easy(ish) for others to interpret?

Best,
~~~~~~~~~~~~
Michael Shirts
Associate Professor
Department of Chemical Engineering
University of Virginia
michael.shirts at virginia.edu
(434)-243-1821

From: Mark Abraham <mark.j.abraham at gmail.com>
Reply-To: "gmx-developers at gromacs.org" <gmx-developers at gromacs.org>
Date: Tuesday, September 30, 2014 at 10:59 AM
To: Discussion list for GROMACS development <gmx-developers at gromacs.org>
Subject: Re: [gmx-developers] ICC 14 support



On Tue, Sep 30, 2014 at 4:48 PM, Kevin Chen <fch6699 at gmail.com> wrote:
Have you guys compared GROMACS performance between icc and gcc? From Szilárd's note it seems gcc is better, am I right?

Yeah, I'd take that one to the bank. Of course, we're always interested to hear of observations (either way).

Mark

Cheers,

Kevin


-----Original Message-----
From: gromacs.org_gmx-developers-bounces at maillist.sys.kth.se [mailto:gromacs.org_gmx-developers-bounces at maillist.sys.kth.se] On Behalf Of Szilárd Páll
Sent: Monday, September 29, 2014 3:50 PM
To: Discussion list for GROMACS development
Subject: Re: [gmx-developers] ICC 14 support

BTW: with gcc you'll have less trouble combining with CUDA, as well as better performance!

Cheers,
--
Szilárd


On Mon, Sep 29, 2014 at 10:14 PM, Kevin Chen <fch6699 at gmail.com> wrote:
> Hi Roland,
>
>
>
> Thanks for the reply! Looks like the error messages were generated by
> CUDA 6.0 (see errors below) instead of GROMACS. Switching back to
> ICC 13.1 and turning off static libraries worked for us.
>
>
>
> Best,
>
>
>
> Kevin
>
>
>
>
>
>
>
> ======================================================================
>
> In file included from /opt/apps/cuda/6.0/include/cuda_runtime.h(59),
>                  from /admin/build/admin/rpms/stampede/BUILD/gromacs-5.0.1/src/gromacs/gmxlib/gpu_utils/gpu_utils.cu(0):
> /opt/apps/cuda/6.0/include/host_config.h(72): catastrophic error: #error directive: -- unsupported ICC configuration! Only ICC 13.1 on Linux x86_64 is supported!
>   #error -- unsupported ICC configuration! Only ICC 13.1 on Linux x86_64 is supported!
>    ^
>
> In file included from /opt/apps/cuda/6.0/include/cuda_runtime.h(59),
>                  from /admin/build/admin/rpms/stampede/BUILD/gromacs-5.0.1/src/gromacs/gmxlib/cuda_tools/pmalloc_cuda.cu(0):
> /opt/apps/cuda/6.0/include/host_config.h(72): catastrophic error: #error directive: -- unsupported ICC configuration! Only ICC 13.1 on Linux x86_64 is supported!
>   #error -- unsupported ICC configuration! Only ICC 13.1 on Linux x86_64 is supported!
>    ^
>
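> For reference, that message comes from a host-compiler guard in CUDA 6.0's host_config.h. The sketch below is only a rough illustration of what such a guard looks like; the exact version test is an assumption, but the effect is what the log shows: nvcc rejects any host ICC other than 13.1 (__INTEL_COMPILER is ICC's standard version macro, 1310 == ICC 13.1).
>
>     /* Rough sketch of the host-compiler guard in CUDA 6.0's host_config.h.
>      * The version test is an assumption for illustration only. */
>     #if defined(__INTEL_COMPILER)
>     #  if __INTEL_COMPILER != 1310
>     #    error -- unsupported ICC configuration! Only ICC 13.1 on Linux x86_64 is supported!
>     #  endif
>     #endif
>
> So the restriction is on the nvcc side rather than in GROMACS, which is why switching the host compiler back to ICC 13.1 (or building the CUDA parts with gcc) makes it go away.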
>
>
>
>
>
>
>
> From: gromacs.org_gmx-developers-bounces at maillist.sys.kth.se
> [mailto:gromacs.org_gmx-developers-bounces at maillist.sys.kth.se] On
> Behalf Of Roland Schulz
> Sent: Monday, September 29, 2014 11:16 AM
> To: gmx-developers at gromacs.org
> Subject: Re: [gmx-developers] ICC 14 support
>
>
>
> Hi,
>
>
>
> what problem do you have with ICC 14? Both ICC 14 and ICC 15 should work fine.
> There was an issue with ICC 14 + static libraries (9e8061e13f48, 4.6.7
> and 5.0.1) and ICC 14 + unit tests (b0e60e91add6, 5.0.1). Both are fixed in
> release-5-0 and will be included in 5.0.2. You can either use the
> release-5-0 branch, just apply the patch, wait till 5.0.2 (should be
> soon), or don't use static libraries (the default) and don't try to run
> the unit tests (the compiler issue isn't present in the main code, so
> even though the unit tests fail, the actual program is OK).
>
>
>
> Roland
>
>
>
> On Mon, Sep 29, 2014 at 10:35 AM, Kevin Chen <fch6699 at gmail.com> wrote:
>
> Hi Guys,
>
> Was wondering if GROMACS will support ICC 14 in the near future?
>
> Kevin Chen, Ph.D.
> HPC Applications, TACC
>
>
> -----Original Message-----
> From: gromacs.org_gmx-developers-bounces at maillist.sys.kth.se
> [mailto:gromacs.org_gmx-developers-bounces at maillist.sys.kth.se] On
> Behalf Of Alexey Shvetsov
> Sent: Sunday, September 28, 2014 3:09 PM
> To: gmx-developers at gromacs.org
> Subject: Re: [gmx-developers] Possible bug in gmx
>
> Mark Abraham wrote on 28-09-2014 18:17:
>> How about a redmine issue - this thread's not about GROMACS
>> development, per se ;-)
>
> Sorry about that =D
>
> Redmine issue: http://redmine.gromacs.org/issues/1607
> with the relevant tpr file attached.
>>
>> Mark
>>
>> On Sun, Sep 28, 2014 at 3:18 PM, Alexey Shvetsov
>> <alexxy at omrb.pnpi.spb.ru> wrote:
>>
>>> Hi Berk!
>>>
>>> It's not a cut-and-paste error, and there are no pdb dumps.
>>> I have also seen this error before with other systems.
>>>
>>> I can provide the tpr file for that system:
>>>
>>> https://biod.pnpi.spb.ru/~alexxy/gmx/psa_pep_ctrl.md_npt.tpr
>>>
>>> Berk Hess wrote on 28-09-2014 16:37:
>>>
>>> Hi,
>>>
>>> I assume that your old and new coordinates being identical is correct
>>> and not a cut-and-paste error.
>>> This seems a bit strange, or do you freeze part of the system?
>>> The only things moving here are then the domain boundaries, and I
>>> don't see an issue there, since they only moved a little.
>>>
>>> Do you have any more output besides the error message? PDB dump
>>> files maybe?
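>>>
>>> To make the check concrete: roughly speaking, after the cell boundaries are
>>> updated, each home charge group's position is compared against the new cell
>>> limits in each direction, and the run aborts if it lies further outside than
>>> the allowed migration distance. The sketch below is only a simplified
>>> illustration with made-up names, not the actual GROMACS code; the real check
>>> in domdec.cpp is more involved (it also handles periodic shifts and where the
>>> group was previously assigned), which is why the reported "distance out of
>>> cell" need not match this naive comparison.
>>>
>>>     #include <stdio.h>
>>>
>>>     /* Simplified illustration only (not the actual GROMACS code). */
>>>     static int cg_moved_too_far(double x, double cell_lo, double cell_hi,
>>>                                 double allowed)
>>>     {
>>>         double out = 0.0;
>>>         if (x < cell_lo)      out = x - cell_lo;  /* negative: below the lower bound */
>>>         else if (x > cell_hi) out = x - cell_hi;  /* positive: above the upper bound */
>>>
>>>         if (out < -allowed || out > allowed)
>>>         {
>>>             fprintf(stderr, "distance out of cell %f exceeds allowed %f\n",
>>>                     out, allowed);
>>>             return 1;
>>>         }
>>>         return 0;
>>>     }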
>>>
>>> Cheers,
>>>
>>> Berk
>>>
>>> On 09/28/2014 02:22 PM, Alexey Shvetsov wrote:
>>> Hi,
>>>
>>> Just want to add that this error seems to be reproducible even on a
>>> single node. I also get the same error for GPU runs.
>>> However, I don't see it in large systems (800k+ atoms) running on a
>>> large number of CPUs (512+).
>>>
>>> Alexey Shvetsov wrote on 28-09-2014 13:44:
>>> Hi,
>>>
>>> The DD grid is
>>>
>>> Domain decomposition grid 4 x 1 x 1, separate PME ranks 0
>>> PME domain decomposition: 4 x 1 x 1
>>>
>>> for the 4-node setup, and
>>>
>>> Domain decomposition grid 4 x 2 x 1, separate PME ranks 0
>>> PME domain decomposition: 4 x 2 x 1
>>>
>>> for the 8-node setup.
>>>
>>> It's reproducible with the 5.0 release and the latest git master. I'll try to
>>> check whether it's reproducible with 1 node. I can also provide the tpr file
>>> for this system.
>>>
>>> Mark Abraham wrote on 28-09-2014 13:28:
>>> Hi,
>>>
>>> It's hard to say from that information. There were some issues fixed
>>> in the lead-up to GROMACS 5 with DD not always working with 2
>>> domains in a direction, but that's a pure guess. I'd assume you can
>>> reproduce this with the release-5-0 branch. Do you observe it with a single domain?
>>> If not, then it's surely a bug (and should be submitted to redmine).
>>>
>>> Mark
>>>
>>> On Sun, Sep 28, 2014 at 11:18 AM, Alexey Shvetsov
>>> <alexxy at omrb.pnpi.spb.ru> wrote:
>>>
>>> Hi all!
>>>
>>> I'm doing some tests with a small peptide and constantly getting this
>>> error. I get it with a few systems.
>>>
>>> System sizes are around 10k or 20k atoms.
>>> I run it on 4 or 8 old nodes, each with two Xeon 54xx-series CPUs.
>>>
>>> starting mdrun '2ZCH_3 in water'
>>> 50000000 steps, 100000.0 ps (continuing from step 1881000, 3762.0 ps).
>>>
>>> Step 13514000:
>>> The charge group starting at atom 6608 moved more than the distance
>>> allowed by the domain decomposition (1.112924) in direction X
>>> distance out of cell -1.193103
>>> Old coordinates:    5.467  0.298  3.636
>>> New coordinates:    5.467  0.298  3.636
>>> Old cell boundaries in direction X:    4.037  5.382
>>> New cell boundaries in direction X:    4.089  5.452
>>>
>>> --------------------------------------------------------------------------
>>> MPI_ABORT was invoked on rank 3 in communicator MPI_COMM_WORLD with
>>> errorcode 1.
>>>
>>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>>> You may or may not see output from other processes, depending on
>>> exactly when Open MPI kills them.
>>> --------------------------------------------------------------------------
>>>
>>> -------------------------------------------------------
>>> Program mdrun_mpi, VERSION 5.1-dev-20140922-20c00a9-dirty-unknown
>>> Source code file:
>>> /var/tmp/alexxy/portage/sci-chemistry/gromacs-9999/work/gromacs-9999/src/gromacs/mdlib/domdec.cpp,
>>> line: 4388
>>>
>>> Fatal error:
>>> A charge group moved too far between two domain decomposition steps
>>> This usually means that your system is not well equilibrated
>>> For more information and tips for troubleshooting, please check the
>>> GROMACS website at http://www.gromacs.org/Documentation/Errors
>>> -------------------------------------------------------
>>>
>>> --
>>> Best Regards,
>>> Alexey 'Alexxy' Shvetsov, PhD
>>> Department of Molecular and Radiation Biophysics
>>> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
>>> Leningrad region, Gatchina, Russia
>>> alexxyum at gmail.com
>>> alexxy at omrb.pnpi.spb.ru
>>
>>  --
>>  Best Regards,
>>  Alexey 'Alexxy' Shvetsov, PhD
>>  Department of Molecular and Radiation Biophysics
>>  FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
>>  Leningrad region, Gatchina, Russia
>>  alexxyum at gmail.com
>>  alexxy at omrb.pnpi.spb.ru
>>
>
> --
> Best Regards,
> Alexey 'Alexxy' Shvetsov, PhD
> Department of Molecular and Radiation Biophysics
> FSBI Petersburg Nuclear Physics Institute, NRC Kurchatov Institute,
> Leningrad region, Gatchina, Russia
> alexxyum at gmail.com
> alexxy at omrb.pnpi.spb.ru
>
> --
> ORNL/UT Center for Molecular Biophysics cmb.ornl.gov 865-241-1537,
> ORNL PO BOX 2008 MS6309
>
>

