
Add version check when building intel-tbb with clang #13893

Merged
adamjstewart merged 2 commits into spack:develop from hainest:tbb_clang_check
Nov 27, 2019

Conversation

@hainest (Contributor) commented Nov 25, 2019

Clang builds incorrectly determine GCC version which in turn incorrectly causes a mismatch in C++ features resulting in a link error. This also means that clang builds require a gcc compiler to work correctly (this has always been the case). See uxlfoundation/oneTBB#147 for details.
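The guard this PR introduces can be sketched as a version check. This is an illustrative stand-in, not the actual package.py change: the function name and the YYYY.U version format are assumptions, and 2019.7 is the release that fixed the version query (per the discussion in this thread).

```python
# Illustrative sketch only -- not the real Spack directive. The idea:
# refuse (or flag) clang builds of intel-tbb older than 2019.7, the
# release that fixed gcc version detection (uxlfoundation/oneTBB#147).

def tbb_clang_build_ok(tbb_version, compiler):
    """Return True if this (version, compiler) pair avoids the
    gcc -dumpversion bug. Assumes a "YYYY.U" version string."""
    if compiler != "clang":
        return True  # the bug only bites clang builds that query gcc
    major, minor = (int(x) for x in tbb_version.split(".")[:2])
    # 2019.7 switched to `gcc -dumpfullversion -dumpversion`
    return (major, minor) >= (2019, 7)

print(tbb_clang_build_ok("2019.6", "clang"))  # False: affected
print(tbb_clang_build_ok("2019.7", "clang"))  # True: fixed
print(tbb_clang_build_ok("2018.5", "gcc"))    # True: gcc is unaffected
```

In the real package the check would be expressed with Spack's directive machinery rather than a bare function, but the version boundary logic is the same.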

@adamjstewart (Member)

Is this the case for both Clang and Apple Clang? Maybe it's only for Apple Clang which masks itself as GCC?

@hainest (Contributor, Author) commented Nov 26, 2019

@adamjstewart I'm actually not sure about Apple Clang. I don't have or use a mac, so I've not tested it there. My builds have all been on Linux.

I should have been more specific when I said "incorrectly causes a mismatch in C++ features resulting in a link error", as it doesn't always result in a link error in the client application; it depends on which features of TBB you are using. Indeed, TBB will build just fine under clang, but it produces headers and libraries that are inconsistent with the compiler's actual C++ support.

The problem stems from linux.inc:58, which used to read

    export gcc_version:=$(shell gcc -dumpversion)

which set gcc_version to only the major version of the compiler (i.e., __GNUC__). This was corrected in 2019.7 to

    export gcc_version:=$(shell gcc -dumpfullversion -dumpversion)

which returns the full dotted-decimal version containing major, minor, and patch. gcc_version is then used in the clang compilation process to determine the level of C++11 support (linux.clang.inc:55); hence the reliance on gcc even when building with clang.
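To make the difference concrete: with GCC >= 7, `gcc -dumpversion` prints only the major version (e.g. "7"), while `gcc -dumpfullversion -dumpversion` prints the full dotted version (e.g. "7.4.0"). The strings below are representative examples, not captured from a real compiler, and the parser is a sketch:

```python
# Example outputs of the two make lines (illustrative values):
old_gcc_version = "7"      # gcc -dumpversion (GCC >= 7: major only)
new_gcc_version = "7.4.0"  # gcc -dumpfullversion -dumpversion

def parse(version):
    """Split a dotted version string into (major, minor, patch),
    padding missing components with zeros."""
    parts = [int(p) for p in version.split(".")]
    parts += [0] * (3 - len(parts))
    return tuple(parts)

print(parse(old_gcc_version))  # (7, 0, 0) -- minor/patch information lost
print(parse(new_gcc_version))  # (7, 4, 0)
```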

Continuing down the chain, TBB_USE_GLIBCXX_VERSION sets __TBB_GLIBCXX_VERSION which is then used to determine the actual C++11 feature support. From the comment on line 47, it is clear that we are about to violate the constraints of what __TBB_GLIBCXX_VERSION is supposed to be, and failure is imminent. Starting at tbb_config.h:209, many of the feature detection macros will be set incorrectly because __TBB_GLIBCXX_VERSION will be a single-digit number instead of a five-digit one.
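The single-digit-versus-five-digit failure can be sketched as follows, assuming the build system flattens the dotted version by collapsing the dots (so "4.8.0" becomes 40800) and compares the result against thresholds like 40800 to gate C++11 features. The helper name is hypothetical and the exact makefile arithmetic may differ:

```python
# Sketch of the failure mode described above (assumed encoding).

def glibcxx_version(gcc_version):
    """Flatten "major.minor.patch" into one number, e.g. "4.8.0" -> 40800.
    A major-only string like "7" passes through unchanged -- the bug."""
    return int(gcc_version.replace(".", "0"))

# Full dotted version: feature gates behave as intended.
print(glibcxx_version("7.4.0"))            # 70400
print(glibcxx_version("7.4.0") >= 40800)   # True -> C++11 features enabled

# Major-only version from the old `gcc -dumpversion`:
print(glibcxx_version("7"))                # 7 -- single digit, not five
print(glibcxx_version("7") >= 40800)       # False -> features wrongly disabled
```

With __TBB_GLIBCXX_VERSION stuck at a single-digit value, every such threshold comparison fails, so the built headers and libraries disagree with what the compiler actually supports.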

@adamjstewart (Member)

Flake8:

var/spack/repos/builtin/packages/intel-tbb/package.py:51: [W293] blank line contains whitespace
var/spack/repos/builtin/packages/intel-tbb/package.py:52: [E501] line too long (87 > 79 characters)
var/spack/repos/builtin/packages/intel-tbb/package.py:53: [E501] line too long (84 > 79 characters)
var/spack/repos/builtin/packages/intel-tbb/package.py:54: [E501] line too long (86 > 79 characters)
var/spack/repos/builtin/packages/intel-tbb/package.py:59: [W191] indentation contains tabs
var/spack/repos/builtin/packages/intel-tbb/package.py:59: [E101] indentation contains mixed spaces and tabs
var/spack/repos/builtin/packages/intel-tbb/package.py:59: [E128] continuation line under-indented for visual indent
var/spack/repos/builtin/packages/intel-tbb/package.py:61: [E101] indentation contains mixed spaces and tabs

@mwkrentel (Member)

@hainest Did you actually try building the new versions in #13891?

For me, applying patch tbb_cmakeConfig.patch fails with:

==> Installing intel-tbb
==> Searching for binary cache of intel-tbb
==> No binary for intel-tbb found: installing from source
==> Warning: No Spack mirrors are currently configured
1 out of 1 hunk FAILED -- saving rejects to file cmake/templates/TBBConfig.cmake.in.rej
==> Fetching https://github.com/01org/tbb/archive/2019_U8.tar.gz
==> Staging archive: /home/krentel/tbb/spack-repo/build-stage/spack-stage-intel-tbb-2019.8-q5krf53lc47dilixfuryzd7ska3lbygh/2019_U8.tar.gz
==> Created stage in /home/krentel/tbb/spack-repo/build-stage/spack-stage-intel-tbb-2019.8-q5krf53lc47dilixfuryzd7ska3lbygh
==> Patch /home/krentel/tbb/spack-repo/var/spack/repos/builtin/packages/intel-tbb/tbb_cmakeConfig.patch failed.
==> Error: ProcessError: Command exited with status 1:
    '/usr/bin/patch' '-s' '-p' '0' '-i' '/home/krentel/tbb/spack-repo/var/spack/repos/builtin/packages/intel-tbb/tbb_cmakeConfig.patch' '-d' '.'

Between 2019.4 and 2019.5, TBB reworked TBBConfig.cmake.in and
changed how they set _tbb_release_lib and _tbb_debug_lib, so the patch
no longer applies cleanly.

I think the following patch for package.py fixes the problem. It's
just one line, so you could either fold it into this PR or I could
open a separate PR; either way works.

     # Patch cmakeConfig.cmake.in to find the libraries where we install them.
-    patch("tbb_cmakeConfig.patch", level=0, when='@2017.0:')
+    patch("tbb_cmakeConfig.patch", level=0, when='@2017.0:2019.4')
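The `when='@2017.0:2019.4'` constraint above relies on Spack's version ranges being inclusive at both ends: 2019.4 still gets the patch, while 2019.5 (the first release with the reworked TBBConfig.cmake.in) does not. A simplified two-component sketch of that range test, for illustration only:

```python
# Sketch of Spack's inclusive range semantics for '@2017.0:2019.4'.
# Real Spack version comparison handles more formats; this is simplified.

def in_range(version, lo="2017.0", hi="2019.4"):
    as_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return as_tuple(lo) <= as_tuple(version) <= as_tuple(hi)

print(in_range("2019.4"))  # True  -- patch still applies
print(in_range("2019.5"))  # False -- reworked TBBConfig.cmake.in, patch skipped
print(in_range("2016.4"))  # False -- predates the patched range
```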

@chissg Could you check that versions >= 2019.5 are Ok without
tbb_cmakeConfig.patch and don't need a revised version of the patch?

It seems to build and install cleanly for me, but I haven't really
investigated what goes wrong without tbb_cmakeConfig.patch, or
confirmed that 2019.5 and later truly don't need it.

@adamjstewart (Member)

Let's fix the patch in another PR. Either of you is welcome to submit it.

@adamjstewart adamjstewart merged commit c80792f into spack:develop Nov 27, 2019
@mwkrentel (Member)

Ok, it's more complicated than simply disabling the cmake patch.
The patch needs to be adapted for 2019.5 and later.

@chissg Would you like to help us out here?

@hainest hainest deleted the tbb_clang_check branch December 3, 2019 00:54
hainest added a commit to hainest/spack that referenced this pull request Dec 4, 2019
adamjstewart pushed a commit that referenced this pull request Dec 4, 2019