
bazel: patch to allow py-tensorflow (and likely other bazel packages)… #17013

Merged
scheibelp merged 1 commit into spack:develop from aweits:tensorflow_and_bazel_final
Jun 12, 2020

Conversation

@aweits (Contributor) commented Jun 8, 2020

bazel: patch to allow py-tensorflow (and likely other bazel packages) to build.

Bazel uses gcc's -MF option to write dependencies to a
file. After compilation, Bazel reads this file back and checks
the recorded include paths.

"Since gcc is given only relative paths on the command line,
non-system include paths here should never be absolute. If they
are, it's probably due to a non-hermetic #include, & we should stop
the build with an error."

Spack directly injects absolute paths, which appear in this file and
cause bazel to fail the build despite the fact that compilation
succeeded.

This patch disables this failure mode by default, and allows for it
to be turned back on by using the '~nodepfail' variant.
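The failure mode described above can be sketched as follows. This is an illustration of the check, not Bazel's actual implementation; the `SYSTEM_PREFIXES` whitelist and the sample `.d` file contents are assumptions for the example.

```python
# Sketch of the hermeticity check: parse a make-style dependency file
# (as written by gcc -MF) and flag absolute, non-system include paths,
# which is what trips Bazel's error on Spack-injected paths.

SYSTEM_PREFIXES = ("/usr/include", "/usr/lib")  # assumption: what counts as "system"

def parse_depfile(text):
    """Return the dependency paths listed in a make-style .d file."""
    # Join backslash-continued lines, then drop the "target:" prefix.
    joined = text.replace("\\\n", " ")
    _, _, deps = joined.partition(":")
    return deps.split()

def non_hermetic_paths(text):
    """Paths that would trip the check: absolute and outside system prefixes."""
    return [p for p in parse_depfile(text)
            if p.startswith("/") and not p.startswith(SYSTEM_PREFIXES)]

# Hypothetical .d file: one relative header, one system header, and one
# Spack-injected absolute include path.
depfile = """foo.o: foo.c foo.h \\
 /usr/include/stdio.h \\
 /spack/opt/zlib-1.2.11/include/zlib.h
"""

print(non_hermetic_paths(depfile))
# → ['/spack/opt/zlib-1.2.11/include/zlib.h']
```

The compilation itself succeeds; only this post-compilation scan fails, which is why disabling the check (rather than changing the compile line) is enough to let the build proceed.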

@aweits (Contributor, Author) commented Jun 8, 2020

@Sinan81 @adamjstewart

@adamjstewart (Member) commented:

I wonder if we should even bother with a variant for this.

Can someone loop in some Google/Bazel devs to see what they think about this approach? Perhaps we could convince them to upstream this.

@Sinan81 (Contributor) commented Jun 8, 2020

> I wonder if we should even bother with a variant for this.

I agree. It should be enabled by default!

@aweits (Contributor, Author) commented Jun 9, 2020

If memory serves, they had previously suggested developing a custom toolchain for each installed package, in which we would individually list all of the Spack-injected include paths (for that package) in cxx_builtin_include_directories. I'm not quite sure what an upstreamable version of this would look like: we really want to change the behavior of bazel for Spack as a whole, not for each individual package we build, since they will all hit some variant of the same issue.
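For context, the per-package toolchain approach mentioned above might look roughly like the following Starlark sketch (not runnable outside a Bazel workspace; the rule name, identifier, and paths are hypothetical). Every Spack-injected include directory would have to be listed in cxx_builtin_include_directories so Bazel treats it as a "system" path and exempts it from the hermeticity check:

```python
# Hypothetical Starlark sketch of a per-package cc toolchain config.
def _spack_cc_toolchain_config_impl(ctx):
    return cc_common.create_cc_toolchain_config_info(
        ctx = ctx,
        toolchain_identifier = "spack-gcc",  # hypothetical name
        compiler = "gcc",
        # One entry per Spack-injected dependency, per package being built:
        cxx_builtin_include_directories = [
            "/spack/opt/zlib/include",     # hypothetical paths
            "/spack/opt/openssl/include",
        ],
        # ...remaining required fields (target_cpu, target_libc, etc.) omitted
    )
```

This illustrates why the approach scales poorly: the list differs for every package and every Spack installation prefix, whereas patching the check changes the behavior once for all builds.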

@scheibelp scheibelp merged commit 67b8662 into spack:develop Jun 12, 2020
@scheibelp (Member) commented:

Thanks!

Regarding #17013 (comment), it would be useful to know why they added that logic - i.e. what types of failure cases it avoids.

likask pushed a commit to likask/spack that referenced this pull request Jun 15, 2020
manifestoso pushed a commit to DeepThoughtHPC/spack that referenced this pull request Jun 19, 2020
@aweits aweits deleted the tensorflow_and_bazel_final branch February 12, 2021 13:21