If the version list passed to `one_of_iff` is empty, it still generates a
pair of rules like this:

```prolog
node_compiler_version_satisfies("fujitsu-mpi", "arm", ":") :- 1 { } 1.
1 { } 1 :- node_compiler_version_satisfies("fujitsu-mpi", "arm", ":").
```

The empty cardinality constraints on the right of the first rule and the
left of the second are never satisfiable, so these rules do nothing.

- [x] Skip generating any rules at all for empty version lists (see the
  sketch below).
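A minimal sketch of the guard, assuming a generator function shaped roughly
like `one_of_iff` (the writer argument and the string-based emission here are
simplifications, not the actual Spack code):

```python
import sys
from typing import List, TextIO

def one_of_iff(out: TextIO, head: str, versions: List[str]) -> None:
    """Emit `head` iff exactly one of `versions` holds (hedged sketch)."""
    # An empty choice set makes `1 { } 1` unsatisfiable, so both rules
    # would be dead weight: emit nothing at all.
    if not versions:
        return
    choices = "; ".join(versions)
    out.write(f"{head} :- 1 {{ {choices} }} 1.\n")
    out.write(f"1 {{ {choices} }} 1 :- {head}.\n")

# An empty version list now produces no output:
one_of_iff(sys.stdout, 'node_compiler_version_satisfies("fujitsu-mpi", "arm", ":")', [])
```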
…e bodies.
`node_compiler_hard()` means that something explicitly asked for a node's
compiler to be set -- i.e., it's not inherited, it's required. We're
generating this in `spec_clauses` even for specs in rule bodies, which
results in conditions like this for optional dependencies:

In `py-torch/package.py`:

```python
depends_on('llvm-openmp', when='%apple-clang +openmp')
```

In the generated ASP:

```prolog
declared_dependency("py-torch","llvm-openmp","build")
    :- node("py-torch"),
       variant_value("py-torch","openmp","True"),
       node_compiler("py-torch","apple-clang"),
       node_compiler_hard("py-torch","apple-clang"),
       node_compiler_version_satisfies("py-torch","apple-clang",":").
```

The `node_compiler_hard` condition means we would have to *explicitly* set
py-torch's compiler to trigger the llvm-openmp dependency, rather than
just letting it be set by preferences. This is wrong; the dependency
should be there regardless of how the compiler was set.

- [x] Remove the `fn.node_compiler_hard()` call from `spec_clauses` when
  generating rule body clauses (see the sketch below).
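The shape of that change, as a heavily simplified sketch (the real
`spec_clauses` handles many more spec attributes; `fn` stands for the
function-symbol builder used to emit ASP, and `spec` is a Spack spec):

```python
def spec_clauses(self, spec, body=False):
    """Translate a spec into ASP clauses (hedged, simplified sketch)."""
    clauses = [fn.node(spec.name)]
    if spec.compiler:
        clauses.append(fn.node_compiler(spec.name, spec.compiler.name))
        # node_compiler_hard() asserts the compiler was *explicitly*
        # requested. That is a fact worth emitting for rule heads, but it
        # must not become a precondition when these clauses are used as a
        # rule body -- otherwise conditional dependencies only fire when
        # the compiler was set by hand.
        if not body:
            clauses.append(fn.node_compiler_hard(spec.name, spec.compiler.name))
        clauses.append(
            fn.node_compiler_version_satisfies(
                spec.name, spec.compiler.name, str(spec.compiler.versions)
            )
        )
    return clauses
```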
…ernal

This commit introduces a new rule:

```prolog
real_node(Package) :- not external(Package), node(Package).
```

which makes it possible to distinguish between an external node and a real
node, whose dependencies shouldn't be trimmed. It solves the case of
concretizing ninja with an external Python.
…variant

If the default of a multi-valued variant is set to multiple values, either in
package.py or in packages.yaml, we need to ensure that all of those values are
present in the concretized spec. Since each default value has a weight of 0
and the variant value is set implicitly by the concretizer, we need to add a
rule to maximize the number of default values that are used.
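For illustration, a hypothetical package.py excerpt with a multi-valued
variant whose default spans two values; after this change, concretizing with
no constraints should keep both values rather than an arbitrary subset:

```python
# Hypothetical directive, not from a real Spack package.
variant(
    "languages",
    default="c,c++",
    values=("c", "c++", "fortran"),
    multi=True,
    description="Languages to build support for",
)
```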
This commit addresses the case of concretizing a root spec with a transitive
conditional dependency on a virtual package provided by an external. Before
these modifications, default variant values for the dependency bringing in the
virtual package were not respected, and the external package providing the
virtual was added to the DAG.

The issue stems from two facts:
- Selecting a provider has higher precedence than selecting default variants.
- To ensure that an external is preferred, we used a negative weight.

To solve it, we shift all the provider weights so that:
- External providers have a weight of 0.
- Non-external providers have a weight of 10 or more.

Using a weight of zero for external providers means that having an external
provider, if present, or not having a provider at all has the same effect on
the higher-priority minimization.

Also fixed a few minor bugs in concretize.lp that were causing spurious
entries in the final answer set, and cleaned leftover rules out of
concretize.lp.
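To make the shift concrete, a minimal sketch of the intended weighting scheme
(the helper and the preference index are hypothetical; the real logic lives
in the ASP program):

```python
def provider_weight(is_external: bool, preference_index: int) -> int:
    # External providers weigh 0, so "an external provider is present" and
    # "no provider is needed at all" contribute identically to the
    # higher-priority minimization; non-external providers start at 10
    # and grow with their preference rank.
    return 0 if is_external else 10 + preference_index
```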
This PR reworks a few attributes in the container subsection of spack.yaml to
permit the injection of custom base images when generating containers with
Spack.

In more detail, users can still specify the base operating system and the
Spack version they want to use:

```yaml
spack:
  container:
    images:
      os: ubuntu:18.04
      spack: develop
```

in which case the generated recipe will use one of the Spack images built on
Docker Hub for the build stage and the base OS image in the final stage.
Alternatively, they can specify the two base images explicitly:

```yaml
spack:
  container:
    images:
      build: spack/ubuntu-bionic:latest
      final: ubuntu:18.04
```

and it will be up to them to ensure their consistency.

Additional changes:
* This commit adds documentation on the two approaches.
* Users can now specify OS packages to install (e.g. with apt or yum) prior
  to the build (previously this was only available for the final image).
* Handles to avoid updating the available system packages have been added to
  the configuration, to facilitate generating recipes that permit
  deterministic builds.
…pack#18260)

`spack load` and `spack env activate` now use the prefix inspections defined
in `modules.yaml`. This allows users to customize/override environment
variable modifications if desired. If no `prefix_inspections` configuration
is present, Spack uses the values in the default configuration.
As of spack#18260, `spack load` and `spack env activate` now use `prefix_inspections` from the modules configuration to decide how to modify environment variables. This updates the modules configuration documentation to describe how to update environment variables with the `prefix_inspections` section. This also updates the `spack load` and environments documentation to refer to the new `prefix_inspections` documentation.
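Conceptually, a prefix inspection maps subdirectories of an install prefix to
the environment variables that should pick them up. A self-contained Python
sketch of the idea (the mapping shown is illustrative, not Spack's default
configuration):

```python
import os

# Illustrative mapping in the spirit of prefix_inspections in modules.yaml.
PREFIX_INSPECTIONS = {"bin": ["PATH"], "lib": ["LD_LIBRARY_PATH"], "man": ["MANPATH"]}

def env_updates(prefix: str) -> dict:
    """Collect per-variable path prepends for one install prefix."""
    updates = {}
    for subdir, variables in PREFIX_INSPECTIONS.items():
        path = os.path.join(prefix, subdir)
        if os.path.isdir(path):  # only inspect directories that exist
            for var in variables:
                updates.setdefault(var, []).append(path)
    return updates
```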
The `deprecatedProperties` custom validator can now accept a function to
compute a better error message. This improves the error/warning messages for
deprecated properties.
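A hedged sketch of the idea in jsonschema-validator terms (the schema keys
and the callable convention here are assumptions, not the exact Spack API):

```python
import warnings
import jsonschema

def deprecated_properties(validator, deprecations, instance, schema):
    # `message` may now be a callable that builds a message from the
    # offending instance, instead of a fixed string.
    msg = deprecations["message"]
    if callable(msg):
        msg = msg(instance)
    if deprecations.get("error", False):
        yield jsonschema.ValidationError(msg)
    else:
        warnings.warn(msg)
```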
…k#19962)

* Added -level_zero, -rocm, and -opencl flags and the sha256 for TAU v2.30.
* Removed the depends_on clause for OpenCL and added a variant for OneAPI level_zero.
* Removed depends_on rocm.
* Removed depends_on rocprofiler.

Co-authored-by: eugeneswalker <[email protected]>
Users can add `test()` methods to their packages to run smoke tests on
installations with the new `spack test` command (the old `spack test` is
now `spack unit-test`). `spack test` is environment-aware, so you can
`spack install` an environment and then run `spack test run` to run smoke
tests on all of its packages. Historical test logs can be perused with
`spack test results`. Included are generic smoke tests for MPI
implementations and for C, C++, and Fortran compilers, as well as specific
smoke tests for 18 packages.
Inside the test method, individual tests can be run separately (and
continue to run best-effort after a test failure) using the `run_test`
method. The `run_test` method encapsulates finding test executables,
running and checking return codes, checking output, and error handling.
This handles the following trickier aspects of testing with direct
support in Spack's package API:
- [x] Caching source or intermediate build files at build time for
use at test time.
- [x] Test dependencies.
- [x] Packages that require a compiler for testing (such as library-only
      packages).
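As a hedged illustration of this API (the compiler name and the cached file
are hypothetical), a `test()` method inside a package class might look like:

```python
import os

def test(self):
    """Run smoke tests on the installed package (sketch)."""
    # Source files cached at build time land in the test suite's data dir.
    src = os.path.join(self.test_suite.current_test_data_dir, "hello.c")
    # run_test finds the executable, runs it, checks status and output,
    # and continues best-effort if this particular check fails.
    self.run_test("cc", options=["-o", "hello", src],
                  purpose="compile a cached hello-world source")
    self.run_test("hello", expected=["Hello"], status=[0],
                  purpose="run the compiled smoke test")
```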
See the packaging guide for more details on using Spack testing support.
Included is support for package.py files for virtual packages. This does
not change the Spack interface, but is a major change in internals.
Co-authored-by: Tamara Dahlgren <[email protected]>
Co-authored-by: wspear <[email protected]>
Co-authored-by: Adam J. Stewart <[email protected]>
This adds a new `mark` command that can be used to mark packages as either
explicitly or implicitly installed. Apart from fixing the package database
after installing a dependency manually, it can be used to implement upgrade
workflows as outlined in spack#13385.

The following commands demonstrate how the `mark` and `gc` commands can be
used to keep only the current version of a package installed:

```console
$ spack install pkgA
$ spack install pkgB
$ git pull  # Imagine new versions for pkgA and/or pkgB are introduced
$ spack mark -i -a
$ spack install pkgA
$ spack install pkgB
$ spack gc
```

If there is no new version for a package, `install` will simply mark it as
explicitly installed and `gc` will not remove it.

Co-authored-by: Greg Becker <[email protected]>
There is a post-install routine in `ipykernel` that needs to be called for proper registration with jupyter.
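A hedged sketch of such a hook in package.py terms (the exact invocation is
an assumption here, not a quote of the real py-ipykernel recipe):

```python
@run_after("install")
def register_kernel_spec(self):
    # ipykernel ships a kernel-spec installer that must run after install
    # so that jupyter can discover the kernel under this prefix.
    python = self.spec["python"].command
    python("-m", "ipykernel", "install", "--prefix={0}".format(self.prefix))
```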
* [hep] add hep tag to relevant packages
* [lcio] add hep label
* Create a HipPackage base class and do some refactoring.
* Add comments, and add a conflict to raja for openmp with hip.
Updated boto3 dependency and removed useless comments.
* py-ipykernel: fix bug in phase method
* Fix bug in executable calling
* charmpp: various fixes
  - Change URLs to https.
  - Address deprecated/renamed versions.
  - Make it build with the cmake build system.
* flake8
* Apply suggestions from code review.

Co-authored-by: Adam J. Stewart <[email protected]>
…k#19957)

* AMD ROCm 3.9.0 release: bump up the version for aomp and roctracer-dev, and update hip/hip-rocclr.
* Update package.py.
Fixed a hard tab in the flux-sched edit and an unbound hwloc in flux-core,
after testing, to better support modern MPIs in spack environments. Verified
that [email protected] is when hwloc@2: became viable.
Co-authored-by: Christian Kniep <[email protected]>
This fixes a logging error observed on macOS 11.0.1 (Big Sur). When
performing a Spack install in debugging mode (e.g. `spack -d install
py-scipy`), Spack is supposed to write a log of compiler wrapper command line
invocations to the current working directory. Due to a regression introduced
by spack#18205, these files were no longer generated, and Spack was printing
errors such as "No such file or directory: None/." This is because the log
file directory gets set from `spack.main.spack_working_dir`, but that
variable is not set in the spawned process. This PR ensures that the working
directory (at the time of the `spack install` invocation) is persisted to the
subprocess.
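The gist of the fix, reduced to a self-contained sketch (names here are
hypothetical; the real code passes Spack's recorded working directory to the
build child):

```python
import multiprocessing
import os

def build_process(working_dir: str) -> None:
    # Module-level globals set in the parent (like
    # spack.main.spack_working_dir) are not inherited by a spawned child,
    # so the parent passes its cwd explicitly and the child restores it
    # before writing any logs.
    os.chdir(working_dir)
    print("child logs into:", os.getcwd())

if __name__ == "__main__":
    ctx = multiprocessing.get_context("spawn")
    child = ctx.Process(target=build_process, args=(os.getcwd(),))
    child.start()
    child.join()
```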
I was keeping the old `clingo` driver code around in case we had to run
using the command line tool instead of through the Python interface.
So far, the command line is faster than running through Python, but I'm
working on fixing that. I found that if I do this:
```python
control = clingo.Control()
control.load("concretize.lp")   # the concretizer's logic program
control.load("hdf5.lp")         # code from spack solve --show asp hdf5
control.load("display.lp")      # directives controlling what is shown
control.ground([("base", [])])  # instantiate the loaded program
control.solve(...)
```
It's just as fast as the command line tool. So we can always generate the
code and load it manually if we need to -- we don't need two drivers for
clingo. Given that the Python interface is also the only way to get unsat
cores, I think we pretty much have to use it.
So, I'm removing the old command line driver and other unused code. We
can dig it up again from the history if it is needed.
Enabling PSATD is no longer mutually exclusive with other runtime options, so we can always compile with support for it to improve usability.
Require boost with cxxstd=14, rather than 11, when cxxstd=14 is selected.
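In package.py terms the constraint presumably reads like the following sketch
(not the exact diff):

```python
# Match boost's C++ standard to the package's own cxxstd variant.
depends_on("boost cxxstd=11", when="cxxstd=11")
depends_on("boost cxxstd=14", when="cxxstd=14")
```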
frankwillmore pushed a commit that referenced this pull request on May 12, 2023:
1. Support version 3.1.3, which now depends on sundials@6.
2. Support version 3.1.2:, which broke the two patch files; the two patch
   files have therefore been replaced by more flexible filter_file() commands
   inside a patch() function (see the sketch after the console log below).
3. Rename the variant for the python extension from the package name
   ("+pyuqtk") to the more standard "+python".
4. Add maintainers @omsai and the upstream developer @bjdebus, who offered
   to help with the spack packaging.
5. swig should only be a build-time dependency; swig is only necessary
   until @:3.1.0.
6. Confirmed the python dependencies are correct by inspecting imports,
   narrowed the python dependencies to type ("build", "run"), and confirmed
   that all 31 build-time tests pass, including the 9 python tests:

```console
$ spack env create uqtk-dev
$ spack add [email protected]
$ spack install --test root && cat $(spack location -i uqtk)/.spack/install-time-test-log.txt
==> Testing package uqtk-3.1.3-nok6fut
==> [2023-04-19-14:56:25.005361] Running build-time tests
==> [2023-04-19-14:56:25.005536] RUN-TESTS: build-time tests [check]
==> [2023-04-19-14:56:25.009543] '/home/omsai/src/spack/opt/spack/linux-pureos10-skylake/gcc-10.2.1/gmake-4.4.1-b6g4apmfvxz3bn4eabh37dehcrg65fj7/bin/make' '-j4' '-n' 'test'
==> [2023-04-19-14:56:25.014903] '/home/omsai/src/spack/opt/spack/linux-pureos10-skylake/gcc-10.2.1/gmake-4.4.1-b6g4apmfvxz3bn4eabh37dehcrg65fj7/bin/make' '-j4' 'test'
Running tests...
/home/omsai/src/spack/opt/spack/linux-pureos10-skylake/gcc-10.2.1/cmake-3.26.3-zjmsfz23j5l4ytniz26uzvxonlu5qebr/bin/ctest --force-new-ctest-process
Test project /tmp/omsai/spack-stage/spack-stage-uqtk-3.1.3-nok6fut47h42cnaau7wkoohgqy5f2qqa/spack-build-nok6fut
      Start  1: ArrayReadAndWrite
      Start  2: ArrayDelColumn
      Start  3: Array1DMiscTest
      Start  4: Array2DMiscTest
 1/31 Test  #1: ArrayReadAndWrite ................   Passed    0.01 sec
      Start  5: ArraySortTest
 2/31 Test  #2: ArrayDelColumn ...................   Passed    0.01 sec
      Start  6: MultiIndexTest
 3/31 Test  #3: Array1DMiscTest ..................   Passed    0.01 sec
      Start  7: CorrTest
 4/31 Test  #4: Array2DMiscTest ..................   Passed    0.01 sec
      Start  8: QuadLUTest
 5/31 Test  #5: ArraySortTest ....................   Passed    0.02 sec
      Start  9: MCMC2dTest
 6/31 Test  #6: MultiIndexTest ...................   Passed    0.01 sec
      Start 10: MCMCRandomTest
 7/31 Test  #8: QuadLUTest .......................   Passed    0.02 sec
      Start 11: MCMCNestedTest
 8/31 Test #10: MCMCRandomTest ...................   Passed    0.02 sec
      Start 12: Deriv1dTest
 9/31 Test #12: Deriv1dTest ......................   Passed    0.01 sec
      Start 13: SecondDeriv1dTest
10/31 Test #13: SecondDeriv1dTest ................   Passed    0.01 sec
      Start 14: GradHessianTest
11/31 Test #11: MCMCNestedTest ...................   Passed    0.03 sec
      Start 15: GradientPCETest
12/31 Test #14: GradHessianTest ..................   Passed    0.01 sec
      Start 16: PCE1dTest
13/31 Test #15: GradientPCETest ..................   Passed    0.01 sec
      Start 17: PCEImplTest
14/31 Test #16: PCE1dTest ........................   Passed    0.01 sec
      Start 18: PCELogTest
15/31 Test #18: PCELogTest .......................   Passed    0.01 sec
      Start 19: Hessian2dTest
16/31 Test #19: Hessian2dTest ....................   Passed    0.01 sec
      Start 20: BCS1dTest
17/31 Test #20: BCS1dTest ........................   Passed    0.01 sec
      Start 21: BCS2dTest
18/31 Test #21: BCS2dTest ........................   Passed    0.01 sec
      Start 22: LowRankRegrTest
19/31 Test #22: LowRankRegrTest ..................   Passed    0.01 sec
      Start 23: PyModTest
20/31 Test #17: PCEImplTest ......................   Passed    0.07 sec
      Start 24: PyArrayTest
21/31 Test #23: PyModTest ........................   Passed    0.08 sec
      Start 25: PyArrayTest2
22/31 Test #25: PyArrayTest2 .....................   Passed    0.30 sec
      Start 26: PyQuadTest
23/31 Test #24: PyArrayTest ......................   Passed    1.44 sec
      Start 27: PyBCSTest1D
24/31 Test #26: PyQuadTest .......................   Passed    1.68 sec
      Start 28: PyBCSTest2D
25/31 Test #27: PyBCSTest1D ......................   Passed    1.66 sec
      Start 29: PyBADPTest
26/31 Test  #7: CorrTest .........................   Passed    3.43 sec
      Start 30: PyRegressionTest
27/31 Test #28: PyBCSTest2D ......................   Passed    1.50 sec
      Start 31: PyGalerkinTest
28/31 Test  #9: MCMC2dTest .......................   Passed    3.90 sec
29/31 Test #29: PyBADPTest .......................   Passed    1.66 sec
30/31 Test #30: PyRegressionTest .................   Passed    1.72 sec
31/31 Test #31: PyGalerkinTest ...................   Passed    1.63 sec

100% tests passed, 0 tests failed out of 31

Total Test time (real) =   5.35 sec
==> [2023-04-19-14:56:30.382797] '/home/omsai/src/spack/opt/spack/linux-pureos10-skylake/gcc-10.2.1/gmake-4.4.1-b6g4apmfvxz3bn4eabh37dehcrg65fj7/bin/make' '-j4' '-n' 'check'
==> [2023-04-19-14:56:30.385983] Target 'check' not found in Makefile
```
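The filter_file() pattern from item 2 above, as a hedged sketch (the edited
file and the replaced strings are hypothetical):

```python
def patch(self):
    # More flexible than a static patch file: rewrite the offending line
    # in place at stage time, tolerating upstream reformatting.
    filter_file(
        "find_package(SUNDIALS REQUIRED)",
        "find_package(SUNDIALS 6 REQUIRED)",
        "CMakeLists.txt",
        string=True,
    )
```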