Spack in CI #32749
-
We've been using Spack semi-successfully in HPCToolkit's CI for the majority of a month now. I say semi-successfully because I've spun out 4 outstanding PRs over the course of our CI's (continued) development that fix issues with buildcaches or [...].
To summarize our Spack usage: we maintain a set of environments with our software dependencies, and Spack installs them in every CI job where we build HPCToolkit. We pull from the public buildcache + #32327 whenever possible ([...]).
Re. binary package caching and unpacking, the current incarnation of our CI mixes 4 workarounds: [...]
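For reference, the gist of that setup can be sketched as a CI job script. This is a minimal sketch, not our actual configuration: the mirror name, URL, and environment path below are placeholders.

```shell
# Sketch of a CI job that installs HPCToolkit's dependencies from an
# environment, preferring binaries from a buildcache mirror.
# Mirror URL and environment path are illustrative placeholders.

# Register the binary mirror (public buildcache or a project-local one)
spack mirror add ci-cache https://example.com/spack-buildcache

# Trust the mirror's signing keys so signed binaries can be verified
spack buildcache keys --install --trust

# Activate the environment that pins our dependency specs
spack env activate ./ci/environments/deps

# Install: Spack reuses binaries from the mirror when hashes match,
# and falls back to source builds otherwise
spack install --fail-fast
```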
Before #32136, we used a different approach where we cached the entire install tree instead. We had a couple of hiccups with this approach that made it slow and difficult to operate: [...]
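For contrast, the buildcache-based approach amounts to pushing freshly built specs at the end of the job instead of caching the install tree wholesale. A hedged sketch follows; the mirror name is an assumption, and older Spack releases spell the push command `spack buildcache create`.

```shell
# After a successful build, push everything in the environment to the
# binary mirror so later jobs can fetch instead of rebuild.
spack env activate ./ci/environments/deps

# Push all installed specs in the environment; --unsigned skips GPG
# signing (use a real signing key in production).
spack buildcache push --unsigned ci-cache

# Refresh the mirror's index so later 'spack install' runs can
# discover the new binaries without listing the whole bucket.
spack buildcache update-index ci-cache
```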
-
Since we're griping about Spack in CI, I also wanted to mention another hiccup we ran across in HPCToolkit. We intend to test on a matrix of 11 OS + architecture combinations (we have 6 enabled ATM), and we use 2 environments (with views), for a total of 22 environment concretizations we want to maintain packages for, all in a single buildcache. The problem is [...].
To work around this problem, we separate the [...].
Obviously this is a rather atrocious setup; ideally this would all happen in a single job spanning the entire configuration matrix. But I'm not aware of any current way to do that, nor of a simple extension that would improve the situation. Suggestions would be welcome.
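As a rough illustration of the per-configuration split described above (the job layout, environment names, and paths here are hypothetical, not our actual CI):

```shell
# One CI job per OS/arch cell of the matrix. Each job concretizes
# its own copy of the environments and installs into the one shared
# buildcache's view of the world. Names below are illustrative only.
for env in deps tests; do
  spack env activate "./ci/environments/${env}"
  # Re-concretize so the lockfile matches the OS/arch this job
  # actually runs on, rather than wherever it was last concretized.
  spack concretize --force
  spack install --fail-fast
  spack env deactivate
done
```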
-
See also: #32958
-
Recently I've been having some difficulty managing my buildcache and wanted to share some experiences, and maybe get some insights from anyone who has ideas on how to make buildcaches better, specifically around how buildcaches are populated via the generated CI pipelines.
I asked before and was told that there is an intelligent algorithm that detects when a recipe has meaningfully changed and triggers a rebuild. In my experience this hasn't been the case. For instance, I had a package that was incorrectly packaged: I accidentally left out a dependency, shipped it to the buildcache, and found out much later. So I fixed the recipe, but I had to go into the GCP bucket and remove the actual [...].
So that is to say that there really should be more tools for managing the artifacts in the buildcache and querying information on the state of the buildcache, or at least more output from [...].
So far it's been pretty scary doing anything with the buildcache, because I don't know what will just break it completely and whether I'm better off just starting from scratch, which is getting more infeasible as users increase and the number of packages explodes as more applications are put on it. So anything (even just documentation) would help a lot for adoption in an industrial setting, as there is already some anxiety about moving away from things like conda.
Looking forward, I think it would be great to have some kind of web UI for managing this, and maybe even support from artifact registries for the buildcache (and source mirrors too!).
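For what it's worth, the few knobs I'm aware of for poking at a buildcache from the CLI look roughly like this; this is inspection and index repair, not the richer management tooling being asked for, and the mirror name is a placeholder.

```shell
# List what the configured mirrors' indices claim is available,
# across all architectures, with full hashes so entries can be
# matched against the tarballs actually sitting in the bucket.
spack buildcache list --allarch --long

# After manually deleting a bad tarball from the bucket (e.g. with
# gsutil), rebuild the index so it stops advertising the removed spec.
spack buildcache update-index ci-cache
```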
-
Wanted to start a general thread for discussing successes, failures, and potential solutions around leveraging Spack in Continuous Integration pipelines.
I'll start by copying some discussion that occurred in the Slack chat room recently.
@salotz-sitx / @salotz
[...]
@blue42u
This is a bit disappointing since even for a relatively small number of binaries this makes CI run very slowly.