Merged
When using Python 3.9.6, Spack is no longer able to fetch anything. Commands like `spack fetch` and `spack install` all break. Python 3.9.6 includes a [new change](https://github.com/python/cpython/pull/25853/files#diff-b3712475a413ec972134c0260c8f1eb1deefb66184f740ef00c37b4487ef873eR462) that means that `scheme` must be a string, it cannot be None. The solution is to use an empty string like the method default. Fixes spack#24644. Also see Homebrew/homebrew-core#80175 where this issue was discovered by CI. Thanks @branchvincent for reporting such a serious issue before any actual users encountered it! Co-authored-by: Todd Gamblin <[email protected]>
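The incompatibility can be sketched like this (an illustrative snippet, not Spack's actual code):

```python
from urllib.parse import urlsplit

# Spack previously passed scheme=None through to urlsplit(). Starting
# with Python 3.9.6, the library assumes scheme is a string, so None
# now raises:
#
#     urlsplit("/some/path", scheme=None)   # AttributeError on 3.9.6+
#
# Passing an empty string, which matches the method's default, works on
# all versions:
parts = urlsplit("/some/path", scheme="")
print(parts.scheme, parts.path)  # '' and '/some/path'
```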
**Member (Author):** no longer necessary
This PR fixes two problems with clang/llvm's version detection. clang's version output looks like this:

```
clang version 11.0.0
Target: x86_64-unknown-linux-gnu
```

This caused clang's version to be misdetected as:

```
[email protected]
Target:
```

This resulted in errors when trying to actually use it as a compiler. When using `spack external find`, we couldn't determine the compiler version, resulting in errors like this:

```
==> Warning: "[email protected]+clang+lld+lldb" has been detected on the system but will not be added to packages.yaml [reason=c compiler not found for [email protected]+clang+lld+lldb]
```

Changing the regex to only match until the end of the line fixes these problems.

Fixes: spack#19473
Co-authored-by: Tiziano Müller <[email protected]>
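The effect of anchoring the match at the end of the line can be sketched as follows (the patterns here are illustrative, not Spack's exact regexes):

```python
import re

output = "clang version 11.0.0\nTarget: x86_64-unknown-linux-gnu\n"

# A pattern whose wildcard also matches newlines spills into the next
# line of clang's output (illustrative of the old behavior):
greedy = re.search(r"clang version (.+)", output, re.DOTALL)
print(repr(greedy.group(1)))  # captures the "Target:" line too

# Matching only until the end of the line stops at the version number:
anchored = re.search(r"clang version (.+)$", output, re.MULTILINE)
print(anchored.group(1))  # 11.0.0
```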
Spack's source mirror was previously in a plain old S3 bucket. That will still work, but we can do better. This switches to AWS's CloudFront CDN for hosting the mirror. CloudFront is 16x faster (or more) than the old bucket.

- [x] change mirror to https://mirror.spack.io
**Member (Author):** One test failure on macOS: restarted it to see if it's just flaky
**Member (Author):** This error is persistent...
**Member (Author):** macOS failure is not fixed by yet another pickle PR, so let's see if they pass without...
(#24794) This adds lockfile tracking to Spack's lock mechanism, so that we ensure that there is only one open file descriptor per inode.

The `fcntl` locks that Spack uses are associated with an inode and a process. This is convenient, because if a process exits, it releases its locks. Unfortunately, this also means that if you close a file, *all* locks associated with that file's inode are released, regardless of whether the process has any other open file descriptors on it.

Because of this, we need to track open lock files so that we only close them when a process no longer needs them. We do this by tracking each lockfile by its inode and process id. This has several nice properties:

1. Tracking by pid ensures that, if we fork, we don't inadvertently track the parent process's lockfiles. `fcntl` locks are not inherited across forks, so we'll just track new lockfiles in the child.
2. Tracking by inode ensures that references are counted per inode, and that we don't inadvertently close a file whose inode still has open locks.
3. Tracking by both pid and inode ensures that we only open lockfiles the minimum number of times necessary for the locks we have.

Note: as mentioned elsewhere, these locks aren't thread safe -- they're designed to work in Python and assume the GIL.

Tasks:

- [x] Introduce an `OpenFileTracker` class to track open file descriptors by inode.
- [x] Reference-count open file descriptors and only close them if they're no longer needed (this avoids inadvertently releasing locks that should not be released).
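The reference-counting idea can be sketched as a minimal class (hypothetical names and simplified logic, not Spack's actual implementation):

```python
import os

class OpenFileTracker:
    """Reference-count open lock files, keyed by (pid, inode), so a
    file is only closed when no lock in this process still needs it."""

    def __init__(self):
        # (pid, inode) -> [file object, reference count]
        self._entries = {}

    def get_fh(self, path):
        pid = os.getpid()
        try:
            inode = os.stat(path).st_ino
            entry = self._entries.get((pid, inode))
        except OSError:
            entry = None  # file does not exist yet; create it below

        if entry is not None:
            entry[1] += 1          # reuse the existing descriptor
            return entry[0]

        fh = open(path, "a+")      # creates the lock file if needed
        inode = os.fstat(fh.fileno()).st_ino
        self._entries[(pid, inode)] = [fh, 1]
        return fh

    def release_fh(self, path):
        key = (os.getpid(), os.stat(path).st_ino)
        entry = self._entries[key]
        entry[1] -= 1
        if entry[1] == 0:          # last user: now it is safe to close
            entry[0].close()
            del self._entries[key]
```

Keying on the pid means a forked child never matches the parent's entries, which mirrors property 1 above: since `fcntl` locks don't survive a fork, the child simply starts tracking fresh descriptors.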
Included:

- [from `typing` stuff, not available in Spack 0.16]

Not included:

- Does not apply cleanly, can't really be backported without major changes or pulling in multiple concretizer refactoring PRs:
- Makes macOS tests fail with the error reported below, so I'm hesitant to add any of these `__reduce__` changes now:
  - Add a `__reduce__` method to Spec (#25658) [does not apply cleanly, but can be easily modified to work]
  - Add a `__reduce__` method to Environment (#25678) [needs the new `keep_relative` property to be dropped]
  - Make `SpecBuildInterface` pickleable (#25628) [needs the CI changes to be discarded]
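For context, the `__reduce__` pattern those PRs rely on can be sketched with a toy class (this is not Spack's real `Spec`; the names here are illustrative):

```python
import pickle

class Spec:
    """Toy stand-in for a class that needs custom pickling."""

    def __init__(self, name):
        self.name = name

    def __reduce__(self):
        # Tell pickle to rebuild the object by calling a plain
        # module-level function with these arguments, instead of
        # using the default pickling machinery.
        return (_rebuild_spec, (self.name,))

def _rebuild_spec(name):
    return Spec(name)

copy = pickle.loads(pickle.dumps(Spec("zlib")))
print(copy.name)  # zlib
```

Returning a `(callable, args)` pair from `__reduce__` sidesteps attributes that can't be pickled directly, which is why it was the route taken for `Spec` and `Environment`.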