fetch: work around Anubis AI protection#51496
Merged
haampie merged 1 commit into spack:develop on Nov 3, 2025
Conversation
More and more sites are deploying AI protection measures such as Anubis. This can cause errors like this when fetching:

```
$ spack fetch -M gobject-introspection
==> Fetching https://download.gnome.org/sources/gobject-introspection/1.78/gobject-introspection-1.78.1.tar.xz
[100%]    1.06 MB @ 6.2 MB/s
==> Fetching https://gitlab.gnome.org/GNOME/gobject-introspection/-/merge_requests/490/commits.patch
[100%]    3.13 KB @ 7.2 MB/s
==> Warning: The contents of $SPACK_STAGE/spack-stage--hdurcup-patch-8085a21385aba2370ba0859f7d0c5f0a6d6a051ab3c0ea0b8881d567d6356299/commits.patch
    fetched from https://gitlab.gnome.org/GNOME/gobject-introspection/-/merge_requests/490/commits.patch
    looks like HTML. This can indicate a broken URL, or an internet gateway issue.
==> Error: sha256 checksum failed for $SPACK_STAGE/spack-stage--hdurcup-patch-8085a21385aba2370ba0859f7d0c5f0a6d6a051ab3c0ea0b8881d567d6356299/commits.patch
    Expected 8085a21385aba2370ba0859f7d0c5f0a6d6a051ab3c0ea0b8881d567d6356299 but got 32c02388fa2e59f4882951e43c4de715775c10e107bc5a3ddec8d3d70ed9d512.
    File size = 3132 bytes.
    Contents = b'<!doctype html><...n></body></html>'.
    URL = https://gitlab.gnome.org/GNOME/gobject-introspection/-/merge_requests/490/commits.patch
```

After comparing the request headers sent by Python's urllib and curl, it seems that setting `Accept: */*` is enough to make fetching work again.

Related: spack#50124

Signed-off-by: Michael Kuhn <[email protected]>
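The gist of the fix can be sketched as follows. This is a minimal illustration, not Spack's exact code: `build_request` is a hypothetical helper name. Python's urllib sends no `Accept` header by default, while curl sends `Accept: */*`; gateways like Anubis may answer header-less clients with an HTML challenge page instead of the requested file, which then fails the checksum verification.

```python
import urllib.request


def build_request(url: str) -> urllib.request.Request:
    """Build a urllib request that AI-protection gateways accept.

    Hypothetical helper (not Spack's actual implementation): setting
    "Accept: */*" explicitly mirrors curl's default behavior and makes
    gateways such as Anubis serve the real file rather than an HTML
    challenge page.
    """
    return urllib.request.Request(url, headers={"Accept": "*/*"})


# The request can then be opened as usual:
# with urllib.request.urlopen(build_request(url)) as response:
#     data = response.read()
```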
c675f9a to 581874a
haampie (Member) approved these changes on Nov 3, 2025

Thanks!
hippo91 pushed a commit to hippo91/spack that referenced this pull request on Nov 4, 2025
kshea21 pushed a commit to kshea21/spack that referenced this pull request on Nov 4, 2025
becker33 pushed a commit that referenced this pull request on Feb 2, 2026
becker33 pushed a commit that referenced this pull request on Feb 2, 2026
becker33 pushed a commit that referenced this pull request on Feb 19, 2026