fetch: work around Anubis AI protection#51496

Merged
haampie merged 1 commit into spack:develop from michaelkuhn:fetch-anubis-ai-protection on Nov 3, 2025
Conversation

@michaelkuhn
Member

More and more sites are deploying AI protection measures such as Anubis. This can cause errors like this when fetching:

```
$ spack fetch -M gobject-introspection
==> Fetching https://download.gnome.org/sources/gobject-introspection/1.78/gobject-introspection-1.78.1.tar.xz
    [100%]    1.06 MB @    6.2 MB/s
==> Fetching https://gitlab.gnome.org/GNOME/gobject-introspection/-/merge_requests/490/commits.patch
    [100%]    3.13 KB @    7.2 MB/s
==> Warning: The contents of $SPACK_STAGE/spack-stage--hdurcup-patch-8085a21385aba2370ba0859f7d0c5f0a6d6a051ab3c0ea0b8881d567d6356299/commits.patch fetched from https://gitlab.gnome.org/GNOME/gobject-introspection/-/merge_requests/490/commits.patch looks like HTML. This can indicate a broken URL, or an internet gateway issue.
==> Error: sha256 checksum failed for $SPACK_STAGE/spack-stage--hdurcup-patch-8085a21385aba2370ba0859f7d0c5f0a6d6a051ab3c0ea0b8881d567d6356299/commits.patch
Expected 8085a21385aba2370ba0859f7d0c5f0a6d6a051ab3c0ea0b8881d567d6356299 but got 32c02388fa2e59f4882951e43c4de715775c10e107bc5a3ddec8d3d70ed9d512. File size = 3132 bytes. Contents = b'<!doctype html><...n></body></html>'. URL = https://gitlab.gnome.org/GNOME/gobject-introspection/-/merge_requests/490/commits.patch
```

After comparing the request headers sent by Python's urllib and curl, it seems that setting `Accept: */*` is enough to make fetching work again.
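The change can be illustrated with a minimal urllib sketch (the URL is illustrative; this is not Spack's actual fetch code):

```python
import urllib.request

# By default, Python's urllib sends no Accept header, while curl sends
# "Accept: */*". Some servers behind Anubis-style bot protection answer the
# header-less request with an HTML challenge page instead of the file.
# Adding the header explicitly works around this.
url = "https://download.gnome.org/sources/gobject-introspection/1.78/gobject-introspection-1.78.1.tar.xz"
request = urllib.request.Request(url, headers={"Accept": "*/*"})

# The header is attached to the request object before any network I/O:
print(request.get_header("Accept"))  # */*
```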

Related: #50124
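The "looks like HTML" warning in the log above comes from a content heuristic. The following is a sketch in the spirit of that check (not Spack's actual implementation) showing how an HTML challenge page can be told apart from the expected artifact:

```python
def looks_like_html(content: bytes) -> bool:
    """Heuristically decide whether fetched bytes are an HTML page rather
    than the expected artifact (tarball, patch, ...). A sketch only; the
    real check in Spack may differ."""
    head = content.lstrip()[:64].lower()
    return head.startswith((b"<!doctype html", b"<html"))

# An Anubis challenge page triggers the heuristic; a patch file does not:
print(looks_like_html(b"<!doctype html><html>...</html>"))  # True
print(looks_like_html(b"diff --git a/meson.build b/meson.build\n"))  # False
```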

Signed-off-by: Michael Kuhn <[email protected]>
@michaelkuhn michaelkuhn force-pushed the fetch-anubis-ai-protection branch from c675f9a to 581874a Compare November 1, 2025 17:44
@haampie haampie added the v1.0.3 PRs to backport for v1.0.3 label Nov 3, 2025
@haampie
Member

haampie commented Nov 3, 2025

Thanks!

@haampie haampie merged commit b892c07 into spack:develop Nov 3, 2025
33 checks passed
hippo91 pushed a commit to hippo91/spack that referenced this pull request Nov 4, 2025
kshea21 pushed a commit to kshea21/spack that referenced this pull request Nov 4, 2025
@becker33 becker33 mentioned this pull request Feb 2, 2026
becker33 pushed a commit that referenced this pull request Feb 2, 2026
becker33 pushed a commit that referenced this pull request Feb 2, 2026
becker33 pushed a commit that referenced this pull request Feb 19, 2026