Describe the bug
I wanted to use Nix to package machine learning data sets.
One example is the COCO dataset (http://cocodataset.org/#download). The train2017.zip archive is 18 GiB.
When I use nix-prefetch-url, the download of the entire file succeeds, but the subsequent hash calculation fails with:
error: Nix daemon out of memory
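Steps to reproduce, as a rough sketch (the exact archive URL is my assumption, taken from the COCO download page; any similarly large file should trigger the same failure):

```
# Download and hash an ~18 GiB archive; the download completes,
# then hashing fails with "error: Nix daemon out of memory"
nix-prefetch-url http://images.cocodataset.org/zips/train2017.zip
```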
Expected behavior
I'd expect Nix to handle arbitrarily large files; available memory shouldn't limit the size of the packages we can work with.
Nix version is: