Packaging a large archive file (e.g. the 18 GiB COCO machine-learning dataset) results in error: Nix daemon out of memory #3684

@CMCDragonkai

Description

Describe the bug

I wanted to use Nix to package machine learning data sets.

One example is the COCO dataset (http://cocodataset.org/#download). The train2017.zip archive is 18 GiB.

When I use nix-prefetch-url, it downloads the entire file, but then fails during hash calculation with:

error: Nix daemon out of memory

Expected behavior

I'd expect that Nix can handle arbitrarily large files. Surely there shouldn't be a memory limit on the kind of packages we can work with.
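For illustration, here is a minimal sketch (not Nix's actual implementation, which is C++) of the difference between hashing a file in streaming fashion, where memory use stays constant no matter how large the file is, and the naive approach of reading the whole file into memory first, which is the kind of allocation that would blow up on an 18 GiB archive:

```python
import hashlib
import os
import tempfile

def sha256_streaming(path, chunk_size=1 << 20):
    """Hash a file in fixed-size chunks; peak memory is one chunk (1 MiB here),
    regardless of total file size."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def sha256_whole_file(path):
    """Naive approach: loads the entire file into memory first.
    For an 18 GiB archive, this single allocation is what runs out of memory."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Demo on a small temporary file: both approaches produce the same digest.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(4 * 1024 * 1024))  # 4 MiB of random data
    path = tmp.name
try:
    assert sha256_streaming(path) == sha256_whole_file(path)
finally:
    os.unlink(path)
```

Since the digest is identical either way, one would expect the daemon to be able to use the streaming strategy for fixed-output downloads like this.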

Nix version is:

nix (Nix) 2.3.2
