Imports from cloud storage with lots of files OOM kill and fail #3059

@sbrackeen

Description

Describe the bug
A large data set was imported from another cloud host into a storage policy backed by S3-compatible storage. When the whole file set is imported at once, Cloudreve repeatedly runs out of memory and eventually fails the job.

I increased the container host's memory to 32 GB and then 64 GB, but it still ran out of RAM and crashed the frontend.

To Reproduce
Steps to reproduce the behavior:
Run an import of a large file structure from a storage policy and wait until Cloudreve crashes.

Expected behavior
Cloudreve should either be able to start the import from a CLI, avoid crashing on large folder structures during import, or provide an external script that can assist with a large import.
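One way such an external helper could keep memory bounded is to stream the S3 listing in fixed-size pages and import each batch before fetching the next, instead of materializing the whole file tree in RAM. A minimal sketch in Python of the batching logic only; `import_batch` is a hypothetical callback standing in for whatever creates the file records, and is not a Cloudreve API:

```python
from itertools import islice
from typing import Callable, Iterable, Iterator, List

def batched(keys: Iterable[str], size: int) -> Iterator[List[str]]:
    """Yield fixed-size batches so only one page of keys is held in memory."""
    it = iter(keys)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

def import_all(keys: Iterable[str],
               import_batch: Callable[[List[str]], None],
               size: int = 1000) -> int:
    """Import keys batch-by-batch; peak memory is O(size), not O(total keys)."""
    total = 0
    for batch in batched(keys, size):
        import_batch(batch)  # e.g. create directory/file records for this page
        total += len(batch)
    return total
```

In practice `keys` could come from a paginated object listing (the error log above already shows pages of `max-keys=1000`), so the listing side is naturally incremental; the fix is to keep the consuming side incremental too.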


Additional context
I will try again adding one folder at a time.


{
    "id": 6698,
    "created_at": "2025-10-30T16:39:06.749663Z",
    "updated_at": "2025-10-31T07:03:10.192635Z",
    "type": "import",
    "status": "error",
    "public_state": {
        "error": "panic error: Failed to create Redis connection: dial tcp: lookup redis: i/o timeout",
        "error_history": [
            "failed to list physical files: RequestError: send request failed\ncaused by: Get \"https://s3.us-west-002.backblazeb2.com/redacted?max-keys=1000&prefix=redacted%2F\": dial tcp: lookup s3.us-west-002.backblazeb2.com: i/o timeout",
            "failed to list physical files: RequestError: send request failed\ncaused by: Get \"https://s3.us-west-002.backblazeb2.com/redacted?max-keys=1000&prefix=redacted%2F\": net/http: TLS handshake timeout"
        ],
        "executed_duration": 3869645321194,
        "retry_count": 2,
        "resume_time": 1761893266
    },
    "private_state": "{\"policy_id\":2,\"src\":\"/redacted\",\"is_recursive\":true,\"dst\":\"cloudreve://X5tj@my/%2FDropboxImport%2F\",\"phase\":\"\",\"extract_media_meta\":false}",
    "correlation_id": "946d4ea2-8525-43f6-bc25-1874bbe6d7f7",
    "user_tasks": 6,
    "edges": {
        "user": {
            "id": 6,
            "created_at": "2025-08-27T20:36:18.93682Z",
            "updated_at": "2025-11-04T00:49:39.93187Z",
            "email": "redacted",
            "nick": "redacted",
            "status": "active",
            "storage": 9773829784,
            "settings": {
                "version_retention": true,
                "version_retention_max": 10
            },
            "group_users": 4,
            "edges": {}
        }
    },
    "user_hash_id": "X5tj",
    "task_hash_id": "BGWksW",
    "summary": {
        "props": {
            "dst": "cloudreve://X5tj@my/%2FDropboxImport%2F",
            "dst_policy_id": "XeIJ",
            "failed": 0,
            "src_str": "/redacted"
        }
    }
}

Metadata

Labels: backlog, bug (Something isn't working)