Processing a large file results in Out of Memory errors #1358

@KClough

Description

Describe the bug

When processing a large GTFS file, the desktop validator doesn't show any errors to the user.

When checking the system_errors.json file, the following error is present:

{
  "code": "thread_execution_error",
  "severity": "ERROR",
  "totalNotices": 1,
  "sampleNotices": [
    {
      "exception": "java.lang.OutOfMemoryError",
      "message": "Java heap space"
    }
  ]
}

Steps/Code to Reproduce

  1. Download the latest release of the desktop validator (v4.0.0 at time of writing)
  2. Install the validator
  3. Download this dataset
  4. Run the desktop validator against the downloaded dataset
  5. When the run completes, observe that the system_errors.json file contains a java.lang.OutOfMemoryError

Expected Results

The validator should complete without error as long as the system has enough memory.

Actual Results

The validator completes, but records a java.lang.OutOfMemoryError in system_errors.json and shows no errors to the user.

Screenshots

No response

Files used

File is too large to upload. Linked here.

Validator version

4.0.0

Operating system

macOS Ventura 13.0, M1 Pro, 16 GB RAM

Java version

openjdk version "11.0.16.1" 2022-08-12

Additional notes

When testing the CLI validator with different JVM heap settings, these were my results (❌ = OutOfMemoryError, ✅ = completed successfully):

-Xmx6g ❌
-Xmx7g ✅
-Xmx8g ✅
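For reference, you can confirm what heap ceiling the JVM actually applied before launching the validator. This is a minimal sketch (the HeapCheck class name is my own, not part of the validator) that prints the effective -Xmx via the standard Runtime API:

```java
public class HeapCheck {
    // Maximum heap the JVM will attempt to use, in MiB (controlled by -Xmx)
    static long maxHeapMiB() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println("Max heap: " + maxHeapMiB() + " MiB");
    }
}
```

Running it with, e.g., `java -Xmx7g HeapCheck` should report roughly 7168 MiB; without an explicit -Xmx, the JVM picks a default based on physical RAM, which is evidently too small for this dataset.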

Labels

bug (Something isn't working: crash, a rule has a problem), status: Ready (An issue that is ready to be worked on), tech debt
