Processing a large file results in Out of Memory errors #1358
Open
Labels: bug (something isn't working — a crash, or a rule has a problem), status: Ready (ready to be worked on), tech debt
Description
Describe the bug
When processing a large GTFS file, the desktop validator doesn't show any errors to the user.
When checking the system_errors.json file, the following error is present:
{
  "code": "thread_execution_error",
  "severity": "ERROR",
  "totalNotices": 1,
  "sampleNotices": [
    {
      "exception": "java.lang.OutOfMemoryError",
      "message": "Java heap space"
    }
  ]
}
Steps/Code to Reproduce
- Download the latest release of the desktop validator (v4.0.0 at time of writing)
- Install the validator
- Download this dataset
- Run the desktop validator against the downloaded dataset
- When complete, observe that the system_errors.json file contains a java.lang.OutOfMemoryError.
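The thread_execution_error notice above suggests the worker thread's OutOfMemoryError is caught and serialized into system_errors.json rather than surfaced in the UI. A minimal sketch of that capture pattern (hypothetical class and method names, not the validator's actual code):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: run a task on a worker thread and, if it dies with any Throwable
// (including an Error such as OutOfMemoryError), record it as a sample
// notice instead of losing it silently.
public class ThreadErrorCapture {
    // Mirrors the shape of the "sampleNotices" entries in system_errors.json.
    public record SampleNotice(String exception, String message) {}

    public static List<SampleNotice> runAndCapture(Runnable task) throws InterruptedException {
        List<SampleNotice> notices = new ArrayList<>();
        Thread worker = new Thread(task);
        // The uncaught-exception handler receives Errors as well as Exceptions.
        worker.setUncaughtExceptionHandler((thread, throwable) ->
            notices.add(new SampleNotice(throwable.getClass().getName(), throwable.getMessage())));
        worker.start();
        worker.join();
        return notices;
    }

    public static void main(String[] args) throws InterruptedException {
        // Simulate the heap exhaustion seen in the report.
        List<SampleNotice> notices =
            runAndCapture(() -> { throw new OutOfMemoryError("Java heap space"); });
        System.out.println(notices);
    }
}
```

This is why the desktop UI can show nothing while the JSON report still names the failure: the error never propagates past the worker thread.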
Expected Results
The validator should complete without errors, provided the system has enough memory.
Actual Results
The validator completes with an error.
Screenshots
No response
Files used
File is too large to upload. Linked here.
Validator version
4.0.0
Operating system
macOS Ventura 13.0, M1 Pro, 16 GB RAM
Java version
openjdk version "11.0.16.1" 2022-08-12
Additional notes
When testing the CLI validator with different JVM heap settings, here were my results:
-Xmx6g ❌ (OutOfMemoryError)
-Xmx7g ✅ (completes)
-Xmx8g ✅ (completes)
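Given that this feed needed roughly 7 GiB of heap, one mitigation would be for the validator to check its configured maximum heap at startup and warn the user before attempting a large feed. A minimal sketch (the 7 GiB threshold is taken from the -Xmx experiments above; the class name is hypothetical):

```java
// Sketch: read the JVM's configured max heap and warn when it is below the
// amount this feed was observed to need.
public class HeapCheck {
    // Threshold taken from the -Xmx results above: 6g failed, 7g succeeded.
    static final double MIN_GIB = 7.0;

    // Runtime.maxMemory() returns the -Xmx limit in bytes.
    static double maxHeapGib() {
        return Runtime.getRuntime().maxMemory() / (1024.0 * 1024.0 * 1024.0);
    }

    public static void main(String[] args) {
        double gib = maxHeapGib();
        System.out.printf("Max heap: %.1f GiB%n", gib);
        if (gib < MIN_GIB) {
            System.out.println(
                "Warning: very large feeds may fail with OutOfMemoryError; "
                + "consider rerunning with -Xmx7g or higher.");
        }
    }
}
```

Note that -Xmx defaults vary by JVM and machine (commonly a quarter of physical RAM), so on the 16 GB machine above the default heap would be around 4 GiB, below the level this feed needed.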