Describe the bug
We had been using the v8 series with great success and no major problems. However, due to the announced NVD Data API changes, we tried upgrading to the latest v9 maven plugin with its new NVD API download mechanism.
We now consistently see an OutOfMemoryError during the NVD API download. With the v8.x maven plugin we ran our build slaves with 2 GB of memory; even after doubling that to 4 GB we still hit OOM.
With 2 GB we got through roughly 10-20% of the data before the OOM; with 4 GB we managed close to double that. The new approach appears to either leak or accumulate the downloaded data in memory, and its memory handling is clearly different from the earlier v8.x series.
Version of dependency-check used
v9.0.2 maven plugin
Log file
18:36:44 [INFO] Checking for updates
18:36:48 [INFO] NVD API has 231,970 records in this update
18:36:55 [INFO] Downloaded 10,000/231,970 (4%)
18:36:59 [INFO] Downloaded 20,000/231,970 (9%)
18:37:02 [INFO] Downloaded 30,000/231,970 (13%)
18:37:09 [INFO] Downloaded 40,000/231,970 (17%)
18:37:15 [INFO] Downloaded 50,000/231,970 (22%)
18:37:22 [INFO] Downloaded 60,000/231,970 (26%)
18:37:27 [INFO] Downloaded 70,000/231,970 (30%)
18:37:37 [INFO] Downloaded 80,000/231,970 (34%)
18:37:59 [INFO] Downloaded 90,000/231,970 (39%)
18:38:04 Exception in thread "httpclient-dispatch-2" java.lang.OutOfMemoryError: Java heap space
18:38:04 at org.apache.hc.core5.util.ByteArrayBuffer.expand(ByteArrayBuffer.java:58)
18:38:04 at org.apache.hc.core5.util.ByteArrayBuffer.append(ByteArrayBuffer.java:88)
18:38:04 at org.apache.hc.client5.http.async.methods.SimpleAsyncEntityConsumer.data(SimpleAsyncEntityConsumer.java:62)
18:38:04 at org.apache.hc.core5.http.nio.entity.AbstractBinDataConsumer.consume(AbstractBinDataConsumer.java:75)
18:38:04 at org.apache.hc.core5.http.nio.support.AbstractAsyncResponseConsumer.consume(AbstractAsyncResponseConsumer.java:134)
18:38:04 at org.apache.hc.client5.http.impl.async.HttpAsyncMainClientExec$1.consume(HttpAsyncMainClientExec.java:243)
18:38:04 at org.apache.hc.core5.http.impl.nio.ClientHttp1StreamHandler.consumeData(ClientHttp1StreamHandler.java:255)
18:38:04 at org.apache.hc.core5.http.impl.nio.ClientHttp1StreamDuplexer.consumeData(ClientHttp1StreamDuplexer.java:354)
18:38:04 at org.apache.hc.core5.http.impl.nio.AbstractHttp1StreamDuplexer.onInput(AbstractHttp1StreamDuplexer.java:325)
18:38:04 at org.apache.hc.core5.http.impl.nio.AbstractHttp1IOEventHandler.inputReady(AbstractHttp1IOEventHandler.java:64)
18:38:04 at org.apache.hc.core5.http.impl.nio.ClientHttp1IOEventHandler.inputReady(ClientHttp1IOEventHandler.java:41)
18:38:04 at org.apache.hc.core5.reactor.ssl.SSLIOSession.decryptData(SSLIOSession.java:600)
18:38:04 at org.apache.hc.core5.reactor.ssl.SSLIOSession.access$200(SSLIOSession.java:74)
18:38:04 at org.apache.hc.core5.reactor.ssl.SSLIOSession$1.inputReady(SSLIOSession.java:202)
18:38:04 at org.apache.hc.core5.reactor.InternalDataChannel.onIOEvent(InternalDataChannel.java:142)
18:38:04 at org.apache.hc.core5.reactor.InternalChannel.handleIOEvent(InternalChannel.java:51)
18:38:04 at org.apache.hc.core5.reactor.SingleCoreIOReactor.processEvents(SingleCoreIOReactor.java:178)
18:38:04 at org.apache.hc.core5.reactor.SingleCoreIOReactor.doExecute(SingleCoreIOReactor.java:127)
18:38:04 at org.apache.hc.core5.reactor.AbstractSingleCoreIOReactor.execute(AbstractSingleCoreIOReactor.java:86)
18:38:04 at org.apache.hc.core5.reactor.IOReactorWorker.run(IOReactorWorker.java:44)
18:38:04 at java.lang.Thread.run(Thread.java:750)
18:38:26 Exception in thread "httpclient-dispatch-2" java.lang.OutOfMemoryError: Java heap space
18:38:26 at org.apache.hc.core5.util.ByteArrayBuffer.expand(ByteArrayBuffer.java:58)
18:38:26 at org.apache.hc.core5.util.ByteArrayBuffer.append(ByteArrayBuffer.java:88)
18:38:26 at org.apache.hc.client5.http.async.methods.SimpleAsyncEntityConsumer.data(SimpleAsyncEntityConsumer.java:62)
18:38:26 at org.apache.hc.core5.http.nio.entity.AbstractBinDataConsumer.consume(AbstractBinDataConsumer.java:75)
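For what it's worth, the trace points at `SimpleAsyncEntityConsumer`, which appends each incoming data chunk to a growing in-heap `ByteArrayBuffer`, so heap use scales with the size of whatever is being accumulated. The following is not the plugin's or HttpClient's actual code, just a minimal stdlib-only sketch of the difference between that fully-buffered pattern and a streaming one with bounded heap use:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Illustrative sketch only: full in-heap buffering vs. streaming through a
// fixed-size chunk buffer. Names and sizes are made up for the example.
public class BufferingVsStreaming {

    // Buffered: the entire body ends up on the heap, so memory grows with body size.
    static byte[] readFully(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n); // every chunk is retained in memory
        }
        return buf.toByteArray();
    }

    // Streamed: only one 8 KiB chunk is held at a time, regardless of body size.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] chunk = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(chunk)) != -1) {
            out.write(chunk, 0, n); // chunk is written out and reused
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] body = new byte[1 << 20]; // stand-in for a 1 MiB response body
        byte[] all = readFully(new ByteArrayInputStream(body));
        System.out.println("buffered: " + all.length);
        long copied = copy(new ByteArrayInputStream(body), OutputStream.nullOutputStream());
        System.out.println("streamed: " + copied);
    }
}
```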
To Reproduce
We suggest comparing a v8.x invocation of the plugin with a 2 GB maximum heap against the new v9.x series with the same heap. That said, we have not seen this on every one of our projects, so some dependency unique to the affected project may be triggering it. If needed, we will see whether we can share that project's open source dependencies.
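For reference, this is roughly how we invoke the plugin on the affected build slaves. The heap cap via `MAVEN_OPTS` and the plugin `check` goal are as we use them; the `NVD_API_KEY` variable name is our own convention:

```shell
# Cap the Maven JVM heap as on our build slaves (2 GB; we also tried -Xmx4g).
export MAVEN_OPTS="-Xmx2g"

# Run the v9 plugin directly; the NVD API download happens during the update step.
mvn org.owasp:dependency-check-maven:9.0.2:check -DnvdApiKey="$NVD_API_KEY"
```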
Expected behavior
Same memory characteristics as v8.x maven plugin.
Additional context
N/A