Describe the issue
I load two models of roughly 800 MB into the browser. When I attempt to run inference with a 650 MB model and the ort.env.wasm.proxy flag set to true, the application crashes with the error: "TypeError: Failed to execute 'postMessage' on 'Worker': ArrayBuffer at index 0 is not detachable and could not be transferred." Do you have any suggestions for preventing this error? Alternatively, if I leave the flag unset, inference runs smoothly but the UI freezes. Is there another way to keep the UI from freezing?
To reproduce
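A minimal sketch of the setup that triggers the error, assuming the model is fetched into an ArrayBuffer and passed to InferenceSession.create; the loading path, input name, and tensor shape below are placeholders, not taken from the actual application:

```typescript
// Minimal sketch, not the exact application code: the model URL, input name,
// and shape are assumptions for illustration only.
import * as ort from 'onnxruntime-web';

// Run the WASM backend inside a proxy worker so inference does not block the UI.
ort.env.wasm.proxy = true;

async function runModel(modelUrl: string) {
  // Fetch the ~650 MB model into an ArrayBuffer.
  const modelBuffer = await (await fetch(modelUrl)).arrayBuffer();

  // With proxy = true, onnxruntime-web posts the model buffer to its worker;
  // this is where the "ArrayBuffer at index 0 is not detachable" TypeError appears.
  const session = await ort.InferenceSession.create(modelBuffer);

  // Placeholder input name and shape; the real model's signature differs.
  const input = new ort.Tensor(
    'float32',
    new Float32Array(1 * 3 * 224 * 224),
    [1, 3, 224, 224],
  );
  return session.run({ input });
}
```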
Urgency
It's somewhat urgent
ONNX Runtime Installation
Built from Source
ONNX Runtime Version or Commit ID
1.16
Execution Provider
'wasm'/'cpu' (WebAssembly CPU)