/**
 * Collects a batch of items that each have a size. When the cumulative size of the batch reaches 'maxBatchSize', it calls the callback.
 * If the batch isn't filled within 'timeout' ms, the callback is also called.
 * And after 'batchOnlyAfter' items, the timeout is ignored, and the callback is called only when the batch is full.
batchOnlyAfter seems misleading, we batch before too.
Yeah I mean it as "after N, only batch" but it also looks like "only batch after N, not before". Maybe stopTimeoutAfter or timeoutItemsLimit or something...?
runTimeoutUntilCount maybe
Unless you want to follow the suggestion further up to have a short timeout first and after a while switch to a long timeout. :)
}

flush(): void {
    this.totalNumberCompleted += this.batchSize;
Check if this.batchSize > 0?
private doSearchWithBatchTimeout(engine: ISearchEngine<ISerializedFileMatch>, batchSize: number): PPromise<ISerializedSearchComplete, IRawProgressItem<ISerializedFileMatch>> {
    return new PPromise<ISerializedSearchComplete, IRawProgressItem<ISerializedFileMatch>>((c, e, p) => {
        // Use BatchedCollector to get new results to the frontend every 2s at least, until 50 results have been returned
        const collector = new BatchedCollector(batchSize, /*timeout=*/2000, /*batchOnlyAfter=*/50, p);
I think 2s is a long time. What about using 500ms initially and after 5s start using a timeout of 2s? That would ensure that the first results are shown quickly and that progress is visible even for very long running searches.
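The two-phase schedule being suggested could be expressed as a small helper, something like the sketch below (names and the switch point are assumptions taken from this comment, not code from the PR):

```typescript
// Hypothetical two-phase timeout schedule: a short 500ms timeout for the
// first 5s of a search so early results appear quickly, then the longer
// 2s timeout so very long searches still show steady progress.
function nextTimeout(elapsedMs: number): number {
    const SHORT_TIMEOUT = 500;  // quick first results
    const LONG_TIMEOUT = 2000;  // steady progress afterwards
    const SWITCH_AFTER = 5000;  // per the suggestion: switch after 5s
    return elapsedMs < SWITCH_AFTER ? SHORT_TIMEOUT : LONG_TIMEOUT;
}
```

The collector would call this each time it re-arms its timer, passing the time elapsed since the search started.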
That's a good idea. I picked 2s partly because it's hard to interact with the results while they are changing, since they come in slightly random order. It would be good to get at least some results in faster than 2s though.
If the search process batch isn't filled within 2 seconds, it's sent to the frontend anyway. After 50 results have been returned, it reverts to the original batching behavior. Also fixed batching to count the actual number of results, instead of the number of files with matches, which is what it ended up doing before.
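The "actual number of results" counting could work roughly like this. A sketch under assumptions: `ISerializedFileMatchLike` and `matchSize` are illustrative names, and the real `ISerializedFileMatch` shape may differ:

```typescript
// Hypothetical: one serialized file match carries all the matches found in
// a single file, so the batch should grow by the number of individual line
// matches, not by 1 per file.
interface ISerializedFileMatchLike {
    path: string;
    lineMatches?: { preview: string }[];
}

function matchSize(match: ISerializedFileMatchLike): number {
    // size = number of individual results in this file (at least 1)
    return match.lineMatches ? match.lineMatches.length : 1;
}
```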
Fixes #16284