Summary
Right now, BatchingQueueCallInvoker creates a std::vector with an initial capacity of 2048 elements to store the batch of tasks that will eventually be std::move'd to the UI thread for processing.
Rather than allocating a fresh vector and moving it out each time, find a way to reuse the vector across batches.
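One possible approach is a double-buffer swap: the consumer keeps a long-lived "processing" vector and exchanges it with the pending queue under the lock, so both vectors retain their capacity and no new allocation happens per batch. A minimal sketch, assuming a simplified queue; the class and member names here are illustrative, not RNW's actual `BatchingQueueCallInvoker` API:

```cpp
#include <functional>
#include <mutex>
#include <utility>
#include <vector>

// Hypothetical sketch of vector reuse via swap; not the real RNW type.
class BatchingQueue {
public:
  BatchingQueue() {
    m_pending.reserve(2048);    // mirrors the current one-time capacity
    m_processing.reserve(2048); // warm spare buffer for the swap
  }

  void Post(std::function<void()> task) {
    std::lock_guard<std::mutex> lock(m_mutex);
    m_pending.push_back(std::move(task));
  }

  // Runs on the UI thread: swap buffers instead of moving out a fresh vector.
  void ProcessBatch() {
    {
      std::lock_guard<std::mutex> lock(m_mutex);
      std::swap(m_pending, m_processing); // O(1), no allocation
    }
    for (auto& task : m_processing)
      task();
    m_processing.clear(); // clear() keeps capacity for the next swap
  }

  size_t PendingCapacity() {
    std::lock_guard<std::mutex> lock(m_mutex);
    return m_pending.capacity();
  }

private:
  std::mutex m_mutex;
  std::vector<std::function<void()>> m_pending;
  std::vector<std::function<void()>> m_processing;
};
```

After a warm-up batch, both buffers hold their 2048-element capacity, so steady-state processing allocates nothing for the batch container itself (individual `std::function` captures may still allocate).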
Motivation
Some internal partners run automated leak testing in memory-constrained environments, which flags out-of-memory (OOM) errors raised during allocation. Although this vector is not a leak and is relatively small, its constant (re-)allocation means it frequently triggers the OOM check and gets blamed for leaks elsewhere.
Preventing the repeated allocation would let actual leaks be root-caused correctly and reduce the noise attributed to RNW.
Basic Example
No response
Open Questions
Since the tasks need to be moved onto the UI thread, is it even possible to do this?