Buffer.utf8Write() fails to write when buffer length exceeds 2 GB #51817
Labels
buffer
confirmed-bug
good first issue
Version
18.16.0, 21.6.2
Platform
Darwin LX2970YFV4 23.3.0 Darwin Kernel Version 23.3.0: Wed Dec 20 21:30:44 PST 2023; root:xnu-10002.81.5~7/RELEASE_ARM64_T6000 arm64
Subsystem
No response
What steps will reproduce the bug?
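The original snippet did not carry over; the following is a minimal sketch reconstructed from the parameters given under "Additional information" below (a buffer of 2^32 bytes, an offset of 2, and a maxLength near 2^32) — the reporter's exact code may have differed:

```javascript
// Allocate a 4 GiB buffer, then write a small string past offset 0 so that
// the internal length passed down to V8 exceeds INT_MAX.
// allocUnsafe avoids zero-filling, so most pages are never touched.
let written;
try {
  const buf = Buffer.allocUnsafe(2 ** 32);            // 4 GiB; needs a 64-bit Node build
  written = buf.utf8Write('hello', 2, 2 ** 32 - 2);   // should return 5
  console.log('bytes written:', written);             // 0 on affected versions
} catch (err) {
  // Buffers this large are rejected or unallocatable on some platforms.
  console.log('could not allocate:', err.message);
}
```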
How often does it reproduce? Is there a required condition?
All the time.
What is the expected behavior? Why is that the expected behavior?
The utf8Write() function call should succeed and actually write something.
What do you see instead?
See reproduction steps.
Additional information
I built Node.js from source and used lldb and source-code inspection to understand what's happening here.
If I allocate a buffer of size 2^32 and call utf8Write() with an offset of 2 and a maxLength of 2^32, then node::StringBytes::Write() eventually passes 2^32-2 from its buflen argument (a size_t) into the capacity argument (an int) of v8::String::WriteUtf8(). The narrowing cast overflows, so the V8 function receives a capacity of -2. This in turn causes the rest of v8::String::WriteUtf8() to short-circuit and write nothing (release builds) or to crash on a DCHECK_GE (debug builds).
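The truncation described above can be illustrated in plain JavaScript, where the bitwise `| 0` applies the same two's-complement wrap to a 32-bit signed integer as the C++ size_t-to-int conversion:

```javascript
// Simulate the narrowing that happens when node::StringBytes::Write()
// passes its size_t buflen into v8::String::WriteUtf8()'s int capacity.
const buflen = 2 ** 32 - 2;   // buflen (size_t): 4294967294
const capacity = buflen | 0;  // capacity (int) after 32-bit truncation
console.log(capacity);        // -2
```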