Numbers in JSON are agnostic with regard to their representation within programming languages. While this allows for numbers of arbitrary precision to be serialized, it may lead to portability issues. For example, since no differentiation is made between integer and floating-point values, some implementations may treat 42, 42.0, and 4.2E+1 as the same number, while others may not. The JSON standard makes no requirements regarding implementation details such as overflow, underflow, loss of precision, rounding, or signed zeros, but it does recommend expecting no more than IEEE 754 binary64 precision for "good interoperability".
Looking at the IEEE_754 article:
The 53-bit significand precision gives from 15 to 17 significant decimal digits of precision (2^−53 ≈ 1.11 × 10^−16). If a decimal string with at most 15 significant digits is converted to the IEEE 754 double-precision format, giving a normal number, and then converted back to a decimal string with the same number of digits, the final result should match the original string. If an IEEE 754 double-precision number is converted to a decimal string with at least 17 significant digits, and then converted back to double-precision representation, the final result must match the original number.[1]
We don't support json_cfg_encode_number_precision or any other runtime options since 8d4a53f.
Problem
vim.json.encode(3053700806959403) returns 3.0537008069594e+15, which is 3053700806959400.
JSON floating-point number encoding is in general a bit messy, but I think at least the default precision should be raised.
It's currently set to 14 in neovim/src/cjson/lua_cjson.c (line 86 at 1ee905a).
Other implementations try 15 significant digits and, if that changes the result, go up to 17; see https://github.com/DaveGamble/cJSON/blob/cb8693b058ba302f4829ec6d03f609ac6f848546/cJSON.c#L572C20-L572C20 and, e.g., the Wikipedia IEEE_754 article quoted above.
This popped up via mfussenegger/nvim-dap#1004; the Dart debug adapter currently uses 53-bit integers.
I'm not sure how we deal with this given that cjson is vendored.
There is an upstream patch in that area, but it left the default: openresty/lua-cjson@f79aa68
Steps to reproduce
and run tests
Expected behavior
Tests pass and vim.json.decode(vim.json.encode(3053700806959403)) doesn't lose precision.
Neovim version (nvim -v)
NVIM v0.10.0-dev-752+g1ee905a63
Vim (not Nvim) behaves the same?
No; Vim has no vim.json.
Operating system/version
Linux
Terminal name/version
alacritty
$TERM environment variable
alacritty
Installation
From source