
Flutter Web timetoken big int in javascript - Precision loss in JavaScript when representing large numbers #110

Closed
RicharC293 opened this issue Apr 19, 2023 · 4 comments · Fixed by #111
Labels
status: accepted This issue is accepted as valid, but it's not being worked on yet. type: bug This issue reports a bug.

Comments

@RicharC293

PubNub timetokens are 17-digit timestamps with 100-nanosecond (10^-7 second) resolution.
For example, timetoken = 16819214431624417.

I encountered an issue while working with large numbers in JavaScript. Specifically, when attempting to represent the number 16819214431624417, JavaScript represents it as 16819214431624416, which is a loss of precision.

After researching, I discovered that this happens because 16819214431624417 is larger than the maximum value that can be safely represented by JavaScript's Number type, Number.MAX_SAFE_INTEGER = 2^53 - 1 = 9007199254740991.
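The loss is easy to reproduce in any JavaScript engine:

```javascript
// Integers above Number.MAX_SAFE_INTEGER (2^53 - 1) round to the nearest
// representable double, so two adjacent integer literals can compare equal.
console.log(Number.MAX_SAFE_INTEGER);                 // 9007199254740991
console.log(16819214431624417 === 16819214431624416); // true: both round to the same double
console.log(Number.isSafeInteger(16819214431624417)); // false
```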

While there are libraries available, such as BigNumber.js, that provide an arbitrary-precision number type, JavaScript also has a built-in BigInt type (ES2020) that handles these scenarios, provided the value reaches it as a string rather than as an already-rounded Number.
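A string-based round trip through BigInt preserves every digit, whereas converting an already-rounded Number cannot recover the lost one:

```javascript
// Precision is preserved only if the value never passes through Number.
const fromString = BigInt('16819214431624417');
console.log(fromString.toString()); // '16819214431624417' (all 17 digits intact)

// The literal below rounds to a double before BigInt ever sees it.
const fromNumber = BigInt(16819214431624417);
console.log(fromNumber.toString()); // '16819214431624416'
```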

I believe that this issue is important to address, as it can cause unexpected behavior in JavaScript applications that rely on accurate representations of large numbers.

This issue causes an infinite loop here (only in JavaScript, i.e. Flutter Web):

  Future<void> fetch() async {
    var _cursor = from;
    _messages = [];

    do {
      var result = await defaultFlow(
          keyset: _keyset,
          core: _core,
          params: FetchHistoryParams(_keyset, _channel.name,
              reverse: true,
              count: 100,
              start: _cursor,
              end: to,
              includeToken: true),
          serialize: (object, [_]) => FetchHistoryResult.fromJson(object));

      _cursor = result.endTimetoken;
      print("CURSOR: $_cursor");
      _messages.addAll(await Future.wait(result.messages.map((message) async {
        if (_keyset.cipherKey != null) {
          message['message'] = await _core.parser.decode(_core.crypto
              .decrypt(_keyset.cipherKey!, message['message'] as String));
        }
        print(message);
        return BaseMessage(
          publishedAt: Timetoken(BigInt.from(message['timetoken'])),
          content: message['message'],
          originalMessage: message,
        );
      })));
      // On Flutter Web the endTimetoken arrives as a JavaScript number that
      // may already have lost precision, so this termination check can fail
      // to ever become true.
    } while (_cursor.value != BigInt.from(0));
  }

Package version:
pubnub: ^4.2.1

@are are self-assigned this Apr 20, 2023
@are are added status: accepted This issue is accepted as valid, but it's not being worked on yet. type: bug This issue reports a bug. labels Apr 20, 2023
@are (Contributor) commented Apr 20, 2023

I believe the issue stems from incorrect conversion to an int here.

@RicharC293 (Author)

Note that if you print the number 16819214431624417 in the Chrome console, it prints as 16819214431624416.

@are (Contributor) commented Apr 20, 2023

Precisely! Using BigInts for timetoken was supposed to mitigate that, but the endpoint here needs to return a string instead of a number, so it can be properly serialized without losing precision.

@jjsebastianfuertes

@are This bug is affecting our production environment.
Do you have an ETA for this fix, or can you suggest an alternative to use in the meantime?
