
Decimal Accuracy #31

Closed
kolektiv opened this issue Apr 21, 2015 · 10 comments

Comments

@kolektiv
Member

Currently, decimals may lose data when round-tripping values that need more precision than the intermediate number representation can carry (decimal itself is a 128-bit type). The parser should potentially be changed to accommodate this, or potentially to warn/throw.

@haf
Contributor

haf commented Apr 23, 2015

Example failing test that I got now that you removed the string-based serialisation:

Expected: {amount = 7922816247737080000M;
 currency = {name = "Lempira";
             code = HNL;
             symbol = "L";
             minorUnit = Some 2.0;};}
Actual: {amount = 7922816247737084945M;
 currency = {name = "Lempira";
             code = HNL;
             symbol = "L";
             minorUnit = Some 2.0;};}
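
The gap between expected and actual is characteristic of pushing a 19-digit decimal through a 64-bit IEEE 754 double, which preserves at most 17 significant decimal digits. A minimal sketch of the same loss (illustrative Python; the library itself is F#, so this is only an analogue):

```python
from decimal import Decimal

# 19 significant digits - more than a 64-bit double can carry (~15-17).
amount = Decimal("7922816247737084945")

# Route the value through a double, as a float-based serialiser would.
roundtripped = Decimal(float(amount))

# The low-order digits do not survive the round trip.
print(roundtripped == amount)  # False
```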

@kolektiv
Member Author

Aha! Cool, good to have something concrete :) I'll take a look at this - if you've got a good fix idea I'm happy to try that too!


@haf
Contributor

haf commented Apr 23, 2015

I think a sane way would be to default Json.Number to decimal - better safe than sorry - so that way we have the basics down. Then provide a FromJson that also accepts strings, like I had in my PR. Then provide a ToJson which checks whether all the bits can be accurately represented: if they can, serialise to Number; otherwise, to string. Doing that accuracy check well is the hard part, because you'd have to know how the bits are split between sign, exponent and mantissa - 64 bits if using floats, 128 bits if using decimal.

I also think this library could be made better with a BigInt overload, with the same semantics as above.
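
The number-or-string scheme described above could be sketched as follows - an illustrative Python analogue (the helper name is hypothetical; the real library is F#), using a double round trip as the accuracy check instead of inspecting bit layouts directly:

```python
from decimal import Decimal
from typing import Union

def to_json_number_or_string(value: Decimal) -> Union[float, str]:
    """Hypothetical ToJson helper: emit a JSON number only when the
    value survives a round trip through a 64-bit double (what many
    JSON consumers will decode it as); otherwise fall back to a
    lossless string representation."""
    if Decimal(float(value)) == value:
        return float(value)   # exactly representable: plain number
    return str(value)         # too precise: serialise as a string

# Small values stay numbers; over-precise ones become strings.
print(to_json_number_or_string(Decimal("35")))                   # 35.0
print(to_json_number_or_string(Decimal("7922816247737084945")))  # 7922816247737084945
```

Checking via a round trip sidesteps the sign/exponent/mantissa bookkeeping at the cost of one extra conversion.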

@haf
Contributor

haf commented Apr 23, 2015

Here's another value that is invalid for my domain: 18446744069414584320M round-trips to 1.84467440694146E+19.
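
An assumption worth checking about where that particular loss happens: 18446744069414584320 is 2^64 − 2^32 and is exactly representable as a double, so the damage plausibly comes from formatting the double with only 15 significant digits (historically .NET's default for Double.ToString) rather than from the binary conversion itself. A quick Python check of the formatting effect:

```python
value = 18446744069414584320  # 2**64 - 2**32: exactly representable as a double

# The binary conversion itself is lossless for this particular value...
print(int(float(value)) == value)  # True

# ...but printing only 15 significant digits reproduces the bad output.
print("%.14E" % float(value))  # 1.84467440694146E+19
```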

@haf
Contributor

haf commented Apr 27, 2015

Hi Andrew,

How was your weekend? Have you given this any more thought?

@kolektiv
Member Author

Hey, hectic, but home now! I've opened PR #35, which switches the underlying number type to decimal - I agree that's a sane default here.

The number/string switch I'm less sure of, to be honest. Although it would work, it feels somehow as if it breaks some promises about how serialization should behave - especially if I'm planning on doing things like supporting JSON Schema (I hope so at some point). If { Num = 35 } serializes to { "num": 35 } but { Num = 35xxxxxxxxx... } serializes to { "num": "35xxxx..." }, that breaks some expectations.

It seems like it's only really an issue on the "from JSON" side (I think?). I'd almost rather a warning or failure happened when a genuinely too-big JSON number came along. The string fix doesn't change the fact that JSON created elsewhere could contain numbers like that, so it doesn't feel like a total fix (in either the mathematical or the loose sense) anyway...

I don't know though - would changing to decimal fix the actual problems you have now? If it doesn't, is there a better way of dealing with the edge cases? Perhaps that's where some thinking around further combinators might be useful...

@haf
Contributor

haf commented Apr 28, 2015

I think decimal solves my immediate problems.

From JSON, yes; you might reach a more complete fix by supporting BigInt. Shall we file an issue about that?

Yes, FromJson is the most obvious culprit -- if you have a BigInt, it's already represented as accurately as can be.

Can you make the parser parse 'something else' if a really large number comes along? Like a state/count threaded through the number parser?

@kolektiv
Member Author

Cool, I've just merged that PR so a new package should be available shortly! I don't see any reason not to support BigInt as an option, I'll implement that.

In terms of parsing "alternatives", the number parser itself must be a parser for a single type - that type could be a union, but a union based on size seems like complexity that would quickly get horrible! Worth thinking about though, along with whether combinators plus returning a parsed Choice2 or similar might be a possible approach...
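
One shape the Choice2-style result could take - an illustrative Python sketch (names hypothetical; in the F# library this would presumably be a Choice<_,_> or a dedicated union case):

```python
from decimal import Decimal
from typing import Union

def parse_number(literal: str) -> Union[float, Decimal]:
    """Size-aware number parse: yield a machine float when the literal
    survives a double round trip, otherwise keep the full-precision
    Decimal (standing in for the BigInt/decimal alternative)."""
    exact = Decimal(literal)
    if Decimal(float(exact)) == exact:
        return float(exact)
    return exact

# Ordinary numbers come back as floats; over-precise ones keep all digits.
print(type(parse_number("35")).__name__)                    # float
print(type(parse_number("18446744069414584321")).__name__)  # Decimal
```

Consumers then pattern-match on the case, which keeps the size-based decision in one place instead of threading state through the parser.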

@haf
Contributor

haf commented May 1, 2015

I've tested with the latest release and nicely enough, tests are passing :)

@kolektiv
Member Author

kolektiv commented May 1, 2015

Cool! I'll close this for now and make sure to remember the relevant bits when looking at combinators...

@kolektiv kolektiv closed this as completed May 1, 2015