Do NOT return BigDecimal as string in JSON - breaks DynamoDB support #25017
Steps to reproduce
First, thanks for all of your hard work Rails!
Amazon AWS DynamoDB stores ALL numbers as BigDecimal. This is beyond my control and not something I can change. I have a client app sending in arbitrary JSON data like the following:
Some of the field values are strings, some are floats and some are integers. I am not doing anything with giant numbers or arbitrary precision floats. I'm just sending in regular numbers.
This now means that the client sends in actual JSON numbers like
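For illustration only (the real payload was omitted above, so these field names are hypothetical), the request body contains ordinary JSON numbers, which Ruby's stdlib parses into plain `Integer` and `Float` values:

```ruby
require 'json'

# Hypothetical client payload: plain JSON values, no BigDecimal anywhere.
body = '{"name": "mittens", "gif_count": 42, "avg_rating": 4.5}'

parsed = JSON.parse(body)
parsed["gif_count"]  # an Integer
parsed["avg_rating"] # a Float
```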
I don't have control over the client and it is sending arbitrary JSON, so I can't get the client to force these numbers returned as strings back into being numbers clientside. And because I have to return arbitrary JSON, I cannot just use Rabl or some other serialization templating tool to force the BigDecimal back to regular numbers serverside. So this is making it difficult for me to back Rails with DynamoDB as my persistent data store and also send and receive data to my client via a JSON API.
The client is _NOT_ sending in BigDecimal and knows nothing about BigDecimal; the client is sending regular numbers and expects regular numbers back, so the comment "the other end knows by contract that the data is supposed to be a BigDecimal" (see) is totally inapplicable in this case.
I suppose I could write some code that descends through the returned dictionary, before converting it to JSON, and convert all BigDecimal to Ruby numbers (e.g.
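A minimal sketch of that workaround (the helper name is mine, not from Rails or the AWS SDK): recursively walk the decoded document and down-cast each BigDecimal before encoding.

```ruby
require 'bigdecimal'

# Recursively down-cast every BigDecimal in a nested Hash/Array document:
# exact integer values become Integer, everything else becomes Float.
def normalize_numbers(value)
  case value
  when BigDecimal
    value.frac.zero? ? value.to_i : value.to_f
  when Hash
    value.transform_values { |v| normalize_numbers(v) }
  when Array
    value.map { |v| normalize_numbers(v) }
  else
    value
  end
end
```

Called on the item returned by the DynamoDB driver, this yields plain Ruby numbers that serialize as JSON numbers, at the cost of dropping any extra precision the BigDecimal carried.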
Please bring back encode_big_decimal_as_string, otherwise you are making it difficult for people to back a Rails app with DynamoDB. And this is thousands of people, not just me.
If you can provide some guidance, I'd be happy to submit a PR to restore encode_big_decimal_as_string.
(Guidelines for creating a bug report are available.)
Expected behavior
I should have a choice as to whether a BigDecimal is returned as a string or a regular old number in JSON.
Actual behavior
All BigDecimal values are returned as strings no matter what. This totally breaks DynamoDB support because DynamoDB stores all numbers as BigDecimal.
@rafaelfranca thanks for responding. :-)
Look, I don't mean to sound like a whiner and I'm not arguing for the sake of arguing. But I feel disappointed by this response, to be honest. The decision to remove encode_big_decimal_as_string makes it needlessly hard to back a Rails app with DynamoDB.
I'd like to respectfully ask that we re-open this issue. I don't feel that it has been solved or that it should be marked as closed.
By "the gem recommended in the deprecation message" I assume you mean activesupport-json_encoder? I see that that gem has not had any commits since Oct 22, 2015 which makes me nervous. Is that gem even maintained any more? Can I count on that gem getting security updates?
Also, what is the performance hit when resorting back to activesupport-json_encoder?
You're also asking me to change the JSON encoding gem for an active production app used by hundreds of thousands of active users - I don't want to do that lightly. Of course I have full automated test suites, but still, I'm reluctant to change json encoding just to get around this.
And I _still_ argue that this is a violation of the principle of least surprise: if a user PUTs a JSON number, they should GET a JSON number back, not a string.
Is there another way to work around this other than using activesupport-json_encoder?
Thanks again for all of your help and enjoy your weekend. I really appreciate your hard work on Rails.
The gem was extracted from the Rails framework. It is still maintained and will be maintained like any other gem that was extracted from Rails. We don't have plans to integrate it again. You can find out more about this decision in #12183. I was not involved in it, so I'll defer to @chancancode on the decision to reopen this issue or not.
@kamen-hursev since the workaround is simple, adding a layer of configuration for what is a very application-specific requirement seems to offer little benefit to the wider Rails userbase. There are all sorts of scenarios where the basic JSON datatypes don't cover what's needed - dates and times being the obvious one. What if there was an API that needed to encode a time as a UNIX timestamp - should we add that as an option too?
The main point of these configuration options is to provide compatibility during a transition - a recent example is the change of
@joelpresence sorry for not getting back to you earlier, here is an explanation with more details.
I personally find that number very difficult to imagine, and I don't think that kind of exaggeration is helping your case here. For the future, let's stick to facts.
However, even if we do go with that, I still believe this to be a better default. I'll try to explain.
Presumably, the number is represented in BigDecimal for a reason.
In this case, that decision was made for you by the database driver. Presumably, that is for a good and important reason (otherwise, you should just file that as a bug against your DB driver, since paying the performance penalty of BigDecimal for no good reason would itself be a bug).
If your DB offers that kind of feature, then presumably at least some of those "thousands of apps" would try to use that feature and depend on those guarantees. While the JSON spec does not put a limit on the precision of the number data type, in practice all clients would parse that into a float/double data type, so suddenly the extra precision your database guaranteed is silently lost in transit.
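The precision loss is easy to demonstrate with Ruby's stdlib alone; a value just past the 64-bit range survives as a string but not as a double:

```ruby
require 'bigdecimal'

big = BigDecimal("18446744073709551617") # 2**64 + 1, exact in BigDecimal

# Encoding as a string keeps every digit...
as_string = big.to_s("F") # => "18446744073709551617.0"

# ...but a client parsing the raw JSON number into a double rounds it
# to the nearest representable value, 2**64, and the trailing 1 is gone.
as_double = big.to_f
as_double.to_i == 18_446_744_073_709_551_617 # false
```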
Perhaps it is not a big deal in isolation, but if you process thousands or tens of thousands of transactions per day, you could end up losing a lot of money. I don't get paid doing this, and I definitely couldn't afford people coming after me for their losses. (Dear lawyers: this statement does not imply warranty – see our LICENSE for details.)
Hopefully you can now see why this is the better default for Rails.
However, since you are asking this, I assume you are not sending Bitcoins across the wire. Maybe you are just transmitting a count of the number of kitten gifs you have in your collection, or perhaps a timestamp. I understand that this protection mechanism could be annoying there.
No problem. If you assert that none of these matters at all, anywhere in your application, and you are 100% sure that none of your gems, engines and so on depend on it, you can simply drop the precision yourself, by down-casting them into a Float:
```ruby
require 'active_support'
require 'active_support/core_ext/object/json'

class BigDecimal
  def as_json(*)
    to_f
  end
end
```
In fact, if you are sure that you are only transmitting kitten gif counts, and since those counts must be integers, you can even do this:
```ruby
require 'active_support'
require 'active_support/core_ext/object/json'

class BigDecimal
  def as_json(*)
    to_i
  end
end
```
This snippet converts all BigDecimal values in your application into Integers when encoding JSON.
If that sounded like a terrible idea, that's because it is a terrible idea! You see, changing global defaults like that would affect everything in your application. Even though your code, incidentally, only uses BigDecimal in places where the precision doesn't matter, the same may not be true of your gems and engines.
The point is, your claim that it "doesn't matter" to your application is probably overly strong. My personal recommendation is that you do this on a case-by-case basis, either on the server or on the client.
On the server, you can use the serializer pattern (something like Active Model Serializers or jsonapi-resources) to transform your data before encoding it. If you go with those, you can easily customize the type for each field based on your domain knowledge, and/or use the usual OO/Ruby/meta-programming techniques to make things more concise. Alternatively, you can use view-based libraries like jbuilder, for which you can write custom helpers for the same purpose.
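As a sketch of that case-by-case approach without pulling in a serializer gem (the class and field names here are hypothetical), each field can declare how its value should appear in JSON, instead of patching BigDecimal globally:

```ruby
require 'bigdecimal'
require 'json'

# Hypothetical hand-rolled serializer: each field declares its own cast,
# so precision is dropped only where the domain says it is safe.
class KittenStatsSerializer
  CASTS = {
    "gif_count"  => ->(v) { v.to_i },      # a count: safe as an integer
    "avg_rating" => ->(v) { v.to_f },      # precision loss is acceptable
    "balance"    => ->(v) { v.to_s("F") }  # money: keep exact, as a string
  }.freeze

  def initialize(record)
    @record = record
  end

  def to_json(*)
    JSON.generate(CASTS.to_h { |field, cast| [field, cast.call(@record.fetch(field))] })
  end
end
```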
On the client, you would need to use whatever transformation hooks your framework provides, which is out-of-scope for the discussion here.
Two extra data-points:
Arguably, it might be more "unexpected and buggy" to not encode them as strings.
@kamen-hursev the gem is scheduled to be EOL'ed with Rails 5. However, I believe that it should be fairly easy to support it for another minor release or so. If, after reading this response, you still think you have a good reason to go with that route, I can investigate extending the support for that gem.
I'm developing JSON APIs using Scala + Play Framework and found this issue whilst Googling to find out about how other frameworks handle raw (i.e. not string) numbers in JSON.
This is not actually true of Play Framework's JSON support: when reading JSON into a BigDecimal field, Play does not parse via a float/double; it parses the JSON text directly into a BigDecimal, so there is no loss of precision.
Just thought it might be useful for you to know that there are some clients out there that can parse the non-string version losslessly. I'm not making any suggestion about what Rails should do, just passing on this info :)