Worst performing JSON mapping framework #422

Closed
eimantas opened this issue Oct 21, 2016 · 4 comments

@eimantas commented Oct 21, 2016

Hi guys,

Any comment on these benchmarks?

@jshier (Contributor) commented Oct 21, 2016

As an Argo user, I find benchmarks like this largely irrelevant. Only the worst-designed backend APIs would return so much data that the performance difference between JSON frameworks becomes noticeable, and it's all done asynchronously anyway. I've also found that enabling whole module optimization can have a big impact on Argo, and the benchmark doesn't note whether it was enabled or not.
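
For illustration, a minimal sketch of how one might check the WMO effect: the harness below is hypothetical (not the benchmark's actual code), the flag spellings are the Swift 3-era ones, and the workload is a stand-in for real Argo decoding.

```swift
import Foundation

// Hypothetical micro-harness: compile the same file with and without WMO
// and compare the printed timings, e.g.
//
//   swiftc -O Harness.swift -o perfile && ./perfile
//   swiftc -O -whole-module-optimization Harness.swift -o wmo && ./wmo
//
// In Xcode this corresponds to the "Fast, Whole Module Optimization"
// optimization level.

func measure(_ label: String, _ block: () -> Void) {
    let start = Date()
    block()
    print("\(label): \(Date().timeIntervalSince(start))s")
}

// Stand-in workload; substitute your real Argo decoding when measuring.
let data = "[{\"id\": 1, \"name\": \"a\"}]".data(using: .utf8)!

measure("parse x10_000") {
    for _ in 0..<10_000 {
        _ = try? JSONSerialization.jsonObject(with: data)
    }
}
```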

That said, the biggest thing Argo could do for performance is write or adopt a native JSON parser to replace JSONSerialization, something like Freddy's JSONParser. The Any -> JSON conversion step is the slowest part of Argo's parsing right now, so replacing it with a single step would greatly improve performance. Once that's taken care of, various micro-optimizations are likely possible. Also, I know there's a multithreaded array decoding PR up right now, which may help if you're decoding lots of huge arrays, but for most people I don't think it would do anything.
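
To make the two-step cost concrete, here's a minimal sketch. The JSON enum and the toJSON helper below are simplified, illustrative stand-ins, not Argo's actual types or API.

```swift
import Foundation

// A simplified stand-in for a typed JSON representation.
enum JSON {
    case object([String: JSON])
    case array([JSON])
    case string(String)
    case number(Double)
    case null
}

// Step 2: re-walk the untyped tree that JSONSerialization produced and
// rebuild it as typed JSON values. This entire second traversal (plus the
// intermediate Foundation objects) is the overhead a single-pass native
// parser would eliminate.
func toJSON(_ value: Any) -> JSON {
    switch value {
    case let dict as [String: Any]:
        return .object(dict.mapValues(toJSON))
    case let array as [Any]:
        return .array(array.map(toJSON))
    case let string as String:
        return .string(string)
    case let number as NSNumber:
        return .number(number.doubleValue) // bools also arrive as NSNumber
    default:
        return .null
    }
}

// Step 1: Foundation parses the bytes into Any (NSDictionary/NSArray/etc.).
let data = "{\"name\": \"Argo\", \"stars\": 3000}".data(using: .utf8)!
let any = try! JSONSerialization.jsonObject(with: data)
let json = toJSON(any) // the extra pass a native parser would fold into step 1
```

A native parser would build the typed tree directly from the raw bytes, collapsing both steps into one pass.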

@jshier (Contributor) commented Oct 21, 2016

And it looks like the tester didn't even make his Argo implementation available, which is poor form.

@jakecraige commented Oct 21, 2016

@tonyd256 (Contributor) commented Oct 24, 2016

The main focus for Argo has been safety, conciseness, and flexibility through Swift's strong types and common functional concepts, and I believe Argo does the best job in these areas. Argo does have some shortcomings that we've been transparent about since its birth. You can look back through many of our issues and see that runtime and compile-time performance haven't always been stellar. Much of this is due to the Swift compiler not being as mature as the compilers for the functional languages Argo's concepts came from. The Swift compiler is still young and constantly improving, and Argo's performance will improve with it.

That being said, we are always trying to make Argo as performant as we can without giving up on what it is. We don't always have a ton of time to experiment with different ways to improve performance, but we encourage anyone to open PRs.

I think the main thing to keep in mind when reading any article online is that it might not capture the whole story. I agree with @jshier that 7 MB of JSON is quite a large chunk of data, and applications should probably consider alternatives (like pagination) long before hitting that payload size. I also understand that you might not always have control over the backend your app uses, so you have to choose your dependencies based on your app's requirements, and those can differ from app to app.

@tonyd256 closed this Oct 24, 2016