Worst performing JSON mapping framework #422

Closed
eimantas opened this Issue Oct 21, 2016 · 4 comments

@eimantas

Hi guys

Any comment on these benchmarks?

@jshier

jshier commented Oct 21, 2016

Contributor

As an Argo user, I find benchmarks like this largely irrelevant. Only the worst-designed backend APIs would return so much data that the performance difference between JSON frameworks would be noticeable. Plus it's all done asynchronously anyway, so it would hardly be felt. I've also found that enabling whole module optimization can have a big impact on Argo, and the benchmark doesn't note whether it was enabled.
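
For illustration, a minimal sketch of that asynchronous pattern, with parsing and decoding kept off the main queue so even a slow framework never blocks the UI. `User`, `decodeUsers`, `render`, and the payload here are hypothetical stand-ins, not Argo API:

```swift
import Foundation

// Hypothetical model and helpers for illustration; none of this is Argo API.
struct User { let name: String }

// Stand-in for whatever framework call turns the parsed tree into models.
func decodeUsers(_ object: Any) -> [User]? {
    guard let array = object as? [[String: Any]] else { return nil }
    return array.compactMap { dict in (dict["name"] as? String).map(User.init) }
}

func render(_ users: [User]?) {
    print("rendering \(users?.count ?? 0) users")
}

let payloadData = Data("[{\"name\": \"eimantas\"}]".utf8) // stand-in payload

DispatchQueue.global(qos: .userInitiated).async {
    // Parse and decode off the main thread, so the UI stays responsive
    // no matter how slow the JSON framework is.
    let object = try? JSONSerialization.jsonObject(with: payloadData)
    let users = object.flatMap(decodeUsers)
    DispatchQueue.main.async {
        render(users) // only finished models touch the main queue
    }
}
```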

That said, the biggest thing Argo could do for performance is write or adopt a native JSON parser to replace JSONSerialization. The Any -> JSON conversion step is the slowest part of Argo's parsing right now, so collapsing the two phases into a single step would greatly improve performance. Something like Freddy's JSONParser. Once that's taken care of, various micro-optimizations are likely possible. Also, I know there's a multithreaded array decoding PR up right now, which may help if you're decoding lots of huge arrays, but for most people I don't think it would do anything.
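
To make that cost concrete, here's a conceptual sketch (not Argo's actual source) of what an Any -> JSON pass involves: JSONSerialization has already built a complete Foundation tree, and the whole structure gets walked a second time just to rebuild it as a typed enum. A native parser would produce the enum directly from the bytes and skip this pass entirely:

```swift
import Foundation

// Simplified stand-in for Argo's JSON enum.
enum JSON {
    case object([String: JSON])
    case array([JSON])
    case string(String)
    case number(Double)
    case bool(Bool)
    case null
}

// A second full traversal of the tree JSONSerialization already built.
// (Real implementations disambiguate Bool vs. NSNumber more carefully.)
func toJSON(_ any: Any) -> JSON {
    switch any {
    case let dict as [String: Any]:
        var object: [String: JSON] = [:]
        for (key, value) in dict { object[key] = toJSON(value) }
        return .object(object)
    case let values as [Any]:
        return .array(values.map(toJSON))
    case let string as String:
        return .string(string)
    case let number as NSNumber:
        return .number(number.doubleValue)
    default:
        return .null
    }
}
```

Every case in that switch allocates a new node for data the parser already materialized once, which is why a single bytes-to-JSON pass pays off.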


@jshier

jshier commented Oct 21, 2016

Contributor

And it looks like the tester didn't even make his Argo implementation available, which is poor form.


@tonyd256

tonyd256 commented Oct 24, 2016

Contributor

The main focus for Argo has been safety, conciseness, and flexibility, achieved through Swift's strong types and common functional concepts, and I believe Argo does the best job in those areas. Argo does have some shortcomings that we've been transparent about since its birth: looking back through our issues, you'll see that runtime and compile-time performance hasn't always been stellar. Much of this is due to the Swift compiler not being as mature as the compilers for the functional languages Argo's concepts came from. The Swift compiler is still young and constantly improving, and Argo's performance will improve with it.

That being said, we are always trying to make Argo as performant as we can without giving up on what it is. We don't always have a ton of time to experiment with different ways to improve performance, but we encourage anyone to open PRs.

I think the main thing to keep in mind when reading any article online is that it might not capture the whole story. I agree with @jshier that 7 MB of JSON is quite a large payload, and applications should consider alternatives (like pagination) long before reaching that size. I also understand that you don't always control the backend your app consumes, so you have to choose dependencies based on your app's requirements, and those differ from app to app.
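
As a rough sketch of that pagination alternative, assuming a hypothetical endpoint and placeholder parameter names (`page`, `per_page`):

```swift
import Foundation

// Build the URL for one bounded page of results instead of one huge payload.
// "page"/"per_page" are placeholder parameter names, not from any real API.
func pageURL(base: URL, page: Int, perPage: Int = 100) -> URL? {
    var components = URLComponents(url: base, resolvingAgainstBaseURL: false)
    components?.queryItems = [
        URLQueryItem(name: "page", value: String(page)),
        URLQueryItem(name: "per_page", value: String(perPage)),
    ]
    return components?.url
}

// Example: page 3 of a hypothetical endpoint.
let url = pageURL(base: URL(string: "https://api.example.com/users")!, page: 3)
// -> https://api.example.com/users?page=3&per_page=100
```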


@tonyd256 tonyd256 closed this Oct 24, 2016
