
Allow JSON Serializers to be swapped out #4

Closed
mythz opened this issue Mar 9, 2013 · 14 comments
@mythz

mythz commented Mar 9, 2013

Just having a scan through the code, it looks like it might be tied to .NET's built-in JavaScriptSerializer? I've found that implementation to be quite slow, so it would be nice if you could make the JSON serialization swappable, e.g. we could inject a different serializer by implementing a bespoke interface like IJsonConverter and hook it up in a similar way to Startup.Invoke - e.g. maybe if Startup implements IJsonConverter we could use that instead?

Otherwise, if that's not possible, maybe just allow JSON strings to be marshaled back and forth, i.e. without the overhead of JavaScriptSerializer, so we can deserialize them ourselves in .NET land.
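To make that second option concrete, something like the following untested sketch (assuming edge's inline-C# edge.func syntax; the ServiceStack.Text call and the MyDto type mentioned in the comment are placeholders):

var edge = require('edge');

// Pass a pre-serialized JSON string across the boundary so the .NET side
// can deserialize it with any serializer it likes, bypassing JavaScriptSerializer.
var handle = edge.func(function () {/*
    async (input) => {
        var json = (string)input;
        // e.g. ServiceStack.Text: JsonSerializer.DeserializeFromString<MyDto>(json)
        return json.Length;
    }
*/});

handle(JSON.stringify({ name: 'Northwind', items: [1, 2, 3] }), function (error, result) {
    if (error) throw error;
    console.log(result); // length of the JSON string that crossed the boundary
});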

@tjanczuk
Owner

The use of a JSON serializer to help marshal data from .NET to node.js is temporary. Going forward it will be removed completely in favor of direct type marshaling between the CLR and V8, similar to how marshaling from node.js to the CLR is implemented currently. Given that, exposing such an abstraction at this point is not desirable.

@mythz
Author

mythz commented Mar 10, 2013

OK cool, sounds good.

@mythz mythz closed this as completed Mar 10, 2013
@mythz mythz reopened this Mar 28, 2013
@mythz
Author

mythz commented Mar 28, 2013

Actually, I'm re-opening this so you can let us know when you've removed the JSON Serializer in favor of the proposed direct type marshaling.

Any guess of an ETA on this?

@tjanczuk
Owner

Do you have a specific problem with the use of JavaScriptSerializer? You mentioned performance before - is that the motivation? I think I'd rather have some perf benchmarks first to be able to quantify the effect of any changes in this space. Perf benchmarks are covered by #27.

Also, pull requests welcome ;)

@mythz
Author

mythz commented Mar 28, 2013

Yes, the performance of JSS sucks. I had to pull it out of the Northwind Benchmarks being measured because it was infeasible to wait for it to run with any high value of N.

Basically I want to marshal the entire request and response through to a ServiceStack back-end to handle, and I'd prefer to start work once the final solution is in place rather than before it gets changed again.

I think you're the best person to implement the ideal marshaling, as you'd know the optimal way to do it better than anyone else.

@tjanczuk
Owner

Can you contribute some benchmarks based on your payload profile so that I can have something real to measure against?

@mythz
Author

mythz commented Mar 28, 2013

Benchmarks with different JSON serializers in edge.js? Are the JSON serializers swappable atm?

@tjanczuk
Owner

I mean as part of #27 - can you add a measurement of how long it takes to marshal your sample payload from JS to .NET (or back)? Run it in a loop 10,000 times and average the time, that kind of thing. Once we have that benchmark we will be able to assess the effect of the changes I'm making in the code on your scenario.
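A minimal sketch of that kind of loop (the echo lambda and the payload below are placeholders; substitute the real request shape):

var edge = require('edge');

// Echo the payload through .NET so marshaling in both directions is measured.
var echo = edge.func(function () {/*
    async (input) => { return input; }
*/});

var payload = { id: 1, name: 'Northwind', items: [1, 2, 3] };
var N = 10000;
var remaining = N;
var start = Date.now();

(function next() {
    echo(payload, function (error) {
        if (error) throw error;
        if (--remaining > 0) return next();
        console.log('average latency (ms):', (Date.now() - start) / N);
    });
})();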

@mythz
Author

mythz commented Mar 28, 2013

Okie cool, I'll see what I can stitch together.

@tjanczuk
Owner

I've started some work in the perf branch. Here is the first test: https://github.com/tjanczuk/edge/blob/perf/performance/marshal_clr2v8.js. It measures the latency of returning a simple object from .NET to node.js and also provides a baseline of doing the same purely in node.js. The baseline is there not because that is the scenario to compare against (a proper comparison point would be a cross-process call rather than an in-process call from node.js to the CLR), but because having a baseline allows us to assess latency in relative terms and compare performance results across machines in a meaningful way.

So in my case it appears calling .NET from node.js is currently about 23x slower than calling node.js from node.js.

C:\projects\edge\performance>node marshal_clr2v8.js clr2v8
{ rss: 44965888,
  heapTotal: 12504832,
  heapUsed: 3033500,
  latency: 0.09741 }

C:\projects\edge\performance>node marshal_clr2v8.js baseline
{ rss: 14454784,
  heapTotal: 12504832,
  heapUsed: 1508136,
  latency: 0.00429 }

Now, let's get to work.
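The baseline half of the comparison can be approximated with a pure JavaScript echo in the same harness, along these lines (a sketch, not the actual contents of marshal_clr2v8.js):

// Pure node.js baseline: same payload, same loop, no CLR boundary crossing.
// Dividing the clr2v8 latency by this number gives the relative figure (~23x here).
function echoJs(input, callback) { callback(null, input); }

var payload = { id: 1, name: 'Northwind', items: [1, 2, 3] };
var N = 10000;
var start = Date.now();

for (var i = 0; i < N; i++) {
    echoJs(payload, function () {});
}

console.log('baseline average latency (ms):', (Date.now() - start) / N);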

@mythz
Author

mythz commented Mar 29, 2013

Brilliant, thx for the info. I'll try to put some benchmarks together over the weekend as well.

@tjanczuk
Owner

With b857ca2 the dependency on JSON serialization for marshaling from the CLR to V8 is removed. Out of the gate this gives a 25% improvement.

C:\projects\edge\performance>node marshal_clr2v8.js clr2v8
{ rss: 41701376,
  heapTotal: 12504832,
  heapUsed: 2389464,
  latency: 0.07374 }

Time to break out the profiler.

@tjanczuk
Owner

Here is one low hanging fruit:

[profiler screenshot: perf1]

Making support for ScriptIgnoreAttribute optional (set EDGE_ENABLE_SCRIPTSUPPORTATTRIBUTE=1 if you want it) improves perf by 47%, and 60% cumulative compared to the starting point. Fix in 1e81d94.

C:\projects\edge\performance>node marshal_clr2v8.js clr2v8
{ rss: 41496576,
  heapTotal: 12504832,
  heapUsed: 2367296,
  latency: 0.03906 }
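
For anyone who does rely on ScriptIgnoreAttribute, re-enabling it is just a matter of setting that variable before the run (Windows syntax, matching the sessions above):

C:\projects\edge\performance>set EDGE_ENABLE_SCRIPTSUPPORTATTRIBUTE=1

C:\projects\edge\performance>node marshal_clr2v8.js clr2v8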

@tjanczuk
Owner

I am going to close this issue and open a new one for the targeted investigation that remains.

After all these changes, marshaling from the CLR to V8 takes 36% of the process time, inclusive. There is no obvious low hanging fruit any more, except marshaling from byte[] to Buffer, which currently allocates memory to copy the data. I will open a new issue to investigate whether this copy can be avoided; it would shave off another ~17% of the time.

[profiler screenshots: perf3, perf2]
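
For reference, the byte[] path in question looks like this from user code (a sketch; returning a .NET byte[] surfaces as a node.js Buffer, and that crossing is where the copy currently happens):

var edge = require('edge');

// Returning a byte[] from .NET yields a Buffer in node.js; today the data
// is copied into fresh memory as it crosses the boundary.
var getBytes = edge.func(function () {/*
    async (input) => { return new byte[64 * 1024]; }
*/});

getBytes(null, function (error, buffer) {
    if (error) throw error;
    console.log(Buffer.isBuffer(buffer), buffer.length); // true 65536
});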
