Conversation
- Introduce CharArrayTextReader, a simple TextReader implemented over a UTF-8 buffer. It decodes directly into the char buffer allocated by JSON.NET.
ac4e05d to 128ce47
- Renamed CharArrayTextReader to Utf8BufferTextReader
- Fixed length issue on netstandard version
```csharp
var textReader = new Utf8BufferTextReader(payload);
messages.Add(ParseMessage(textReader, binder));
```
nit: remove line
```diff
-using (var reader = new JsonTextReader(new StreamReader(input)))
+using (var reader = new JsonTextReader(input))
```
Add an internal helper method for creating a JsonTextReader with a shared array pool?
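Such a helper might look roughly like this (a sketch, not code from this PR; the class and method names are illustrative). JSON.NET exposes an `IArrayPool<char>` hook on `JsonTextReader`, which can be backed by `ArrayPool<char>.Shared`:

```csharp
using System.Buffers;
using System.IO;
using Newtonsoft.Json;

internal static class JsonUtils
{
    // Adapter from ArrayPool<char>.Shared to JSON.NET's IArrayPool<char>.
    private sealed class JsonArrayPool : IArrayPool<char>
    {
        public static readonly JsonArrayPool Shared = new JsonArrayPool();

        public char[] Rent(int minimumLength) => ArrayPool<char>.Shared.Rent(minimumLength);

        public void Return(char[] array) => ArrayPool<char>.Shared.Return(array);
    }

    // Hypothetical helper: every call site creates its reader through here,
    // so the pooling configuration lives in one place.
    public static JsonTextReader CreateJsonTextReader(TextReader textReader)
    {
        return new JsonTextReader(textReader) { ArrayPool = JsonArrayPool.Shared };
    }
}
```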
```csharp
private ReadOnlyMemory<byte> _utf8Buffer;

public Utf8BufferTextReader(ReadOnlyMemory<byte> utf8Buffer)
```
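The fields above suggest the reader's overall shape. A minimal sketch of how such a reader could decode (not the PR's actual implementation; it uses the span-based `Decoder.Convert` overload, which is only available on newer targets such as .NET Core 2.1+):

```csharp
using System;
using System.IO;
using System.Text;

// Sketch: a TextReader that decodes UTF-8 bytes straight into the char
// buffer supplied by the caller (e.g. the buffer JsonTextReader rents),
// avoiding an intermediate string allocation.
public class Utf8BufferTextReader : TextReader
{
    private readonly Decoder _decoder = Encoding.UTF8.GetDecoder();
    private ReadOnlyMemory<byte> _utf8Buffer;

    public Utf8BufferTextReader(ReadOnlyMemory<byte> utf8Buffer)
    {
        _utf8Buffer = utf8Buffer;
    }

    public override int Read(char[] buffer, int index, int count)
    {
        if (_utf8Buffer.IsEmpty)
        {
            return 0;
        }

        // Decode as many bytes as fit into the caller's buffer; the next
        // call resumes from the remaining slice.
        _decoder.Convert(_utf8Buffer.Span, buffer.AsSpan(index, count),
                         flush: true,
                         out var bytesUsed, out var charsWritten, out _);
        _utf8Buffer = _utf8Buffer.Slice(bytesUsed);
        return charsWritten;
    }
}
```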
Every time I read HubEndPoint and see the ToArray on line 150, I don't like it. See below. What if HubProtocol took a sequence, which was then passed to this text reader? The logic in Utf8BufferTextReader would stay largely the same, except it could read over each memory block in the sequence instead of just one. We could avoid copying the sequence in its entirety into a new array.
SignalR/src/Microsoft.AspNetCore.SignalR.Core/HubEndPoint.cs
Lines 138 to 168 in 8b8c710
```csharp
var result = await connection.Input.ReadAsync(connection.ConnectionAbortedToken);
var buffer = result.Buffer;
var consumed = buffer.End;
var examined = buffer.End;

try
{
    if (!buffer.IsEmpty)
    {
        var hubMessages = new List<HubMessage>();

        // TODO: Make this incremental
        if (connection.Protocol.TryParseMessages(buffer.ToArray(), _dispatcher, hubMessages))
        {
            foreach (var hubMessage in hubMessages)
            {
                // Don't wait on the result of execution, continue processing other
                // incoming messages on this connection.
                _ = _dispatcher.DispatchMessageAsync(connection, hubMessage);
            }
        }
    }
    else if (result.IsCompleted)
    {
        break;
    }
}
finally
{
    connection.Input.AdvanceTo(consumed, examined);
}
```
This change will be made after @anurse's refactoring. The problem is that right now the client and server are asymmetrical: they don't both parse ReadOnlySequence&lt;byte&gt;. The client uses a byte[] and the server uses ReadOnlySequence&lt;byte&gt;. This will be done as part of #1508.
The plan is to:
- Move the reconnect logic to HubConnection instead of IConnection
- Make IConnection single use, single reader and writer
- Change the IConnection client to use pipelines
- Change the parsers to be incremental
- Change Utf8TextReader to take the ReadOnlySequence&lt;byte&gt; directly from the pipe
I'm hoping that after we do that, we'll be blazingly fast on the client and server and will allocate very little.
Next after that will be object pooling of various non buffer things.
Could we make the server and Utf8BufferTextReader work with ReadOnlySequence&lt;byte&gt; now? On the client we can just wrap the byte[] in a ReadOnlySequence&lt;byte&gt; with little to no overhead before parsing, no? Memory and sequence are just structs.
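A small sketch of the point being made (the payload here is illustrative): wrapping an array in a single-segment sequence is just constructing a struct, with no copy involved, so client code could parse through the same `ReadOnlySequence<byte>`-based API as the server.

```csharp
using System;
using System.Buffers;
using System.Text;

class Program
{
    static void Main()
    {
        // A client-side byte[] payload (illustrative content).
        byte[] payload = Encoding.UTF8.GetBytes("{\"type\":1}");

        // Wrapping the array is just a struct construction: no copy of
        // the bytes, no heap allocation for the wrapper itself.
        var sequence = new ReadOnlySequence<byte>(payload);

        Console.WriteLine(sequence.IsSingleSegment); // True
        Console.WriteLine(sequence.Length);          // 10
    }
}
```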
I don't really see the point. I mean I actually want to do it, just not prematurely.
I also hate ToArray() with a passion but there is a master plan here and it's going to be glorious when it comes together. These are mostly incremental steps towards that shared goal. I expect to see dramatic performance improvements in the next couple of weeks.
Closing in favor of #1635
LGTM (post-merge)
```diff
@@ -7,13 +7,13 @@
   "@std/esm": {
     "version": "0.18.0",
     "resolved": "https://registry.npmjs.org/@std/esm/-/esm-0.18.0.tgz",
-    "integrity": "sha512-oeHSSVp/WxC08ngpKgyYR4LcI0+EBwZiJcB58jvIqyJnOGxudSkxTgAQKsVfpNsMXfOoILgu9PWhuzIZ8GQEjw==",
+    "integrity": "sha1-4hK1Zcdl+Tsp7FIqSToZLOeuK6Y=",
```
Upgrade your NPM bro
[Benchmark results: Baseline vs. This PR]
Fixes #1618