@@ -184,7 +184,7 @@ public class HubConnectionHandler&lt;THub&gt; : ConnectionHandler where THub : Hub
 {
     // Don't wait on the result of execution, continue processing other
     // incoming messages on this connection.
Not valid, remove.
@@ -184,7 +184,7 @@ public class HubConnectionHandler&lt;THub&gt; : ConnectionHandler where THub : Hub
 {
     // Don't wait on the result of execution, continue processing other
     // incoming messages on this connection.
-    _ = _dispatcher.DispatchMessageAsync(connection, message);
+    await _dispatcher.DispatchMessageAsync(connection, message);
Does this mean only one message can be invoked at a time for a connection? What happens if a hub method takes 5 seconds to complete because it is calling a slow DB? Do all other messages wait for it to finish?
Scenario: I am connected to jabbr Core. I send David a private message containing a link to github. On the server jabbr parses the message and sees the github link, makes a request to github to include more details.
At the same time I am sending messages in another chat room. Will those other messages only be invoked and sent out to other users in the room once the private message has finished being processed?
It does, but I believe this is also how ASP.NET SignalR works. This helps propagate back-pressure on the server back up through the client. If the server is backed up waiting to send data (in response to received messages), it slows down processing received messages, which eventually slows down the client, etc...
If you want to invoke a long-running thing in the background, you can fire and forget a task in the body of your Hub method and use IHubContext&lt;T&gt; to send a response to the Client via a Client invocation when the long-running task is complete.
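A minimal sketch of that pattern, assuming a hypothetical hub, a hypothetical LongOperationAsync helper, and an illustrative "WorkCompleted" client method (none of these names come from the PR):

```csharp
public class WorkHub : Hub
{
    private readonly IHubContext<WorkHub> _hubContext;

    public WorkHub(IHubContext<WorkHub> hubContext)
    {
        _hubContext = hubContext;
    }

    public Task StartWork(string input)
    {
        // Capture the connection id up front; the Hub instance is disposed
        // once this method returns, so Context can't be used inside the task.
        var connectionId = Context.ConnectionId;

        // Fire and forget: the hub method completes immediately, so it does
        // not hold up subsequent invocations on this connection.
        _ = Task.Run(async () =>
        {
            var result = await LongOperationAsync(input); // hypothetical long-running work
            await _hubContext.Clients.Client(connectionId).SendAsync("WorkCompleted", result);
        });

        return Task.CompletedTask;
    }

    private static Task<string> LongOperationAsync(string input)
        => Task.FromResult(input); // placeholder for the slow DB call, HTTP request, etc.
}
```

Note the trade-off: work started this way is outside the sequential dispatch loop, so it gets no back-pressure and no automatic error propagation to the caller.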
Ok, as long as everyone who knows more about SignalR understands and expects that behavior. It doesn't seem like something that can easily be undone. If V1 runs messages serially then V2 won't be able to change to run in parallel without breaking some hearts.
I think it should be added as a comment here, and also spelled out in documentation. Devs are used to firing 10 requests at MVC and having them run in parallel.
Although I guess if there is load balancing and multiple instances of the SignalR server then messages could still end up running in parallel so people shouldn't rely on that logic.
I agree that we should make sure we understand the behavior. @davidfowl can you confirm how ASP.NET SignalR did dispatching? Last we talked I think you said it did it serially, like this.
Yes, this is how old SignalR worked when you used WebSockets. Regardless, this is a better way to handle dispatching. Running multiple calls to the hub in parallel means there's no self-throttling, and another rate-limiting mechanism would have to be introduced to make this reasonable.
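The self-throttling being described falls out of the shape of the receive loop. A simplified sketch (not the actual HubConnectionHandler code; ParseMessages is a hypothetical helper standing in for the protocol parser):

```csharp
// Simplified receive loop: because each dispatch is awaited before the next
// message is read from the pipe, a slow hub method stops the loop from
// reading, the transport buffers fill, and the client is eventually slowed
// down by flow control. No separate rate limiter is needed.
while (true)
{
    var result = await connection.Transport.Input.ReadAsync();
    if (result.IsCompleted)
    {
        break;
    }

    foreach (var message in ParseMessages(result.Buffer)) // hypothetical parsing helper
    {
        await _dispatcher.DispatchMessageAsync(connection, message);
    }
}
```

With `_ = DispatchMessageAsync(...)` instead of `await`, the loop would read as fast as the client can send, and every in-flight invocation would consume server resources with nothing pushing back.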
ok!
I'd like a couple of unit tests
|
var connectionHandler = serviceProvider.GetService<HubConnectionHandler<BlockingHub>>(); | ||
|
||
// Because we use PipeScheduler.Inline the hub invocations will run inline until they wait, which happens inside the BlockingMethod call | ||
using (var client = new TestClient(new JsonHubProtocol())) |
Isn't JsonHubProtocol the default?
@@ -491,6 +493,34 @@ public class SimpleTypedHub : Hub&lt;ITypedHubClient&gt;
     }
 }

+public class BlockingHub : Hub
+{
+    private TcsService _tcs;
_tcsService, not _tcs
    _tcs = tcs;
}

public async Task&lt;int&gt; BlockingMethod()
This isn't blocking...
await tcsService.StartedMethod.Task.OrTimeout();

// Invoke another hub method which will wait for the first method to finish
var nonBlockingTask = client.SendInvocationAsync(nameof(LongRunningHub.SimpleMethod), nonBlocking: false);
Assert.False(nonBlockingTask.IsCompleted)
Add the assert
shiproom approved 👍
 // Don't wait on the result of execution, continue processing other
 // incoming messages on this connection.
-_ = _dispatcher.DispatchMessageAsync(connection, message);
+await _dispatcher.DispatchMessageAsync(connection, message);
Add a comment like:
Messages are dispatched sequentially and will block others until complete (except streaming invocations). Note that developers can't depend on messages running one after another after scaling out
@@ -88,7 +88,8 @@ public partial class DefaultHubDispatcher&lt;THub&gt; : HubDispatcher&lt;THub&gt; where THub

 case StreamInvocationMessage streamInvocationMessage:
     Log.ReceivedStreamHubInvocation(_logger, streamInvocationMessage);
-    await ProcessInvocation(connection, streamInvocationMessage, isStreamedInvocation: true);
+    // Fire-and-forget stream invocations, otherwise they would block other hub invocations from being able to run
+    _ = ProcessInvocation(connection, streamInvocationMessage, isStreamedInvocation: true);
Do we truly want to fire and forget the entire streaming invocation like this? I wonder if streaming invocations should run sequentially like other messages until the IAsyncEnumerator&lt;T&gt; is returned from the method, and only the iteration over the IAsyncEnumerator&lt;T&gt; is fire and forget. That would make them more consistent with other invocations.
Using the current logic in this PR I think it would still be possible to wreak havoc doing something like this:
public class BroadcastHub : Hub
{
    public async Task&lt;ChannelReader&lt;int&gt;&gt; Broadcast(string message)
    {
        await Clients.All.SendAsync(message);
        var channel = Channel.CreateUnbounded&lt;int&gt;();
        channel.Writer.Complete();
        return channel;
    }
}
I mostly agree with @JamesNK here. Though to be fair, this isn't a full defense against bad code; there's still nothing stopping a poorly-written Hub method from Task.Run-ing stuff. Still, if we await invocation of the method and then only fire-and-forget the logic to drain the stream, it would fit user expectations more, I think.
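One way to read that suggestion, sketched against a hypothetical split of ProcessInvocation into an invoke step and a drain step (InvokeStreamMethod and StreamResultsAsync are illustrative names, not the PR's actual code):

```csharp
case StreamInvocationMessage streamInvocationMessage:
    Log.ReceivedStreamHubInvocation(_logger, streamInvocationMessage);

    // Run the hub method body sequentially, exactly like a normal invocation.
    // Any Clients.All.SendAsync the method does before returning its stream
    // still participates in back-pressure.
    var enumerator = await InvokeStreamMethod(connection, streamInvocationMessage);

    // Only the loop that pumps items from the returned stream to the client
    // is fire-and-forget, so a long-lived stream doesn't block later messages.
    _ = StreamResultsAsync(connection, streamInvocationMessage.InvocationId, enumerator);
    break;
```

Under this shape, the BroadcastHub example above could no longer race ahead of other invocations: the SendAsync inside the method body runs before dispatch moves on.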
Re-review
}
finally
{
    if (disposeScope)
You forgot to rename this but meh...
Haha
#2084