This repository has been archived by the owner on Dec 18, 2018. It is now read-only.

Run hub invocations serially #2086

Merged: 6 commits into release/2.1 on Apr 19, 2018

Conversation

BrennanConroy (Member) commented on the diff:
@@ -184,7 +184,7 @@ public class HubConnectionHandler<THub> : ConnectionHandler where THub : Hub
{
// Don't wait on the result of execution, continue processing other
// incoming messages on this connection.
Not valid, remove.

@@ -184,7 +184,7 @@ public class HubConnectionHandler<THub> : ConnectionHandler where THub : Hub
{
// Don't wait on the result of execution, continue processing other
// incoming messages on this connection.
-                    _ = _dispatcher.DispatchMessageAsync(connection, message);
+                    await _dispatcher.DispatchMessageAsync(connection, message);
Member:

Does this mean only one message can be invoked at a time for a connection? What happens if a hub method takes 5 seconds to complete because it is calling a slow DB? Do all other messages wait for it to finish?

JamesNK (Member), Apr 19, 2018:

Scenario: I am connected to jabbr Core. I send David a private message containing a link to github. On the server, jabbr parses the message, sees the github link, and makes a request to github to include more details.

At the same time I am sending messages in another chat room. Will those other messages only be invoked and sent out to other users in the room once the private message has finished being processed?

Contributor:

It does, but I believe this is also how ASP.NET SignalR works. This helps propagate back-pressure on the server back up through the client. If the server is backed up waiting to send data (in response to received messages), it slows down processing received messages, which eventually slows down the client, etc...

If you want to invoke a long-running thing in the background, you can fire and forget a task in the body of your Hub method and use IHubContext<T> to send a response to the Client via a Client invocation when the long-running task is complete.
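A minimal sketch of that fire-and-forget pattern, assuming ASP.NET Core SignalR's IHubContext&lt;T&gt; API (the hub name, method names, and the DoSlowWorkAsync helper are invented for illustration):

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class WorkHub : Hub
{
    private readonly IHubContext<WorkHub> _hubContext;

    public WorkHub(IHubContext<WorkHub> hubContext)
    {
        _hubContext = hubContext;
    }

    public Task StartWork(string input)
    {
        // Capture the connection id before the hub instance is disposed.
        var connectionId = Context.ConnectionId;

        // Fire and forget: the hub method returns immediately, so other
        // messages on this connection are not blocked behind the slow work.
        _ = Task.Run(async () =>
        {
            var result = await DoSlowWorkAsync(input); // e.g. a slow DB call

            // Notify the caller via a client invocation once the
            // long-running task is complete.
            await _hubContext.Clients.Client(connectionId)
                .SendAsync("WorkCompleted", result);
        });

        return Task.CompletedTask;
    }

    private static async Task<string> DoSlowWorkAsync(string input)
    {
        await Task.Delay(5000);
        return input.ToUpperInvariant();
    }
}
```

Note the sketch uses the injected IHubContext&lt;T&gt; inside the background task rather than the hub's own Clients property, since the hub instance may already be disposed by the time the work finishes.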

Member:

Ok, as long as everyone who knows more about SignalR understands and expects that behavior. It doesn't seem like something that can easily be undone. If V1 runs messages serially, then V2 won't be able to change to run in parallel without breaking some hearts.

I think it should be added as a comment here, and also spelled out in documentation. Devs are used to firing 10 requests at MVC and having them run in parallel.

Member:

Although I guess if there is load balancing and multiple instances of the SignalR server then messages could still end up running in parallel so people shouldn't rely on that logic.

Contributor:

I agree that we should make sure we understand the behavior. @davidfowl can you confirm how ASP.NET SignalR did dispatching? Last we talked I think you said it did it serially, like this.

davidfowl (Member), Apr 19, 2018:

Yes, this is how old SignalR worked when you used websockets. Regardless, this is a better way to handle dispatching. Running multiple calls to the hub in parallel means there's no self-throttling, and another rate-limiting mechanism would have to be introduced to make this reasonable.

Member:

ok!

JamesNK (Member) commented Apr 19, 2018:

I'd like a couple of unit tests:

  1. Sequentially dispatching invoke messages
  2. Multiple streams returning results and not blocking other messages being dispatched

var connectionHandler = serviceProvider.GetService<HubConnectionHandler<BlockingHub>>();

// Because we use PipeScheduler.Inline the hub invocations will run inline until they wait, which happens inside the BlockingMethod call
using (var client = new TestClient(new JsonHubProtocol()))
Member:
Isn't JsonHubProtocol the default?

@@ -491,6 +493,34 @@ public class SimpleTypedHub : Hub<ITypedHubClient>
}
}

public class BlockingHub : Hub
{
private TcsService _tcs;
Member:

_tcsService, not _tcs

_tcs = tcs;
}

public async Task<int> BlockingMethod()
Member:

This isn't blocking...

await tcsService.StartedMethod.Task.OrTimeout();

// Invoke another hub method which will wait for the first method to finish
var nonBlockingTask = client.SendInvocationAsync(nameof(LongRunningHub.SimpleMethod), nonBlocking: false);
Member:
Assert.False(nonBlockingTask.IsCompleted)
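Putting the suggested assert together with the surrounding test fragment might look like this. This is a sketch only: tcsService, LongRunningHub, client, and the OrTimeout helper are taken on faith from the PR's test code, and EndMethod is a hypothetical completion source name.

```csharp
// Wait until the first hub method has started and is parked on its
// TaskCompletionSource.
await tcsService.StartedMethod.Task.OrTimeout();

// Invoke another hub method; with serial dispatch it must queue behind
// the in-flight invocation.
var nonBlockingTask = client.SendInvocationAsync(
    nameof(LongRunningHub.SimpleMethod), nonBlocking: false);

// The second invocation must not have completed yet.
Assert.False(nonBlockingTask.IsCompleted);

// Unblock the first method (EndMethod is a hypothetical name) and let
// the queued invocation run to completion.
tcsService.EndMethod.TrySetResult(null);
await nonBlockingTask.OrTimeout();
```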

davidfowl (Member) left a comment:

Add the assert

muratg left a comment:

shiproom approved 👍

// Don't wait on the result of execution, continue processing other
// incoming messages on this connection.
-_ = _dispatcher.DispatchMessageAsync(connection, message);
+await _dispatcher.DispatchMessageAsync(connection, message);
Member:

Add a comment like:
"Messages are dispatched sequentially and will block other messages until complete (except streaming invocations). Note that developers can't depend on messages running one after another once the server is scaled out."

@@ -88,7 +88,8 @@ public partial class DefaultHubDispatcher<THub> : HubDispatcher<THub> where THub

case StreamInvocationMessage streamInvocationMessage:
Log.ReceivedStreamHubInvocation(_logger, streamInvocationMessage);
-await ProcessInvocation(connection, streamInvocationMessage, isStreamedInvocation: true);
+// Fire-and-forget stream invocations, otherwise they would block other hub invocations from being able to run
+_ = ProcessInvocation(connection, streamInvocationMessage, isStreamedInvocation: true);
JamesNK (Member), Apr 19, 2018:

Do we truly want to fire and forget the entire streaming invocation like this? I wonder if streaming invocations should run sequentially like other messages until the IAsyncEnumerator<T> is returned from the method, and it is only iterating over the IAsyncEnumerator<T> that is fire and forget. That would make them more consistent with other invocations.

Using the current logic in this PR I think it would still be possible to wreak havoc doing something like this:

public class BroadcastHub : Hub
{
    public async Task<ChannelReader<int>> Broadcast(string message)
    {
        await Clients.All.SendAsync(message);

        var channel = Channel.CreateUnbounded<int>();
        channel.Writer.Complete();

        return channel;
    }
}

Contributor:

I mostly agree with @JamesNK here. Though to be fair, this isn't a full defense against bad code; there's still nothing stopping a poorly-written Hub method from Task.Run-ing stuff. Still, if we await invocation of the method and then only fire-and-forget the logic to drain the stream, it would fit user expectations better, I think.

BrennanConroy (Member, Author):

Re-review

@BrennanConroy BrennanConroy merged commit 1957655 into release/2.1 Apr 19, 2018
@BrennanConroy BrennanConroy deleted the brecon/serialHub branch April 19, 2018 23:59
}
finally
{
if (disposeScope)
Contributor:
You forgot to rename this but meh...

Member (Author):

Haha
