
Long polling requests taking close to 2 minutes to complete after upgrading from SignalR 2.2.2 to 2.4.2 from Chrome v92 #4583

Open
o-sousa opened this issue Aug 26, 2021 · 4 comments
Labels: more-info-needed

Comments


o-sousa commented Aug 26, 2021

Hi,

I hope someone has a clue about this issue.

We have an app that has been running on Azure for almost 3 years using an older version of SignalR (v2.2.2 specifically). Because of an increase in users we wanted to move to a scale-out scenario using either Service Bus or Redis. We upgraded first to v2.3.0 and then to v2.4.2 using the NuGet package manager.
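
For context, this is roughly the kind of backplane registration we were looking at; a minimal sketch assuming the Microsoft.AspNet.SignalR.Redis package and placeholder server/key values, not our actual configuration:

```csharp
// OWIN Startup of an ASP.NET SignalR 2.x app.
// The scale-out backplane has to be registered before MapSignalR().
using Microsoft.AspNet.SignalR;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Redis backplane (Microsoft.AspNet.SignalR.Redis package);
        // arguments are server, port, password, event key - placeholder values here.
        GlobalHost.DependencyResolver.UseRedis("my-redis.example.com", 6379, "", "signalr");

        // Service Bus alternative (Microsoft.AspNet.SignalR.ServiceBus package):
        // GlobalHost.DependencyResolver.UseServiceBus("<connection string>", "signalr");

        app.MapSignalR();
    }
}
```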

After this upgrade we started seeing regular cases where the CPU would go above 90%, sometimes even causing 502 errors.

We have been chasing this for weeks and discovered a pattern where multiple HTTP POST calls to /signalr/poll are very slow and can take up to 2 minutes to complete. This causes severe slowdowns and makes the CPU usage go up.

Below is a screenshot from my web server logs.
[Screenshot: Long Polling Slow Issue]

Any idea what could be causing this?
By the way, the number of concurrent users is typically between 200 and 500.

Best regards
Osvaldo

@BrennanConroy (Member)

multiple HTTP POST calls to /signalr/poll are very slow and can take up to 2 minutes to complete

This is normal. Poll requests wait 110 seconds; if the server doesn't have any messages for the client in that time, the client cancels the request and issues another poll.
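
That 110-second window is SignalR's ConnectionTimeout, which normally only applies to the long-polling transport. A minimal sketch of where it lives, assuming an ASP.NET SignalR 2.x app configured at startup (the value shown is just the documented default):

```csharp
// Global.asax.cs (Application_Start) of an ASP.NET SignalR 2.x app.
using System;
using Microsoft.AspNet.SignalR;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        // How long an idle poll request is held open before the server ends it
        // and the client issues the next poll. 110 seconds is the default;
        // it is set here only to make the value explicit.
        GlobalHost.Configuration.ConnectionTimeout = TimeSpan.FromSeconds(110);
    }
}
```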

This causes severe slowdowns and makes the CPU usage go up.

Are you sure that's the cause? If you go back to 2.2.2 you should also see poll requests taking 110 seconds.


o-sousa commented Aug 26, 2021

To be direct, I'm not sure if long polling is the reason. What I see in the logs is a series of requests to /signalr/poll and requests to /signalr/connect (GETs taking more than 60 seconds) for the serverSentEvents transport. After that, even requests for static content start to take much longer to complete (for a small PNG resource).

I have also captured a .NET profiler trace. I see several GET requests to /signalr/connect lasting more than 62 seconds. Does this make sense? See the screenshot below.

[Screenshot: Performance Analyser]

@BrennanConroy (Member)

I have also captured a .NET profiler trace. I see several GET requests to /signalr/connect lasting more than 62 seconds. Does this make sense?

The connect request lasts for the duration of your SignalR connection, so if your connections are being closed every 60 seconds then this makes sense.
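
One way to confirm how long connections actually live, and which transport they negotiated, is to log connect/disconnect events in a hub; a rough sketch assuming an ASP.NET SignalR 2.x hub (the hub name and Trace calls are placeholders):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

// Hypothetical hub used only to trace connection lifetimes.
public class DiagnosticsHub : Hub
{
    public override Task OnConnected()
    {
        // "transport" in the query string is the negotiated transport
        // (webSockets, serverSentEvents, longPolling, ...).
        System.Diagnostics.Trace.TraceInformation(
            "Connected {0} via {1} at {2:O}",
            Context.ConnectionId, Context.QueryString["transport"], DateTime.UtcNow);
        return base.OnConnected();
    }

    public override Task OnDisconnected(bool stopCalled)
    {
        // stopCalled == false usually indicates a timeout rather than a clean stop.
        System.Diagnostics.Trace.TraceInformation(
            "Disconnected {0} (stopCalled={1}) at {2:O}",
            Context.ConnectionId, stopCalled, DateTime.UtcNow);
        return base.OnDisconnected(stopCalled);
    }
}
```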

Can you share a CPU trace and memory dump?

Because of an increase in users we wanted to move to a scale-out scenario using either Service Bus or Redis.

Did you switch to scale-out and start seeing issues, did you just upgrade and start seeing issues, or did your user count increase and then you started seeing issues?

BrennanConroy added the more-info-needed label on Sep 16, 2021
@linkmanishgupta

@o-sousa
We are having exactly the same issue as you described above, after doing exactly the same update that you mentioned.
Good to see this reported as an issue here.

I just want to know whether you found anything else relevant to this, or what actions you took.
It would be very helpful if you could share your findings/opinions here.

Thank you a lot in advance.
