Start each request on fresh ExecutionContext #14146

Merged: 1 commit merged into dotnet:master from benaadams:fresh-ExecutionContext, Apr 3, 2020

Conversation

@benaadams (Member) commented Sep 19, 2019

Review with Hide Whitespace on, as indentation changed.

Fixes #13991

Per-request allocations for the Plaintext sample remain the same.

Allocations for 17k requests across 256 connections

[image: allocation profile]

@benaadams (Member, Author) commented Sep 19, 2019

Per-request allocations for the Plaintext sample remain the same:
[image: allocation profile]

@benaadams force-pushed the fresh-ExecutionContext branch 2 times, most recently from 113f621 to 5fc4604, on September 19, 2019 22:59
@benaadams (Member, Author)

Cleaned up the encapsulation

@benaadams (Member, Author)

Since we are calling an extra async method anyway, I realised it could be greatly simplified.

(ignore whitespace)
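To see why the simplification works, here is a minimal, self-contained sketch (illustrative names only, not Kestrel's actual code) of how awaiting a separate per-request async method keeps AsyncLocal state from leaking between loop iterations:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Because an async method captures the ExecutionContext on entry and restores
// it on return, AsyncLocal changes made while handling one request do not flow
// back into the connection loop, so the next request starts clean.
class FreshContextSketch
{
    static readonly AsyncLocal<string> RequestState = new AsyncLocal<string>();

    static async Task ProcessRequest(int id)
    {
        // Simulates middleware mutating AsyncLocal state during a request.
        RequestState.Value = $"request-{id}";
        await Task.Yield();
    }

    static async Task Main()
    {
        for (int id = 1; id <= 3; id++)
        {
            // Each iteration observes no leaked state from the previous request.
            Console.WriteLine(RequestState.Value ?? "(fresh)");
            await ProcessRequest(id);
        }
    }
}
```

The key point is that no explicit ExecutionContext save/restore is needed; the await boundary around the per-request method does the isolation.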


    KestrelEventSource.Log.RequestStop(this);
    private async ValueTask ProcessRequest<TContext>(IHttpApplication<TContext> application)

@benaadams (Member, Author) commented on the diff:

Task is a more performant return type here than ValueTask and will allocate the same; i.e. none in the sync-completion case and the async box in the async-wait scenario (subject to dotnet/coreclr#26310, after which ValueTask becomes better as it doesn't allocate in the async case).

@halter73 (Member)

So much nicer! We probably still want to take the tests from #14027 though.

@halter73 (Member) left a review comment:

One subtle difference between this and #14027 is that the Response.OnCompleted callback will run with the same ExecutionContext as the middleware if the middleware isn't "async". With #14027, Response.OnCompleted gets a fresh ExecutionContext no matter what. I think this is probably fine, though, as long as we add a test and don't change behavior.
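The flow difference being discussed can be sketched with a standalone console program (hypothetical, not from the PR): a synchronous delegate mutates the caller's ExecutionContext directly, while the same mutation inside an awaited async method stays contained.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class OnCompletedFlowSketch
{
    static readonly AsyncLocal<string> Local = new AsyncLocal<string>();

    static async Task Main()
    {
        // A synchronous ("non-async") delegate mutates the AsyncLocal directly
        // on the caller's ExecutionContext, so later code running on the same
        // context observes the change...
        Action syncCallback = () => Local.Value = "set-synchronously";
        syncCallback();
        Console.WriteLine(Local.Value); // prints "set-synchronously"

        // ...whereas the same mutation inside an awaited async method happens
        // in that method's context copy and is not visible afterwards.
        await AsyncCallback();
        Console.WriteLine(Local.Value); // still "set-synchronously"
    }

    static async Task AsyncCallback()
    {
        Local.Value = "set-asynchronously";
        await Task.Yield();
    }
}
```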

@benaadams (Member, Author) commented Sep 21, 2019

Added tests

> I think this probably is fine though as long as we add a test and don't change behavior.

Also added a compat test and commented it as such: https://github.com/aspnet/AspNetCore/pull/14146/files#diff-d47b9baa7ec5338570136a89e5e8a4deR441-R444

public async Task ExecutionContextMutationsOfValueTypeFlowThroughNonAsyncEvents()
{
    // Note: This test is for future compat behaviour checking,
    // to ensure non-async OnStarting/OnCompleted flow AsyncLocal changes out

@davidfowl (Member)

We need to run performance tests

@benaadams (Member, Author)

> We need to run performance tests

Don't think it will be great :(

However... might as well make OnStarting/OnCompleted async, as they won't regress performance since they normally aren't called?

@benaadams (Member, Author)

Changed it to flow normally for non-async events also; since they won't run if you haven't defined any, it shouldn't affect performance if you choose not to use them.

@analogrelay (Contributor)

@halter73 are you good to merge this in for 5.0?

@davidfowl (Member)

> @halter73 are you good to merge this in for 5.0?

Not without performance tests.

@benaadams (Member, Author)

Is that auto perf-test CI thing working? #16999 (seems to be having versioning issues)

@davidfowl (Member) commented Nov 15, 2019

Maybe we can use the in-memory perf tests

@Tratcher (Member)

/azp run

@azure-pipelines

Azure Pipelines successfully started running 2 pipeline(s).

@benaadams (Member, Author)

@aspnet-hello benchmark

@pr-benchmarks (bot) commented Feb 15, 2020

Starting 'Default' pipelined plaintext benchmark with session ID 'f4894b6a4a114f84ae664bf363da98e7'. This could take up to 30 minutes...

@pr-benchmarks (bot) commented Feb 15, 2020

Baseline

Starting baseline run on '9aacea8903cd07a0aec6926ee77ac8bd5ca61dec'...
RequestsPerSecond:           349,064
Max CPU (%):                 99
WorkingSet (MB):             84
Avg. Latency (ms):           6.59
Startup (ms):                451
First Request (ms):          146.42
Latency (ms):                0.58
Total Requests:              5,269,024
Duration: (ms)               15,090
Socket Errors:               0
Bad Responses:               0
Build Time (ms):             18,506
Published Size (KB):         118,930
SDK:                         5.0.100-preview.1.20106.1
Runtime:                     5.0.0-preview.1.20113.7
ASP.NET Core:                5.0.0-preview.1.20114.15


PR

Starting PR run on '591115741701f3ebec1ebff61ff3ea2921d2ac5a'...
| Description |     RPS | CPU (%) | Memory (MB) | Avg. Latency (ms) | Startup (ms) | Build Time (ms) | Published Size (KB) | First Request (ms) | Latency (ms) | Errors | Ratio |
| ----------- | ------- | ------- | ----------- | ----------------- | ------------ | --------------- | ------------------- | ------------------ | ------------ | ------ | ----- |
|      Before | 349,064 |      99 |          84 |              6.59 |          451 |           18506 |              118930 |             146.42 |         0.58 |      0 |  1.00 |
|       After | 345,123 |      99 |          85 |              6.67 |          445 |            6503 |              118930 |             111.46 |         0.52 |      0 |  0.99 |


@analogrelay (Contributor)

@davidfowl are you good with the perf on this? I'd like to close it out, preferably by merging it ;P

@analogrelay (Contributor)

Ping @davidfowl @halter73

@analogrelay (Contributor)

Hearing nothing from @davidfowl, I'm going to go ahead and tell @halter73 he can merge away ;).

@davidfowl (Member)

Merge it; we can measure things now.

@benaadams (Member, Author)

Rebased

@benaadams (Member, Author)

@aspnet-hello benchmark

@pr-benchmarks (bot) commented Apr 2, 2020

Starting 'Default' pipelined plaintext benchmark with session ID '256b16f3fd1a44faa4552049ed33f796'. This could take up to 30 minutes...

@pr-benchmarks (bot) commented Apr 2, 2020

Baseline

Starting baseline run on 'f1145fb6d7f418b3f1987f94e9f457f8f4c2b18b'...
RequestsPerSecond:           346,907
Max CPU (%):                 99
WorkingSet (MB):             87
Avg. Latency (ms):           6.62
Startup (ms):                513
First Request (ms):          155.12
Latency (ms):                0.53
Total Requests:              5,238,465
Duration: (ms)               15,100
Socket Errors:               0
Bad Responses:               0
Build Time (ms):             20,520
Published Size (KB):         120,308
SDK:                         5.0.100-preview.2.20120.3
Runtime:                     5.0.0-preview.4.20201.1
ASP.NET Core:                5.0.0-preview.4.20201.16


PR

Starting PR run on '116424d3df2f099567e4a7cec337374c8816010b'...
| Description |     RPS | CPU (%) | Memory (MB) | Avg. Latency (ms) | Startup (ms) | Build Time (ms) | Published Size (KB) | First Request (ms) | Latency (ms) | Errors | Ratio |
| ----------- | ------- | ------- | ----------- | ----------------- | ------------ | --------------- | ------------------- | ------------------ | ------------ | ------ | ----- |
|      Before | 346,907 |      99 |          87 |              6.62 |          513 |           20520 |              120308 |             155.12 |         0.53 |      0 |  1.00 |
|       After | 347,472 |      99 |          88 |              6.69 |          519 |            7503 |              120308 |             117.76 |         0.52 |      0 |  1.00 |


@halter73 halter73 merged commit f17520c into dotnet:master Apr 3, 2020
@benaadams benaadams deleted the fresh-ExecutionContext branch April 3, 2020 01:06
halter73 added a commit that referenced this pull request Apr 4, 2020
Fixed by #14146 which starts each request on a fresh ExecutionContext
halter73 added a commit that referenced this pull request Apr 9, 2020
Fixed by #14146 which starts each request on a fresh ExecutionContext
@BIOsborne commented Mar 18, 2021

I have observed an issue (on AspNetCore version 3.1.13) with AsyncLocal context being leaked between requests. (Never in parallel; the context from one request that finishes appears to be present at the start of a subsequent one, and so on in a chain across many requests.)

I have not attempted (yet) to reproduce this locally, but have observed it reliably in our Production systems.
Interestingly - we are running 50% production traffic through Kestrel (running on Linux) and 50% running through Azure Windows hosted with IIS (as part of a live migration process). The issue appears only with Kestrel on Linux. Not sure if that is relevant - it is clear however that the hosting is affecting whether this issue is observed or not.

I have seen this issue is closed now: #13991
I assume we would not expect this issue to still be live?
If this is of interest, I can create a new issue, but won't do this if there are live issues already open around this that I have not been able to find.

A little more context: the AsyncLocal state tracked in our case gets initialised in a log sink which is NOT in an async method. (The only reason I mention that is that some comments in the linked issue suggest the method setting the AsyncLocal state should be async: #13991 (comment).)

Any input gratefully received - hope this isn't wasting your time! (I can open this as a new issue if that's best)
@anurse @davidfowl @benaadams
Thanks.

@ghost commented Mar 18, 2021

Hi @BIOsborne. It looks like you just commented on a closed PR. The team will most probably miss it. If you'd like to bring something important up to their attention, consider filing a new issue and add enough details to build context.

@alefranz (Contributor)

@BIOsborne this fix is in .NET 5 and has not been backported to 3.1.
As far as I am aware, the only way to work around the problem in 3.1 is what David suggested in the comment you pointed out.
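That workaround can be sketched as follows (illustrative names only, assumed from the linked comment; not actual AspNetCore code): initialise the AsyncLocal state inside an async method, so the mutation is confined to that method's ExecutionContext and cannot persist onto the connection's context between requests.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class LeakWorkaroundSketch
{
    static readonly AsyncLocal<string> LogScope = new AsyncLocal<string>();

    // Problematic pattern: a non-async log sink sets state on the caller's
    // ExecutionContext, which (on 3.1 Kestrel) could survive into the next
    // request on the same connection.
    static void InitScopeSync(string id) => LogScope.Value = id;

    // Workaround: the same initialisation inside an async method does not
    // flow back to the caller, so nothing is left behind to leak.
    static async Task HandleRequestAsync(string id)
    {
        LogScope.Value = id;
        await Task.Yield();
        // Awaited downstream code here still sees LogScope.Value == id.
    }

    static async Task Main()
    {
        InitScopeSync("request-1");
        Console.WriteLine(LogScope.Value); // "request-1" leaks to the caller

        LogScope.Value = null; // reset for the demo

        await HandleRequestAsync("request-2");
        Console.WriteLine(LogScope.Value ?? "(clean)"); // "(clean)" -- contained
    }
}
```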

@BIOsborne

> @BIOsborne this fix is in .NET 5 and has not been backported to 3.1.
> As far as I am aware, the only way to work around the problem in 3.1 is what David suggested in the comment you pointed out.

Thanks @alefranz for the quick reply; good to know.
We have been able to work around this in 3.1.* for now no problem.


@amcasey added the area-networking label and removed the area-runtime label, Jun 6, 2023
Successfully merging this pull request may close these issues.

Kestrel is causing AsyncLocal to bleed context data across requests on same HTTP/1.1 connection