
[ASP.NET Core] Add support for .NET8.0 specific metrics #4934

Merged

Conversation

vishweshbankwar
Member

@vishweshbankwar vishweshbankwar commented Oct 7, 2023

Towards open-telemetry/opentelemetry-dotnet-contrib#1753


Changes

This PR adds support for ASP.NET Core metrics that are available natively when targeting .NET 8.0.

The following metrics will now be enabled by default. They will be exported based on the individual application's configuration and scenarios (a usage sketch follows the list below).

  • Meter : Microsoft.AspNetCore.Hosting

    • http.server.request.duration
    • http.server.active_requests
  • Meter : Microsoft.AspNetCore.Server.Kestrel

    • kestrel.active_connections
    • kestrel.connection.duration
    • kestrel.rejected_connections
    • kestrel.queued_connections
    • kestrel.queued_requests
    • kestrel.upgraded_connections
    • kestrel.tls_handshake.duration
    • kestrel.active_tls_handshakes
  • Meter : Microsoft.AspNetCore.Http.Connections

    • signalr.server.connection.duration
    • signalr.server.active_connections
  • Meter : Microsoft.AspNetCore.Routing

    • aspnetcore.routing.match_attempts
  • Meter : Microsoft.AspNetCore.Diagnostics

    • aspnetcore.diagnostics.exceptions
  • Meter : Microsoft.AspNetCore.RateLimiting

    • aspnetcore.rate_limiting.active_request_leases
    • aspnetcore.rate_limiting.request_lease.duration
    • aspnetcore.rate_limiting.queued_requests
    • aspnetcore.rate_limiting.request.time_in_queue
    • aspnetcore.rate_limiting.requests
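
For illustration, a minimal sketch (not part of this diff) of how an application could pick these instruments up; AddAspNetCoreInstrumentation() is the extension this package provides, while the console exporter is just a placeholder assumed from the OpenTelemetry.Exporter.Console package:

    using OpenTelemetry;
    using OpenTelemetry.Metrics;

    // Minimal sketch: AddAspNetCoreInstrumentation() subscribes to the .NET 8 meters
    // listed above (Hosting, Kestrel, Http.Connections, Routing, Diagnostics, RateLimiting).
    using var meterProvider = Sdk.CreateMeterProviderBuilder()
        .AddAspNetCoreInstrumentation()
        .AddConsoleExporter() // placeholder exporter; any configured exporter works
        .Build();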

TODO: Update Readme

Merge requirement checklist

  • CONTRIBUTING guidelines followed (nullable enabled, static analysis, etc.)
  • Unit tests added/updated
  • Appropriate CHANGELOG.md files updated for non-trivial changes
  • Changes in public API reviewed (if applicable)

@codecov

codecov bot commented Oct 9, 2023

Codecov Report

Merging #4934 (8066515) into main (50d3af0) will decrease coverage by 0.04%.
The diff coverage is 86.66%.

❗ Current head 8066515 differs from pull request most recent head 3b0e27e. Consider uploading reports for the commit 3b0e27e to get more accurate results.

Impacted file tree graph

@@            Coverage Diff             @@
##             main    #4934      +/-   ##
==========================================
- Coverage   83.31%   83.28%   -0.04%     
==========================================
  Files         295      295              
  Lines       12361    12375      +14     
==========================================
+ Hits        10299    10306       +7     
- Misses       2062     2069       +7     
Flag | Coverage Δ
unittests | 83.28% <86.66%> (-0.04%) ⬇️

Flags with carried forward coverage won't be shown.

Files | Coverage Δ
...ry.Instrumentation.AspNetCore/AspNetCoreMetrics.cs | 100.00% <ø> (ø)
...NetCore/AspNetCoreMetricsInstrumentationOptions.cs | 71.42% <ø> (ø)
...ementation/AspNetCoreInstrumentationEventSource.cs | 72.72% <100.00%> (+2.72%) ⬆️
...AspNetCore/Implementation/HttpInMetricsListener.cs | 70.88% <100.00%> (+0.37%) ⬆️
...ation.AspNetCore/MeterProviderBuilderExtensions.cs | 94.59% <83.33%> (-5.41%) ⬇️

... and 7 files with indirect coverage changes

@vishweshbankwar vishweshbankwar marked this pull request as ready for review October 10, 2023 16:00
@vishweshbankwar vishweshbankwar requested a review from a team as a code owner October 10, 2023 16:00
@samsp-msft

@JamesNK @davidfowl PTAL

@samsp-msft

Do we want the rate limiting metrics to be included by default, or do they only get emitted if rate limiting is used?

@cijothomas
Member

Do we want the rate limiting metrics to be included by default, or do they only get emitted if rate limiting is used?

what is rate limiting?

@vishweshbankwar
Member Author

Do we want the rate limiting metrics to be included by default, or do they only get emitted if rate limiting is used?

what is rate limiting?

Please see public async Task ValidateNet8RateLimitingMetricsAsync(). These metrics should only be emitted when the user has configured rate limiting (a rough sketch of such a configuration follows).
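
For context, a rough sketch (not from this PR) of the kind of rate limiting setup that would cause the Microsoft.AspNetCore.RateLimiting instruments listed in the description to emit; the policy name, limits, and endpoint are arbitrary:

    using System;
    using System.Threading.RateLimiting;
    using Microsoft.AspNetCore.Builder;
    using Microsoft.AspNetCore.RateLimiting;
    using Microsoft.Extensions.DependencyInjection;

    var builder = WebApplication.CreateBuilder();

    // Arbitrary fixed-window policy purely for illustration.
    builder.Services.AddRateLimiter(options =>
        options.AddFixedWindowLimiter("fixed", limiterOptions =>
        {
            limiterOptions.PermitLimit = 4;
            limiterOptions.Window = TimeSpan.FromSeconds(10);
        }));

    var app = builder.Build();

    app.UseRateLimiter();
    app.MapGet("/", () => "Hello").RequireRateLimiting("fixed");

    app.Run();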

@cijothomas
Member

Do we want the rate limiting metrics to be included by default, or do they only get emitted if rate limiting is used?

what is rate limiting?

Please see public async Task ValidateNet8RateLimitingMetricsAsync(). These metrics should only be emitted when the user has configured rate limiting.

Oops, sorry! I misread it as something related to throttling within the SDK!

…pNetCoreInstrumentationEventSource.cs

Co-authored-by: Utkarsh Umesan Pillai <66651184+utpilla@users.noreply.github.com>
vishweshbankwar and others added 2 commits October 19, 2023 14:12
…derExtensions.cs

Co-authored-by: Utkarsh Umesan Pillai <66651184+utpilla@users.noreply.github.com>
// We need to let the metric callback execute, as it runs AFTER the response is returned.
// In the unit test environment there may be a lot of unit tests executing in parallel, so
// give the callbacks some breathing room to complete.
await Task.Delay(TimeSpan.FromSeconds(1)).ConfigureAwait(false);
Contributor

This seems super unreliable. Why not use a mock listener?

Member

what is a mock listener here? MeterListener?

Member

@vishweshbankwar If you performed await app.DisposeAsync(); here would that cause everything to flush out in a more deterministic way?

Member Author

@CodeBlanch - That would require configuring the MeterProvider via services.AddOpenTelemetry() - is that right?

Member

You could keep it how you have it. Just dispose both...

await app.DisposeAsync(); // Flush AspNetCore
this.meterProvider.Dispose(); // Flush manual provider

Should be fine.

You could do it all from the host if you wanted to, like this...

    var metricItems = new List<Metric>();

    var builder = WebApplication.CreateBuilder();
    builder.Logging.ClearProviders();

    builder.Services.AddOpenTelemetry()
        .WithMetrics(metrics => metrics
            .AddAspNetCoreInstrumentation()
            .AddInMemoryExporter(metricItems));

    var app = builder.Build();

    app.MapGet("/", () => "Hello");

    _ = app.RunAsync();

    using var client = new HttpClient();
    var res = await client.GetStringAsync("http://localhost:5000/").ConfigureAwait(false);
    Assert.NotNull(res);

    await app.DisposeAsync(); // This should flush AspNetCore and the MeterProvider in this case

    // Inspect metricItems here; it should be populated with everything that happened.

Member Author

Thanks! I will follow up on this. Opened #4972 for tracking.

Member

@cijothomas cijothomas left a comment

LGTM for the pre-release.
Need the following before stable:

  1. README updates, with View recipes to drop the metrics that are not part of the semantic conventions (a sketch follows below). Also cover the enrich/filter part, linking to the .NET docs where appropriate.
  2. Benchmarks to see whether the extra metrics affect perf. If they do, we need to enable only the ones from the semantic conventions. Irrespective of that, I'd like to know why we would enable the extra metrics by default; I think they should be opt-in only.
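
For reference, a hypothetical View recipe of the kind item 1 asks for; the dropped instrument and the console exporter are arbitrary examples, not part of this PR:

    using OpenTelemetry;
    using OpenTelemetry.Metrics;

    // Drop one of the extra instruments via a View.
    using var meterProvider = Sdk.CreateMeterProviderBuilder()
        .AddAspNetCoreInstrumentation()
        .AddView("kestrel.active_connections", MetricStreamConfiguration.Drop)
        .AddConsoleExporter() // placeholder exporter
        .Build();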

_ = app.RunAsync();

using var client = new HttpClient();
var res = await client.GetStringAsync("http://localhost:5000/").ConfigureAwait(false);
Member

If we really are expecting these tests to run in parallel (I don't think they actually do), we should probably randomize the port and handle the case where it is already in use? I know we do that for other tests where we need to start up a listener.

@davidfowl Any other patterns we could perhaps use to cause just the pieces of ASP.NET Core we need to fire, instead of exercising the whole pipeline?
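
For what it's worth, a rough sketch of the dynamic-port idea, assuming Kestrel replaces a port-0 address with the port it actually bound once the host has started:

    using System.Linq;
    using System.Net.Http;
    using Microsoft.AspNetCore.Builder;

    var builder = WebApplication.CreateBuilder();
    var app = builder.Build();

    app.Urls.Add("http://127.0.0.1:0"); // port 0: let the OS pick a free port
    app.MapGet("/", () => "Hello");

    await app.StartAsync();

    // After startup the server addresses collection holds the resolved port.
    var baseAddress = app.Urls.First();

    using var client = new HttpClient();
    var res = await client.GetStringAsync(baseAddress);

    await app.DisposeAsync();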

Contributor

Yes, I think these tests need to be rewritten to use WebApplicationFactory and fakes for the meter listener.

cc @JamesNK as he wrote tests for meters in ASP.NET Core.

Member Author

Randomizing the port is something I was planning to do to avoid conflicts (we do have open issues related to it right now).

We use WebApplicationFactory in most of the tests here. This new pattern is used because it is relatively lightweight and does not require an additional template app like WebApplicationFactory does. If WebApplicationFactory is the recommended way, then we can stick with it.
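
For comparison, a minimal sketch of the WebApplicationFactory pattern; the test class, the Program entry point, and the asserted instrument name are illustrative assumptions, not code from this repository:

    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Mvc.Testing;
    using OpenTelemetry;
    using OpenTelemetry.Metrics;
    using Xunit;

    public class AspNetCoreMetricsTests : IClassFixture<WebApplicationFactory<Program>>
    {
        private readonly WebApplicationFactory<Program> factory;

        public AspNetCoreMetricsTests(WebApplicationFactory<Program> factory)
            => this.factory = factory;

        [Fact]
        public async Task RequestEmitsHttpServerRequestDuration()
        {
            var exportedItems = new List<Metric>();
            using var meterProvider = Sdk.CreateMeterProviderBuilder()
                .AddAspNetCoreInstrumentation()
                .AddInMemoryExporter(exportedItems)
                .Build();

            // CreateClient spins up an in-memory TestServer, so no real port is needed.
            using var client = this.factory.CreateClient();
            using var response = await client.GetAsync("/");

            meterProvider.ForceFlush();
            Assert.Contains(exportedItems, m => m.Name == "http.server.request.duration");
        }
    }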

Member Author

same as #4934 (comment)

Member

@CodeBlanch CodeBlanch left a comment

Some test feedback, but LGTM. We could improve the tests as a follow-up.

@utpilla utpilla merged commit 38e21a9 into open-telemetry:main Oct 20, 2023
68 checks passed