
PrometheusExporter: New concurrency handling for scrape middleware + http server #2610

Merged
17 commits merged into open-telemetry:main from prometheus-thread-poc on Nov 15, 2021

Conversation

@CodeBlanch (Member) commented Nov 15, 2021

Changes

Removed the semaphore and added logic that returns the same response to concurrent scrape requests and then caches the response for a configurable period of time (10 seconds by default).
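
For illustration, here is a minimal C# sketch of that behavior under simplifying assumptions: concurrent callers await a single in-flight collection, and the completed response is reused until the cache window expires. The type and member names are invented for the example; this is not the actual PrometheusCollectionManager added in this PR, which instead shares the previously serialized buffer using interlocked reader counting.

```csharp
using System;
using System.Threading.Tasks;

// Simplified sketch of the scrape-caching behavior described above.
internal sealed class ScrapeResponseCache
{
    private readonly Func<Task<byte[]>> collect;
    private readonly TimeSpan cacheDuration;
    private readonly object gate = new object();
    private Task<byte[]> inFlight;
    private DateTime completedAtUtc;

    public ScrapeResponseCache(Func<Task<byte[]>> collect, int cacheDurationMilliseconds = 10_000)
    {
        this.collect = collect;
        this.cacheDuration = TimeSpan.FromMilliseconds(cacheDurationMilliseconds);
    }

    public Task<byte[]> GetResponseAsync()
    {
        lock (this.gate)
        {
            // A running collection (or a still-fresh completed one) is shared
            // with every caller; otherwise a new collection is started.
            bool expired = this.inFlight != null
                && this.inFlight.IsCompleted
                && DateTime.UtcNow - this.completedAtUtc >= this.cacheDuration;

            if (this.inFlight == null || expired)
            {
                this.inFlight = this.CollectAsync();
            }

            return this.inFlight;
        }
    }

    private async Task<byte[]> CollectAsync()
    {
        byte[] response = await this.collect().ConfigureAwait(false);
        this.completedAtUtc = DateTime.UtcNow;
        return response;
    }
}
```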

Public API Changes

public class PrometheusExporterOptions
{
   public int ScrapeResponseCacheDurationMilliseconds { get; set; }
}
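
For reference, a hedged usage example of the new option. It assumes the exporter's AddPrometheusExporter extension accepts a PrometheusExporterOptions configuration callback; the meter name is hypothetical.

```csharp
using OpenTelemetry;
using OpenTelemetry.Metrics;

public static class PrometheusScrapeCacheExample
{
    public static MeterProvider BuildMeterProvider()
    {
        return Sdk.CreateMeterProviderBuilder()
            .AddMeter("MyApplication.Metrics") // hypothetical meter name
            .AddPrometheusExporter(options =>
            {
                // Reuse cached scrape responses for 5 seconds instead of the
                // 10-second default.
                options.ScrapeResponseCacheDurationMilliseconds = 5000;
            })
            .Build();
    }
}
```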

TODOs

  • CHANGELOG.md updated for non-trivial changes
  • README.md update for new setting
  • Changes in public API reviewed

@CodeBlanch requested a review from a team as a code owner on November 15, 2021 07:50

codecov bot commented Nov 15, 2021

Codecov Report

Merging #2610 (33b341e) into main (770a367) will decrease coverage by 0.05%.
The diff coverage is 75.43%.

❗ Current head 33b341e differs from pull request most recent head f94bc5f. Consider uploading reports for the commit f94bc5f to get more accurate results
Impacted file tree graph

@@            Coverage Diff             @@
##             main    #2610      +/-   ##
==========================================
- Coverage   80.27%   80.22%   -0.06%     
==========================================
  Files         256      257       +1     
  Lines        8826     8844      +18     
==========================================
+ Hits         7085     7095      +10     
- Misses       1741     1749       +8     
| Impacted Files | Coverage Δ |
|---|---|
| ...y.Exporter.Prometheus/PrometheusExporterOptions.cs | 60.00% <33.33%> (-40.00%) ⬇️ |
| ...eus/Implementation/PrometheusExporterMiddleware.cs | 61.53% <60.00%> (-16.52%) ⬇️ |
| ...eus/Implementation/PrometheusExporterHttpServer.cs | 54.05% <65.00%> (-17.00%) ⬇️ |
| ...heus/Implementation/PrometheusCollectionManager.cs | 80.88% <80.88%> (ø) |
| ...ometheus/Implementation/PrometheusSerializerExt.cs | 33.33% <100.00%> (-3.38%) ⬇️ |
| ...elemetry.Exporter.Prometheus/PrometheusExporter.cs | 90.90% <100.00%> (-1.10%) ⬇️ |
| ...ter.ZPages/Implementation/ZPagesActivityTracker.cs | 100.00% <0.00%> (+2.85%) ⬆️ |
| ....Prometheus/Implementation/PrometheusSerializer.cs | 62.88% <0.00%> (+14.43%) ⬆️ |

@@ -218,5 +148,46 @@ private void WorkerProc()
}
}
}

private async Task ProcessRequestAsync(HttpListenerContext context)

Member:

This looks much cleaner 👍.

Interlocked.Increment(ref this.readerCount);
this.ExitGlobalLock();
#if NETCOREAPP3_1_OR_GREATER
return new ValueTask<ArraySegment<byte>>(this.previousDataView);

Member:

nit: for troubleshooting purposes, I think it'd be good to add some logging to indicate whether we re-used the previous view or triggered a new collection.

Member:

Or maybe use the Last-Modified response header. I think we should exclude the change from this PR (to keep it small).

Member (Author):

OK, I was working on a log message, but I'll hold off and do this as a follow-up. I like the idea of using the header.
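
As a rough illustration of the deferred header idea (not code from this PR), a follow-up could stamp the time the cached view was produced onto the response; the cachedAtUtc value and the HttpListener-based signature are assumptions for the example.

```csharp
using System;
using System.Net;

internal static class ScrapeResponseHeaders
{
    // Expose when the served view was produced so scrapers can distinguish a
    // fresh collection from a cached one.
    public static void SetLastModified(HttpListenerResponse response, DateTime cachedAtUtc)
    {
        // HTTP date headers use the RFC 1123 format ("R").
        response.AddHeader("Last-Modified", cachedAtUtc.ToString("R"));
    }
}
```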


for (var i = 0; i < keys.Length; i++)
int numberOfKeys = keys?.Length ?? 0;
if (numberOfKeys > 0)

Member (Author):

@reyang I think this was a bug. We were getting a NullReferenceException when keys/labels were not specified.

Member:

Yeah, I noticed this.

Member:

Let me add a test case for it once this PR is merged.

@CodeBlanch merged commit e871c27 into open-telemetry:main on Nov 15, 2021
@CodeBlanch deleted the prometheus-thread-poc branch on November 15, 2021 20:02
Comment on lines +77 to +80
if (i > 0)
{
buffer[cursor++] = unchecked((byte)',');
}

Contributor:

@CodeBlanch Would there be a benefit to moving this above the for loop (starting at i = 1 and also adding cursor = WriteLabel(buffer, cursor, keys[i], values[i]);)? Or would the compiler already do this loop unrolling?

Contributor:

Oops, I misread the code. Another idea would be to always add the comma character after the WriteLabel call and then decrement the cursor after the loop, so the next statement that modifies the buffer will overwrite it.

Member (Author):

Cool idea! Updated on #2749
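
For clarity, a rough sketch of that suggestion: always append the separator, then back the cursor up once after the loop so the next write overwrites the trailing comma. WriteLabel here is a simplified stand-in, not the serializer's actual helper, and the zero-label guard is added to keep the sketch self-contained.

```csharp
using System.Text;

internal static class LabelWriter
{
    public static int WriteLabels(byte[] buffer, int cursor, string[] keys, string[] values)
    {
        for (int i = 0; i < keys.Length; i++)
        {
            cursor = WriteLabel(buffer, cursor, keys[i], values[i]);

            // Unconditionally append the separator after every label...
            buffer[cursor++] = unchecked((byte)',');
        }

        if (keys.Length > 0)
        {
            // ...then back up once so the next write (e.g. '}') overwrites the
            // trailing comma.
            cursor--;
        }

        return cursor;
    }

    private static int WriteLabel(byte[] buffer, int cursor, string key, string value)
    {
        // Simplified stand-in for the real serializer: write key="value".
        foreach (byte b in Encoding.ASCII.GetBytes($"{key}=\"{value}\""))
        {
            buffer[cursor++] = b;
        }

        return cursor;
    }
}
```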

[MethodImpl(MethodImplOptions.AggressiveInlining)]
private bool ExecuteCollect()
{
this.exporter.OnExport = this.onCollectRef;

Contributor:

this.exporter.OnExport = this.OnCollect;
Would this not have worked?

Member (Author):

So what this is doing is preventing the allocation of a delegate. The only sure-fire way I know to do that is to use a field. If you do it locally, sometimes the compiler will generate a field and initialize it on the first invocation; other times it will just allocate a delegate for each call. Not totally sure how it decides. I can't remember if this particular case fell into the delegate path or if I was just being cautious. I did do some benchmarks on this when I was working on it, so I probably saw allocations and used the field, but I can't remember exactly.

Contributor:

Thanks for the explanation; I had a hunch it was for perf reasons.
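
To make the pattern concrete, here is a standalone sketch with illustrative names (not the exporter's actual types): the delegate is allocated once and stored in a field, whereas assigning the method group directly may allocate a new delegate on each call depending on what the compiler emits.

```csharp
using System;

internal sealed class CollectionManager
{
    // Allocated exactly once and reused for every collect cycle.
    private readonly Func<int, bool> onCollectRef;

    public CollectionManager()
    {
        this.onCollectRef = this.OnCollect;
    }

    public Func<int, bool> OnExport { get; set; }

    public bool ExecuteCollect(int timeoutMilliseconds)
    {
        // Reuse the cached delegate. Writing `this.OnExport = this.OnCollect;`
        // here is a method-group conversion; the compiler may not cache the
        // resulting delegate, so it can allocate on every call.
        this.OnExport = this.onCollectRef;
        return this.OnExport(timeoutMilliseconds);
    }

    private bool OnCollect(int timeoutMilliseconds)
    {
        // Placeholder for the actual collect-and-serialize work.
        return timeoutMilliseconds >= 0;
    }
}
```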

Comment on lines +68 to +71
if (value < 0)
{
throw new ArgumentOutOfRangeException(nameof(value), "Value should be greater than or equal to zero.");
}

Contributor:

Use Guard.Range(value, nameof(value), min: 0).
