
fix: do not clear other labelsets when updating metrics #941

Merged

Conversation

@dyladan (Member) commented Apr 7, 2020

Originally reported by @obecny in gitter:

The following code:

'use strict';

const { PrometheusExporter } = require('@opentelemetry/exporter-prometheus');
const { MeterProvider } = require('@opentelemetry/metrics');
const exporter = new PrometheusExporter(
  {
    startServer: true,
  },
  () => {
    console.log('prometheus scrape endpoint: http://localhost:9464/metrics');
  },
);

const meter = new MeterProvider({
  interval: 5000,
  exporter: exporter,
}).getMeter('foo');


const counter = meter.createCounter('foo', {
  monotonic: true,
  labelKeys: ['name'],
  description: 'foo',
});

const boundCounter1 = counter.bind({ name: 'foo1' });
const boundCounter2 = counter.bind({ name: 'foo2' });

boundCounter1.add(12);
boundCounter2.add(15);

gave the following Prometheus export:

# HELP foo foo
# TYPE foo counter
foo{name="foo2"} 15

Expected:

# HELP foo foo
# TYPE foo counter
foo{name="foo1"} 12
foo{name="foo2"} 15

This was caused by two issues:

  1. In the batcher, there is no longer a labelset identifier that can be used to distinguish batches, so records with different labels collide.
  2. In the exporter, clearing a metric cleared it for all labels rather than only the labelset being updated.
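To illustrate the first issue, here is a hypothetical sketch (not the actual PR code; `Batcher`, `labelSetKey`, `process`, and `checkPointSet` are made-up names for this example) of keying batched records by both the metric name and a canonical serialization of the labels, so that records for `{ name: 'foo1' }` and `{ name: 'foo2' }` occupy separate slots instead of overwriting each other:

```javascript
// Build a stable key from a labelset. Sorting the keys means
// { a: 1, b: 2 } and { b: 2, a: 1 } produce the same key.
function labelSetKey(labels) {
  return Object.keys(labels)
    .sort()
    .map(k => `${k}=${labels[k]}`)
    .join(',');
}

class Batcher {
  constructor() {
    this._batchMap = new Map();
  }

  process(metricName, labels, value) {
    // Compound key: without the labelSetKey part, a record for a
    // second labelset would overwrite the first -- the bug above.
    const key = `${metricName}|${labelSetKey(labels)}`;
    this._batchMap.set(key, { metricName, labels, value });
  }

  checkPointSet() {
    return Array.from(this._batchMap.values());
  }
}

const batcher = new Batcher();
batcher.process('foo', { name: 'foo1' }, 12);
batcher.process('foo', { name: 'foo2' }, 15);
console.log(batcher.checkPointSet().length); // 2: both labelsets survive
```

With only the metric name as the key, `checkPointSet()` would return a single record holding the last-written labelset, which matches the truncated export shown below.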

@dyladan dyladan self-assigned this Apr 7, 2020
@dyladan dyladan added the bug Something isn't working label Apr 7, 2020
@obecny (Member) left a comment:
Can you add some unit tests for that?

@codecov-io commented Apr 7, 2020

Codecov Report

Merging #941 into master will decrease coverage by 0.00%.
The diff coverage is 94.73%.

@@            Coverage Diff             @@
##           master     #941      +/-   ##
==========================================
- Coverage   94.94%   94.94%   -0.01%     
==========================================
  Files         247      248       +1     
  Lines       11142    11173      +31     
  Branches     1065     1068       +3     
==========================================
+ Hits        10579    10608      +29     
- Misses        563      565       +2     
Impacted Files Coverage Δ
...ackages/opentelemetry-metrics/test/Batcher.test.ts 92.59% <92.59%> (ø)
...pentelemetry-exporter-prometheus/src/prometheus.ts 92.07% <100.00%> (+0.07%) ⬆️
...emetry-exporter-prometheus/test/prometheus.test.ts 98.70% <100.00%> (+<0.01%) ⬆️
...ckages/opentelemetry-metrics/src/export/Batcher.ts 100.00% <100.00%> (ø)

@dyladan (Member, Author) commented Apr 7, 2020

@obecny @mayurkale22 added tests

@mayurkale22 (Member) left a comment:
Great :)

@obecny (Member) left a comment:

lgtm

@mayurkale22 mayurkale22 merged commit 083b2d6 into open-telemetry:master Apr 7, 2020
@dyladan dyladan deleted the counter-labels branch April 8, 2020 18:41
pichlermarc pushed a commit to dynatrace-oss-contrib/opentelemetry-js that referenced this pull request Dec 15, 2023
Labels
bug Something isn't working

6 participants