Expand test coverage of OpenCensus-Shim by testing metrics integrations. #3835

Merged (12 commits, Nov 10, 2021)
3 changes: 2 additions & 1 deletion dependencyManagement/build.gradle.kts
@@ -41,7 +41,8 @@ val DEPENDENCY_SETS = listOf(
       "opencensus-api",
       "opencensus-impl-core",
       "opencensus-impl",
-      "opencensus-exporter-metrics-util"
+      "opencensus-exporter-metrics-util",
+      "opencensus-contrib-exemplar-util"
     )
   ),
   DependencySet(
27 changes: 14 additions & 13 deletions opencensus-shim/README.md
@@ -22,28 +22,29 @@ Applications only need to set up OpenTelemetry exporters, not OpenCensus.
 
 To allow the shim to work for metrics, add the shim as a dependency.
 
-Applications also need to pass the configured metric exporter to the shim:
+Applications also need to attach OpenCensus metrics to their metric readers on registration.
 
 ```
-OpenTelemetryMetricsExporter exporter =
-    OpenTelemetryMetricsExporter.createAndRegister(metricExporter);
+SdkMeterProvider.builder()
+    .registerMetricReader(
+        OpenCensusMetrics.attachTo(readerFactory)
+    )
+    .buildAndRegisterGlobal();
 ```
 
 For example, if a logging exporter were configured, the following would be
 added:
 
 ```
-LoggingMetricExporter metricExporter = new LoggingMetricExporter();
-OpenTelemetryMetricsExporter exporter =
-    OpenTelemetryMetricsExporter.createAndRegister(metricExporter);
-```
-
-The export interval can also be set:
-
-```
-OpenTelemetryMetricsExporter exporter =
-    OpenTelemetryMetricsExporter.createAndRegister(metricExporter,
-        Duration.create(0, 500));
+SdkMeterProvider.builder()
+    .registerMetricReader(
+        OpenCensusMetrics.attachTo(
+            PeriodicMetricReader.builder(metricExporter)
+                .newMetricReaderFactory()
+        )
+    )
+    .buildAndRegisterGlobal();
 ```
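For context, the two snippets above compose into a single setup. A minimal sketch (the logging exporter, the 500 ms interval, and the `setInterval` call are illustrative assumptions, not part of this diff):

```
// Illustrative wiring of the reader-based shim API from the README above.
// The exporter choice and interval are assumptions, not part of this PR.
import io.opentelemetry.exporter.logging.LoggingMetricExporter;
import io.opentelemetry.opencensusshim.OpenCensusMetrics;
import io.opentelemetry.sdk.metrics.SdkMeterProvider;
import io.opentelemetry.sdk.metrics.export.PeriodicMetricReader;
import java.time.Duration;

public final class ShimSetup {
  public static void main(String[] args) {
    SdkMeterProvider.builder()
        .registerMetricReader(
            OpenCensusMetrics.attachTo(
                PeriodicMetricReader.builder(new LoggingMetricExporter())
                    .setInterval(Duration.ofMillis(500)) // export cadence
                    .newMetricReaderFactory()))
        .buildAndRegisterGlobal();
  }
}
```

Note that the removed example's `Duration.create(0, 500)` is an OpenCensus (seconds, nanos) duration, i.e. 500 nanoseconds; with the SDK reader the cadence is configured on `PeriodicMetricReader` instead.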

## Known Problems
8 changes: 8 additions & 0 deletions opencensus-shim/build.gradle.kts
@@ -21,4 +21,12 @@ dependencies {
 
   testImplementation("org.slf4j:slf4j-simple")
   testImplementation("io.opencensus:opencensus-impl")
+  testImplementation("io.opencensus:opencensus-contrib-exemplar-util")
 }
+
+tasks.named<Test>("test") {
+  // We must force a fork per-test class because OpenCensus pollutes globals with no restorative
+  // methods available.
+  setForkEvery(1)
+  maxParallelForks = 3
+}

Review thread on `setForkEvery(1)`:

anuraaga (Contributor), Nov 9, 2021:
> Is it possible to split test sets to work around this, so we don't have to fork on every test case? forkEvery(1) goes sloooow

Author (Contributor):
> Only for some of the tests. I've tried to limit what's there to just the ones we need, but some of the coverage does need to actually set up OpenCensus.
>
> OpenCensus itself tended to use heavy mocking for various metric things. I really appreciate OTel's global-cleaning utilities.

anuraaga (Contributor):
> Yeah, if we can identify the tests that need it and split them into test suites, at least it doesn't have to fork for every single one.
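One possible shape for the split suggested above, as a sketch only: the `isolatedTest` task, the `*IsolatedTest` naming convention, and the wiring into `check` are hypothetical, not part of this PR.

```
// Hypothetical: run only the tests that mutate OpenCensus global state with
// fork-per-class, and everything else in ordinary shared test JVMs.
val isolatedTest = tasks.register<Test>("isolatedTest") {
  testClassesDirs = sourceSets.test.get().output.classesDirs
  classpath = sourceSets.test.get().runtimeClasspath
  filter { includeTestsMatching("*IsolatedTest") } // assumed naming convention
  setForkEvery(1)
}

tasks.named<Test>("test") {
  // The shared-JVM task skips the global-polluting tests entirely.
  filter { excludeTestsMatching("*IsolatedTest") }
}

tasks.named("check") { dependsOn(isolatedTest) }
```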
opencensus-shim/src/main/java/io/opentelemetry/opencensusshim/OpenTelemetryMetricsExporter.java

@@ -13,46 +13,27 @@
 import io.opencensus.metrics.Metrics;
 import io.opencensus.metrics.export.Metric;
 import io.opencensus.metrics.export.MetricDescriptor;
-import io.opencensus.metrics.export.Point;
-import io.opencensus.metrics.export.Summary;
-import io.opencensus.metrics.export.Summary.Snapshot;
-import io.opencensus.metrics.export.TimeSeries;
-import io.opentelemetry.api.common.Attributes;
-import io.opentelemetry.api.common.AttributesBuilder;
-import io.opentelemetry.sdk.common.InstrumentationLibraryInfo;
-import io.opentelemetry.sdk.metrics.data.AggregationTemporality;
-import io.opentelemetry.sdk.metrics.data.DoubleGaugeData;
-import io.opentelemetry.sdk.metrics.data.DoublePointData;
-import io.opentelemetry.sdk.metrics.data.DoubleSumData;
-import io.opentelemetry.sdk.metrics.data.DoubleSummaryData;
-import io.opentelemetry.sdk.metrics.data.DoubleSummaryPointData;
-import io.opentelemetry.sdk.metrics.data.LongGaugeData;
-import io.opentelemetry.sdk.metrics.data.LongPointData;
-import io.opentelemetry.sdk.metrics.data.LongSumData;
+import io.opentelemetry.opencensusshim.internal.metrics.MetricAdapter;
 import io.opentelemetry.sdk.metrics.data.MetricData;
-import io.opentelemetry.sdk.metrics.data.PointData;
-import io.opentelemetry.sdk.metrics.data.ValueAtPercentile;
 import io.opentelemetry.sdk.resources.Resource;
 import java.util.ArrayList;
 import java.util.Collection;
 import java.util.HashSet;
 import java.util.List;
 import java.util.Set;
-import java.util.concurrent.TimeUnit;
 import java.util.logging.Logger;
-import javax.annotation.Nonnull;
-import javax.annotation.Nullable;
 
+@Deprecated
 public final class OpenTelemetryMetricsExporter extends MetricExporter {
   private static final Logger LOGGER =
       Logger.getLogger(OpenTelemetryMetricsExporter.class.getName());
 
   private static final String EXPORTER_NAME = "OpenTelemetryMetricExporter";
-  private static final InstrumentationLibraryInfo INSTRUMENTATION_LIBRARY_INFO =
-      InstrumentationLibraryInfo.create("io.opentelemetry.opencensusshim", null);
 
   private final IntervalMetricReader intervalMetricReader;
   private final io.opentelemetry.sdk.metrics.export.MetricExporter otelExporter;
+  // TODO - find this from OTel SDK.
+  private final Resource resource = Resource.getDefault();
 
   public static OpenTelemetryMetricsExporter createAndRegister(
       io.opentelemetry.sdk.metrics.export.MetricExporter otelExporter) {
@@ -84,28 +65,7 @@ public void export(Collection<Metric> metrics) {
     List<MetricData> metricData = new ArrayList<>();
     Set<MetricDescriptor.Type> unsupportedTypes = new HashSet<>();
     for (Metric metric : metrics) {
-      for (TimeSeries timeSeries : metric.getTimeSeriesList()) {
-        AttributesBuilder attributesBuilder = Attributes.builder();
-        for (int i = 0; i < metric.getMetricDescriptor().getLabelKeys().size(); i++) {
-          if (timeSeries.getLabelValues().get(i).getValue() != null) {
-            attributesBuilder.put(
-                metric.getMetricDescriptor().getLabelKeys().get(i).getKey(),
-                timeSeries.getLabelValues().get(i).getValue());
-          }
-        }
-        Attributes attributes = attributesBuilder.build();
-        List<PointData> points = new ArrayList<>();
-        MetricDescriptor.Type type = null;
-        for (Point point : timeSeries.getPoints()) {
-          type = mapAndAddPoint(unsupportedTypes, metric, attributes, points, point);
-        }
-        if (type != null) {
-          MetricData md = toMetricData(type, metric.getMetricDescriptor(), points);
-          if (md != null) {
-            metricData.add(md);
-          }
-        }
-      }
+      metricData.add(MetricAdapter.convert(resource, metric));
     }
     if (!unsupportedTypes.isEmpty()) {
       LOGGER.warning(
@@ -117,148 +77,7 @@ public void export(Collection<Metric> metrics) {
     }
   }
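For context on what this rewritten path does end to end, a hypothetical smoke exercise of the deprecated exporter is sketched below; the measure, view, class name, and one-second interval are invented for illustration, and the PR's real coverage lives in the shim's test sources.

```
import io.opencensus.common.Duration;
import io.opencensus.stats.Aggregation;
import io.opencensus.stats.Measure.MeasureLong;
import io.opencensus.stats.Stats;
import io.opencensus.stats.View;
import io.opentelemetry.exporter.logging.LoggingMetricExporter;
import java.util.Collections;

public final class ShimExportSmoke {
  public static void main(String[] args) throws InterruptedException {
    // Register the (now deprecated) bridge so OpenCensus metrics flow to OTel,
    // exporting roughly once per second.
    OpenTelemetryMetricsExporter exporter =
        OpenTelemetryMetricsExporter.createAndRegister(
            new LoggingMetricExporter(), Duration.create(1, 0));

    // Record through the OpenCensus stats API so there is a Metric to convert.
    MeasureLong latency = MeasureLong.create("task_latency", "Task latency", "ms");
    View view =
        View.create(
            View.Name.create("task_latency_count"),
            "Count of latency samples",
            latency,
            Aggregation.Count.create(),
            Collections.emptyList());
    Stats.getViewManager().registerView(view);
    Stats.getStatsRecorder().newMeasureMap().put(latency, 42).record();

    Thread.sleep(2_000); // let the IntervalMetricReader run at least once
    exporter.stop();
  }
}
```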

-  private static MetricDescriptor.Type mapAndAddPoint(
-      Set<MetricDescriptor.Type> unsupportedTypes,
-      Metric metric,
-      Attributes attributes,
-      List<PointData> points,
-      Point point) {
-    long timestampNanos =
-        TimeUnit.SECONDS.toNanos(point.getTimestamp().getSeconds())
-            + point.getTimestamp().getNanos();
-    MetricDescriptor.Type type = metric.getMetricDescriptor().getType();
-    switch (type) {
-      case GAUGE_INT64:
-      case CUMULATIVE_INT64:
-        points.add(mapLongPoint(attributes, point, timestampNanos));
-        break;
-      case GAUGE_DOUBLE:
-      case CUMULATIVE_DOUBLE:
-        points.add(mapDoublePoint(attributes, point, timestampNanos));
-        break;
-      case SUMMARY:
-        points.add(mapSummaryPoint(attributes, point, timestampNanos));
-        break;
-      default:
-        unsupportedTypes.add(type);
-        break;
-    }
-    return type;
-  }
-
   public void stop() {
     intervalMetricReader.stop();
   }

-  @Nonnull
-  private static DoubleSummaryPointData mapSummaryPoint(
-      Attributes attributes, Point point, long timestampNanos) {
-    return DoubleSummaryPointData.create(
-        timestampNanos,
-        timestampNanos,
-        attributes,
-        point
-            .getValue()
-            .match(arg -> null, arg -> null, arg -> null, Summary::getCount, arg -> null),
-        point.getValue().match(arg -> null, arg -> null, arg -> null, Summary::getSum, arg -> null),
-        point
-            .getValue()
-            .match(
-                arg -> null,
-                arg -> null,
-                arg -> null,
-                OpenTelemetryMetricsExporter::mapPercentiles,
-                arg -> null));
-  }
-
-  private static List<ValueAtPercentile> mapPercentiles(Summary arg) {
-    List<ValueAtPercentile> percentiles = new ArrayList<>();
-    for (Snapshot.ValueAtPercentile percentile : arg.getSnapshot().getValueAtPercentiles()) {
-      percentiles.add(ValueAtPercentile.create(percentile.getPercentile(), percentile.getValue()));
-    }
-    return percentiles;
-  }
-
-  @Nonnull
-  private static DoublePointData mapDoublePoint(
-      Attributes attributes, Point point, long timestampNanos) {
-    return DoublePointData.create(
-        timestampNanos,
-        timestampNanos,
-        attributes,
-        point
-            .getValue()
-            .match(arg -> arg, Long::doubleValue, arg -> null, arg -> null, arg -> null));
-  }
-
-  @Nonnull
-  private static LongPointData mapLongPoint(
-      Attributes attributes, Point point, long timestampNanos) {
-    return LongPointData.create(
-        timestampNanos,
-        timestampNanos,
-        attributes,
-        point
-            .getValue()
-            .match(Double::longValue, arg -> arg, arg -> null, arg -> null, arg -> null));
-  }
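A note on the `arg -> null` chains in these deleted mappers: OpenCensus's `Value.match` is a visitor that takes one function per value kind (double, long, distribution, summary, default), and returning null flags the kinds a given mapper does not handle. A minimal, self-contained illustration (class and method names invented):

```
import io.opencensus.metrics.export.Value;

final class ValueMatchDemo {
  // Mirrors the deleted mapDoublePoint: handle double and long, null otherwise.
  static Double toDouble(Value value) {
    return value.match(
        doubleValue -> doubleValue, // ValueDouble
        Long::doubleValue,          // ValueLong
        distribution -> null,       // ValueDistribution
        summary -> null,            // ValueSummary
        ignored -> null);           // default / unknown kind
  }

  public static void main(String[] args) {
    System.out.println(toDouble(Value.longValue(42))); // prints 42.0
  }
}
```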

-  @Nullable
-  @SuppressWarnings("unchecked")
-  private static MetricData toMetricData(
-      MetricDescriptor.Type type,
-      MetricDescriptor metricDescriptor,
-      List<? extends PointData> points) {
-    if (metricDescriptor.getType() == null) {
-      return null;
-    }
-    switch (type) {
-      case GAUGE_INT64:
-        return MetricData.createLongGauge(
-            Resource.getDefault(),
-            INSTRUMENTATION_LIBRARY_INFO,
-            metricDescriptor.getName(),
-            metricDescriptor.getDescription(),
-            metricDescriptor.getUnit(),
-            LongGaugeData.create((List<LongPointData>) points));
-
-      case GAUGE_DOUBLE:
-        return MetricData.createDoubleGauge(
-            Resource.getDefault(),
-            INSTRUMENTATION_LIBRARY_INFO,
-            metricDescriptor.getName(),
-            metricDescriptor.getDescription(),
-            metricDescriptor.getUnit(),
-            DoubleGaugeData.create((List<DoublePointData>) points));
-
-      case CUMULATIVE_INT64:
-        return MetricData.createLongSum(
-            Resource.getDefault(),
-            INSTRUMENTATION_LIBRARY_INFO,
-            metricDescriptor.getName(),
-            metricDescriptor.getDescription(),
-            metricDescriptor.getUnit(),
-            LongSumData.create(
-                true, AggregationTemporality.CUMULATIVE, (List<LongPointData>) points));
-      case CUMULATIVE_DOUBLE:
-        return MetricData.createDoubleSum(
-            Resource.getDefault(),
-            INSTRUMENTATION_LIBRARY_INFO,
-            metricDescriptor.getName(),
-            metricDescriptor.getDescription(),
-            metricDescriptor.getUnit(),
-            DoubleSumData.create(
-                true, AggregationTemporality.CUMULATIVE, (List<DoublePointData>) points));
-      case SUMMARY:
-        return MetricData.createDoubleSummary(
-            Resource.getDefault(),
-            INSTRUMENTATION_LIBRARY_INFO,
-            metricDescriptor.getName(),
-            metricDescriptor.getDescription(),
-            metricDescriptor.getUnit(),
-            DoubleSummaryData.create((List<DoubleSummaryPointData>) points));
-      default:
-        return null;
-    }
-  }
 }