Spans are not being exported by OTEL Collector when using API/SDK version 1.11.0 #2624

Closed
mjmorales opened this issue Apr 20, 2022 · 3 comments
Labels
bug Something isn't working

Comments

@mjmorales

Environment setup

  • Python version 3.9.5
  • otel-contrib collector version 0.48.0
  • OTEL Collector Config
# otel-agent.config.yaml
receivers:
  otlp:
    protocols:
      grpc:
      http:

processors:
  batch:

exporters:
  logging:
    loglevel: debug
  file: 
    path: spans.json

extensions:
  health_check:
  pprof:
  zpages:

service:
  extensions: [health_check,pprof,zpages]
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [logging, file]
  • Installed dependencies in requirements.txt (more configurations tested below)
opentelemetry-api==1.11.0
opentelemetry-sdk==1.11.0
opentelemetry-exporter-otlp==1.11.0

Steps to reproduce

  1. Run the 0.48.0 version of the OpenTelemetry Contrib Collector:
otel-test % ./otelcol-contrib --version
2022/04/20 08:59:19 proto: duplicate proto type registered: jaeger.api_v2.PostSpansRequest
2022/04/20 08:59:19 proto: duplicate proto type registered: jaeger.api_v2.PostSpansResponse
otelcol-contrib version 0.48.0

otel-test % ./otelcol-contrib --config otel-agent.config.yaml
2022/04/20 08:59:56 proto: duplicate proto type registered: jaeger.api_v2.PostSpansRequest
2022/04/20 08:59:56 proto: duplicate proto type registered: jaeger.api_v2.PostSpansResponse
2022-04-20T08:59:56.831-0400    info    builder/exporters_builder.go:255   Exporter was built.     {"kind": "exporter", "name": "logging"}
2022-04-20T08:59:56.831-0400    info    builder/pipelines_builder.go:224   Pipeline was built.     {"kind": "pipeline", "name": "traces"}
2022-04-20T08:59:56.832-0400    info    builder/receivers_builder.go:226   Receiver was built.     {"kind": "receiver", "name": "otlp", "datatype": "traces"}
2022-04-20T08:59:56.832-0400    info    service/service.go:82      Starting extensions...
2022-04-20T08:59:56.832-0400    info    extensions/extensions.go:38        Extension is starting...  {"kind": "extension", "name": "health_check"}
2022-04-20T08:59:56.832-0400    info    healthcheckextension@v0.48.0/healthcheckextension.go:44    Starting health_check extension    {"kind": "extension", "name": "health_check", "config": {"Port":0,"TCPAddr":{"Endpoint":"0.0.0.0:13133"},"Path":"/","CheckCollectorPipeline":{"Enabled":false,"Interval":"5m","ExporterFailureThreshold":5}}}
2022-04-20T08:59:56.832-0400    info    extensions/extensions.go:42        Extension started.      {"kind": "extension", "name": "health_check"}
2022-04-20T08:59:56.832-0400    info    extensions/extensions.go:38        Extension is starting...  {"kind": "extension", "name": "pprof"}
2022-04-20T08:59:57.307-0400    info    pprofextension@v0.48.0/pprofextension.go:78        Starting net/http/pprof server     {"kind": "extension", "name": "pprof", "config": {"TCPAddr":{"Endpoint":"localhost:1777"},"BlockProfileFraction":0,"MutexProfileFraction":0,"SaveToFile":""}}
2022-04-20T08:59:57.307-0400    info    extensions/extensions.go:42        Extension started.      {"kind": "extension", "name": "pprof"}
2022-04-20T08:59:57.307-0400    info    extensions/extensions.go:38        Extension is starting...  {"kind": "extension", "name": "zpages"}
2022-04-20T08:59:57.307-0400    info    zpagesextension/zpagesextension.go:40      Register Host's zPages     {"kind": "extension", "name": "zpages"}
2022-04-20T08:59:57.772-0400    info    zpagesextension/zpagesextension.go:53      Starting zPages extension  {"kind": "extension", "name": "zpages", "config": {"TCPAddr":{"Endpoint":"localhost:55679"}}}
2022-04-20T08:59:57.772-0400    info    extensions/extensions.go:42        Extension started.      {"kind": "extension", "name": "zpages"}
2022-04-20T08:59:57.772-0400    info    service/service.go:87      Starting exporters...
2022-04-20T08:59:57.772-0400    info    builder/exporters_builder.go:40    Exporter is starting... {"kind": "exporter", "name": "logging"}
2022-04-20T08:59:57.772-0400    info    builder/exporters_builder.go:48    Exporter started.       {"kind": "exporter", "name": "logging"}
2022-04-20T08:59:57.772-0400    info    service/service.go:92      Starting processors...
2022-04-20T08:59:57.772-0400    info    builder/pipelines_builder.go:54    Pipeline is starting... {"kind": "pipeline", "name": "traces"}
2022-04-20T08:59:57.772-0400    info    builder/pipelines_builder.go:65    Pipeline is started.    {"kind": "pipeline", "name": "traces"}
2022-04-20T08:59:57.772-0400    info    service/service.go:97      Starting receivers...
2022-04-20T08:59:57.772-0400    info    builder/receivers_builder.go:68    Receiver is starting... {"kind": "receiver", "name": "otlp"}
2022-04-20T08:59:57.772-0400    info    otlpreceiver/otlp.go:69    Starting GRPC server on endpoint 0.0.0.0:4317      {"kind": "receiver", "name": "otlp"}
2022-04-20T08:59:57.772-0400    info    otlpreceiver/otlp.go:87    Starting HTTP server on endpoint 0.0.0.0:4318      {"kind": "receiver", "name": "otlp"}
2022-04-20T08:59:57.772-0400    info    builder/receivers_builder.go:73    Receiver started.       {"kind": "receiver", "name": "otlp"}
2022-04-20T08:59:57.772-0400    info    healthcheck/handler.go:129 Health Check state change       {"kind": "extension", "name": "health_check", "status": "ready"}
2022-04-20T08:59:57.772-0400    info    service/telemetry.go:109   Setting up own telemetry...
2022-04-20T08:59:57.773-0400    info    service/telemetry.go:129   Serving Prometheus metrics      {"address": ":8888", "level": "basic", "service.instance.id": "ce96f224-b5ec-4242-87ff-f32cd5775140", "service.version": "latest"}
2022-04-20T08:59:57.773-0400    info    service/collector.go:252   Starting otelcol-contrib...     {"Version": "0.48.0", "NumCPU": 12}
2022-04-20T08:59:57.773-0400    info    service/collector.go:142   Everything is ready. Begin running and processing data.
  2. Run the following Python script to send a span to the OTEL Collector over OTLP (an explicit flush/shutdown variant is sketched after the script):
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter
from importlib.metadata import version

resource = Resource(attributes={
    "service.name": "service"
})

trace.set_tracer_provider(TracerProvider(resource=resource))
tracer = trace.get_tracer(__name__)

# Export spans both to the collector (OTLP over gRPC) and to the console.
otlp_exporter = OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True)
console_exporter = ConsoleSpanExporter()
# Batch processors queue spans and export them asynchronously.
span_processor = BatchSpanProcessor(otlp_exporter)
console_processor = BatchSpanProcessor(console_exporter)

trace.get_tracer_provider().add_span_processor(span_processor)
trace.get_tracer_provider().add_span_processor(console_processor)

otel_api_version = version('opentelemetry-api')
otel_sdk_version = version('opentelemetry-sdk')
otel_otlp_exporter_version = version('opentelemetry-exporter-otlp')

print(
    f"API Version: {otel_api_version}\nSDK Version: {otel_sdk_version}\nOTLP Exporter Version: {otel_otlp_exporter_version}"
)
with tracer.start_as_current_span(f"API-{otel_api_version}|SDK-{otel_sdk_version}|OTLP-{otel_otlp_exporter_version}"):
    print("Hello world!")

What is the expected behavior?
Spans should be logged to the console, logged by the OTEL Contrib Collector, and written to spans.json.
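
As a quick way to inspect that last output: the collector's file exporter writes line-delimited JSON, so a short reader works (a sketch; assumes spans.json sits in the working directory):

import json

# Each line written by the collector's file exporter is one JSON-encoded
# batch of ResourceSpans; pretty-print each batch for inspection.
with open("spans.json") as f:
    for line in f:
        print(json.dumps(json.loads(line), indent=2))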

What is the actual behavior?
Spans are neither logged by the collector nor written to spans.json when using API/SDK version 1.11.0.
Instead, the collector logs: 2022-04-20T08:56:13.103-0400 warn zapgrpc/zapgrpc.go:191 [transport] transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:4317->127.0.0.1:50009: read: connection reset by peer {"grpc_log": true}
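
Note the reset happens mid-stream ("HandleStreams failed to read frame"), not at connect time, so the port itself is reachable. A minimal connectivity check confirms that (a sketch using grpcio, which opentelemetry-exporter-otlp already depends on):

import grpc

# Confirm the collector's OTLP/gRPC endpoint accepts connections at all;
# channel_ready_future raises grpc.FutureTimeoutError if the channel never
# becomes ready within the timeout.
channel = grpc.insecure_channel("localhost:4317")
grpc.channel_ready_future(channel).result(timeout=5)
print("gRPC channel to localhost:4317 is ready")
channel.close()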

Additional context
When using API/SDK version 1.10.0, traces are logged and exported properly:

otel-test % cat requirements.txt
opentelemetry-api==1.10.0
opentelemetry-sdk==1.10.0
opentelemetry-exporter-otlp==1.10.0

otel-test % python trace.py
API Version: 1.10.0
SDK Version: 1.10.0
OTLP Exporter Version: 1.10.0
Hello world!
{
    "name": "API-1.10.0|SDK-1.10.0|OTLP-1.10.0",
    "context": {
        "trace_id": "0xa517e145ad45888fe8af4c85476c6ee0",
        "span_id": "0xca68dc1a0f6d9031",
        "trace_state": "[]"
    },
    "kind": "SpanKind.INTERNAL",
    "parent_id": null,
    "start_time": "2022-04-20T12:43:49.850101Z",
    "end_time": "2022-04-20T12:43:49.850148Z",
    "status": {
        "status_code": "UNSET"
    },
    "attributes": {},
    "events": [],
    "links": [],
    "resource": {
        "service.name": "service"
    }
}

# From OTEL Collector stdout
2022-04-20T08:43:50.223-0400    INFO    loggingexporter/logging_exporter.go:40     TracesExporter  {"#spans": 1}
2022-04-20T08:43:50.224-0400    DEBUG   loggingexporter/logging_exporter.go:49     ResourceSpans #0
Resource SchemaURL: 
Resource labels:
     -> service.name: STRING(service)
ScopeSpans #0
ScopeSpans SchemaURL: 
InstrumentationScope __main__ 
Span #0
    Trace ID       : a517e145ad45888fe8af4c85476c6ee0
    Parent ID      : 
    ID             : ca68dc1a0f6d9031
    Name           : API-1.10.0|SDK-1.10.0|OTLP-1.10.0
    Kind           : SPAN_KIND_INTERNAL
    Start time     : 2022-04-20 12:43:49.850101 +0000 UTC
    End time       : 2022-04-20 12:43:49.850148 +0000 UTC
    Status code    : STATUS_CODE_UNSET
    Status message : 

When using 1.11.0, console spans print but nothing is exported by the collector:

otel-test % cat requirements.txt
opentelemetry-api==1.11.0
opentelemetry-sdk==1.11.0
opentelemetry-exporter-otlp==1.11.0

otel-test % python trace.py
API Version: 1.11.0
SDK Version: 1.11.0
OTLP Exporter Version: 1.11.0
Hello world!
{
    "name": "API-1.11.0|SDK-1.11.0|OTLP-1.11.0",
    "context": {
        "trace_id": "0x402309114d73ad574c5859fb094dec07",
        "span_id": "0x1013daa861ebb08f",
        "trace_state": "[]"
    },
    "kind": "SpanKind.INTERNAL",
    "parent_id": null,
    "start_time": "2022-04-20T12:42:49.977595Z",
    "end_time": "2022-04-20T12:42:49.977654Z",
    "status": {
        "status_code": "UNSET"
    },
    "attributes": {},
    "events": [],
    "links": [],
    "resource": {
        "service.name": "service"
    }
}

# From OTEL Collector stdout
2022-04-20T08:42:50.274-0400    warn    zapgrpc/zapgrpc.go:191     [transport] transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:4317->127.0.0.1:49675: read: connection reset by peer  {"grpc_log": true}

When using the main branch (installed from local checkouts):

otel-test % cat requirements.txt
./opentelemetry-python/opentelemetry-api
./opentelemetry-python/opentelemetry-sdk
./opentelemetry-python/exporter/opentelemetry-exporter-otlp

otel-test % python trace.py
API Version: 1.11.0
SDK Version: 1.11.0
OTLP Exporter Version: 1.11.0
Hello world!
{
    "name": "API-1.11.0|SDK-1.11.0|OTLP-1.11.0",
    "context": {
        "trace_id": "0x9d727d7a5ad07cb52a19a9ab44357782",
        "span_id": "0x7d67888d9b88188d",
        "trace_state": "[]"
    },
    "kind": "SpanKind.INTERNAL",
    "parent_id": null,
    "start_time": "2022-04-20T12:56:12.815962Z",
    "end_time": "2022-04-20T12:56:12.816006Z",
    "status": {
        "status_code": "UNSET"
    },
    "attributes": {},
    "events": [],
    "links": [],
    "resource": {
        "service.name": "service"
    }
}

# From OTEL Collector stdout
2022-04-20T08:56:13.103-0400    warn    zapgrpc/zapgrpc.go:191     [transport] transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:4317->127.0.0.1:50009: read: connection reset by peer  {"grpc_log": true}
mjmorales added the bug label on Apr 20, 2022
@srikanthccv
Member

I believe this was an issue with the collector, fixed in open-telemetry/opentelemetry-collector#5188.

// @codeboten

@codeboten
Contributor

Yes, this was fixed in the collector and is part of the 0.49.0 release.

@mjmorales
Author

Awesome, thanks for the reply and fix. I'll close this issue and test again against the next contrib release when it's ready.
