
ASGI Instrumentation ValueError: too many values to unpack (expected 2) during JaegerPropagator #1487

Closed
XuhuiSun95 opened this issue Dec 12, 2022 · 1 comment · Fixed by #1435
Labels
bug Something isn't working

Comments

@XuhuiSun95

Describe your environment

  • python:3.11-slim docker image
  • OpenTelemetry packages:
opentelemetry-api                           1.14.0
opentelemetry-exporter-jaeger-thrift        1.14.0
opentelemetry-instrumentation               0.35b0
opentelemetry-instrumentation-asgi          0.35b0
opentelemetry-instrumentation-fastapi       0.35b0
opentelemetry-propagator-jaeger             1.14.0
opentelemetry-sdk                           1.14.0
opentelemetry-semantic-conventions          0.35b0
opentelemetry-util-http                     0.35b0

Steps to reproduce

  1. Set up an upstream application that creates the Jaeger header on the original request (example: Traefik)
  2. Set up JaegerPropagator and Instrumentation on FastAPI

Example Traefik and service config:

  traefik:
    image: traefik:2.6
    restart: always
    command:
      --api.dashboard=true
      --providers.docker=true
      --providers.docker.exposedbydefault=false
      --entrypoints.web.address=:80
      --entrypoints.web.http.redirections.entryPoint.to=websecure
      --entrypoints.web.http.redirections.entryPoint.scheme=https
      --entrypoints.websecure.address=:443
      --accesslog=true
      --tracing.jaeger=true
      --tracing.jaeger.gen128Bit=true
      --tracing.jaeger.samplingServerURL=http://jaeger:5778/sampling
      --tracing.jaeger.samplingType=const
      --tracing.jaeger.samplingParam=1.0
      --tracing.jaeger.localAgentHostPort=jaeger:6831
    ports:
      - 80:80
      - 443:443
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
    labels:
      - traefik.enable=true
      - traefik.http.routers.dashboard.rule=Host(`traefik.docker.localhost`) && (PathPrefix(`/api/`) || PathPrefix(`/dashboard/`))
      - traefik.http.routers.dashboard.tls=true
      - traefik.http.routers.dashboard.service=api@internal
      
  svc:
    build:
      context: .
    command: tail -f /dev/null
    restart: always
    volumes:
      - .:/usr/src
    environment:
      - SQLALCHEMY_DEBUG_ECHO=true
      - OTEL_RESOURCE_ATTRIBUTES=service.name=svc
      - OTEL_EXPORTER_JAEGER_AGENT_HOST=jaeger
      - OTEL_EXPORTER_JAEGER_AGENT_PORT=6831
      - OTEL_TRACES_SAMPLER=parentbased_always_on
      # - OTEL_TRACES_SAMPLER=parentbased_traceidratio
      # - OTEL_TRACES_SAMPLER_ARG=1.0
      - OTEL_INSTRUMENTATION_HTTP_CAPTURE_HEADERS_SERVER_REQUEST=x-forwarded-for,x-real-ip
    expose:
      - 80
    labels:
      - traefik.enable=true
      - traefik.http.routers.svc.rule=Host(`api.docker.localhost`) && PathPrefix(`/`)
      - traefik.http.routers.svc.tls=true
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost/health"]
      interval: 10s
      timeout: 10s
      retries: 3
      start_period: 10s

import fastapi
from opentelemetry.instrumentation.fastapi import FastAPIInstrumentor
from opentelemetry import trace
from opentelemetry.exporter.jaeger.thrift import JaegerExporter
from opentelemetry.propagate import set_global_textmap
from opentelemetry.propagators.jaeger import JaegerPropagator
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# Propagate context using the Jaeger header format
set_global_textmap(JaegerPropagator())

# Export spans to the Jaeger agent over Thrift
provider = TracerProvider()
span_processor = BatchSpanProcessor(JaegerExporter())
provider.add_span_processor(span_processor)
trace.set_tracer_provider(provider)

app = fastapi.FastAPI()


@app.get("/health")
async def health():
    return {"message": "hello world"}


FastAPIInstrumentor.instrument_app(app)

What is the expected behavior?
If the trace is already propagated with the Jaeger header (for example, from Traefik acting as the reverse proxy with tracing enabled), the FastAPI instrumentation should still continue the trace successfully.
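For context, the Jaeger propagator carries trace context in the `uber-trace-id` header, formatted as `{trace-id}:{span-id}:{parent-span-id}:{flags}`. A quick sketch with an illustrative (made-up) header value:

```python
# Illustrative only: the header format an upstream proxy such as Traefik
# sends and the JaegerPropagator is expected to extract.
header = "0af7651916cd43dd8448eb211c80319c:b7ad6b7169203331:0:1"

trace_id, span_id, parent_span_id, flags = header.split(":")
assert len(trace_id) == 32  # 128-bit trace id as hex (gen128Bit=true above)
print(trace_id, span_id, flags)
```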

What is the actual behavior?
Now even the default Swagger pages raise an exception on opening (of course, I reverse-proxy both the API and the docs).

  • any request arriving with the Jaeger header fails as shown below
  • any request without a prior Jaeger header is still fine (the ASGI getter has no issue processing empty headers)
INFO:     172.20.0.7:55510 - "GET /docs HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 419, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/fastapi/applications.py", line 270, in __call__
    await super().__call__(scope, receive, send)
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/starlette/applications.py", line 124, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/opentelemetry/instrumentation/asgi/__init__.py", line 531, in __call__
    span, token = _start_internal_or_server_span(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/opentelemetry/instrumentation/utils.py", line 111, in _start_internal_or_server_span
    ctx = extract(context_carrier, getter=context_getter)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/opentelemetry/propagate/__init__.py", line 102, in extract
    return get_global_textmap().extract(carrier, context, getter=getter)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/opentelemetry/propagators/jaeger/__init__.py", line 54, in extract
    context = self._extract_baggage(getter, carrier, context)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/opentelemetry/propagators/jaeger/__init__.py", line 119, in _extract_baggage
    for key in getter.keys(carrier)
               ^^^^^^^^^^^^^^^^^^^^
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/opentelemetry/instrumentation/asgi/__init__.py", line 263, in keys
    return [_key.decode("utf8") for (_key, _value) in carrier]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/ssp-py-sdk/.local/lib/python3.11/site-packages/opentelemetry/instrumentation/asgi/__init__.py", line 263, in <listcomp>
    return [_key.decode("utf8") for (_key, _value) in carrier]
                                    ^^^^^^^^^^^^^^

Additional context
I believe the issue comes from this specific line https://github.com/open-telemetry/opentelemetry-python-contrib/blob/main/instrumentation/opentelemetry-instrumentation-asgi/src/opentelemetry/instrumentation/asgi/__init__.py#L263, and I can confirm that the older version 0.34b0, with this implementation https://github.com/open-telemetry/opentelemetry-python-contrib/blob/v0.34b0/instrumentation/opentelemetry-instrumentation-asgi/src/opentelemetry/instrumentation/asgi/__init__.py#L217, has no issue.
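A minimal sketch of the failure mode (a simplified stand-in, not the actual library code): the `keys()` implementation on the referenced line iterates the carrier itself as `(key, value)` pairs, but the propagator hands it the whole ASGI scope dict, so iteration yields plain string keys like `"type"` instead of the `(name, value)` byte tuples stored in `scope["headers"]`:

```python
def keys_0_35b0(carrier):
    # mirrors the list comprehension on the line referenced above
    return [_key.decode("utf8") for (_key, _value) in carrier]

# a minimal ASGI scope, as uvicorn would pass it to the middleware
scope = {"type": "http", "headers": [(b"uber-trace-id", b"abc:def:0:1")]}

try:
    keys_0_35b0(scope)
except ValueError as exc:
    # iterating the dict yields "type", and unpacking that 4-char string
    # into (_key, _value) raises the error from the traceback
    print(exc)  # -> too many values to unpack (expected 2)
```

This also explains why only requests going through baggage extraction hit the error: `keys()` is only reached from `_extract_baggage`.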

@XuhuiSun95 XuhuiSun95 added the bug Something isn't working label Dec 12, 2022
@kartoch

kartoch commented Dec 14, 2022

MR in progress: #1435
