
Zipkin V1 Receiver + V2 Exporter creates empty annotations #960

Closed
chris-smith-zocdoc opened this issue May 13, 2020 · 1 comment · Fixed by #993
Labels
bug Something isn't working help wanted Good issue for contributors to OpenTelemetry Service to pick up
chris-smith-zocdoc (Contributor) commented:
In the JSON Zipkin v1 format, the span kind is inferred from the annotations ("sr"/"ss" imply kind SERVER). The inference works correctly, but when combined with a v2 exporter it ends up creating empty annotations that contain only a timestamp and no other information. I believe these should be removed.

Example v1 span

[
    {
        "id": "cfa93400b6c64114",
        "name": "GET",
        "annotations": [
            {
                "timestamp": 1589329047638966,
                "value": "sr",
                "endpoint": {
                    "ipv4": "172.29.0.3",
                    "port": 0,
                    "serviceName": "my-service"
                }
            },
            {
                "timestamp": 1589329047643097,
                "value": "ss",
                "endpoint": {
                    "ipv4": "172.29.0.3",
                    "port": 0,
                    "serviceName": "my-service"
                }
            }
        ],
        "binaryAnnotations": [
            {
                "key": "http.host",
                "value": "localhost:5000",
                "endpoint": {
                    "ipv4": "172.29.0.3",
                    "port": 0,
                    "serviceName": "my-service"
                }
            },
            {
                "key": "http.uri",
                "value": "http://localhost:5000/healthcheck",
                "endpoint": {
                    "ipv4": "172.29.0.3",
                    "port": 0,
                    "serviceName": "my-service"
                }
            },
            {
                "key": "http.path",
                "value": "/healthcheck",
                "endpoint": {
                    "ipv4": "172.29.0.3",
                    "port": 0,
                    "serviceName": "my-service"
                }
            }
        ],
        "debug": false,
        "traceId": "778fff36dc3e06cd",
        "timestamp": 1589329047638966,
        "duration": 4130
    }
]
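The inference described above maps the Zipkin v1 "core" annotation values to a span kind. A minimal sketch of that convention (the function name and structure here are illustrative, not the collector's actual code):

```go
package main

import "fmt"

// inferSpanKind mirrors the Zipkin v1 convention: "sr" (server receive) and
// "ss" (server send) imply a SERVER span, while "cs" (client send) and
// "cr" (client receive) imply a CLIENT span.
func inferSpanKind(annotationValues []string) string {
	for _, v := range annotationValues {
		switch v {
		case "sr", "ss":
			return "SERVER"
		case "cs", "cr":
			return "CLIENT"
		}
	}
	return "UNSPECIFIED"
}

func main() {
	// The example v1 span above carries "sr" and "ss" annotations.
	fmt.Println(inferSpanKind([]string{"sr", "ss"})) // prints SERVER
}
```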

I expect this to produce a span that looks like this

[
    {
        "traceId": "778fff36dc3e06cd",
        "id": "cfa93400b6c64114",
        "kind": "SERVER",
        "name": "get",
        "timestamp": 1589329047638966,
        "duration": 4130,
        "localEndpoint": {
            "serviceName": "my-service",
            "ipv4": "172.29.0.3"
        },
        "tags": {
            "http.host": "localhost:5000",
            "http.path": "/healthcheck",
            "http.uri": "http://localhost:5000/healthcheck"
        }
    }
]

But instead it produces this

[
    {
        "traceId": "690dba9272ff7f43",
        "id": "1691fe50b928203c",
        "kind": "SERVER",
        "name": "get",
        "timestamp": 1589329265584397,
        "duration": 10734,
        "localEndpoint": {
            "serviceName": "my-service",
            "ipv4": "172.29.0.3"
        },
        "annotations": [
            {
                "timestamp": 1589329265584397,
                "value": ""
            },
            {
                "timestamp": 1589329265595132,
                "value": ""
            }
        ],
        "tags": {
            "http.host": "localhost:5000",
            "http.path": "/healthcheck",
            "http.uri": "http://localhost:5000/healthcheck"
        }
    }
]

docker-compose.yml

version: "2"
services:
  otel-collector:
    image: otelcontribcol:latest
    command: ["--config=/etc/otel-collector-config.yaml", "--mem-ballast-size-mib", "4000"]
    volumes:
      - ./otel-collector-config.yaml:/etc/otel-collector-config.yaml
    ports:
      - "8080:9411" # zipkin
    depends_on:
      - zipkin-all-in-one

  zipkin-all-in-one:
    image: openzipkin/zipkin:latest
    ports:
      - "9411:9411"

otel-collector-config.yaml

receivers:
  zipkin:
    endpoint: :9411

exporters:
  zipkin:
    url: "http://zipkin-all-in-one:9411/api/v2/spans"

processors:
  queued_retry:

service:
  pipelines:
    traces:
      receivers: [zipkin]
      processors: [queued_retry]
      exporters: [zipkin]
chris-smith-zocdoc added a commit to Zocdoc/opentelemetry-collector that referenced this issue May 13, 2020
@tigrannajaryan tigrannajaryan added the help wanted Good issue for contributors to OpenTelemetry Service to pick up label May 13, 2020
@flands flands added the bug Something isn't working label May 13, 2020
@flands flands added this to To do in Triaged via automation May 13, 2020
chris-smith-zocdoc (Contributor, author) commented:
One solution I came up with is to remove these kind annotations during parseZipkinV1Annotations. Does this seem like a good solution? Zocdoc@0aa99de

If we agree on an approach, I'd be happy to send a PR.

chris-smith-zocdoc added a commit to Zocdoc/opentelemetry-collector that referenced this issue May 19, 2020
Triaged automation moved this from To do to Done Jun 2, 2020
bogdandrutu pushed a commit that referenced this issue Jun 2, 2020
* Remove extra send/receive annotations when using zipkin v1
Fixes #960

* Update test to use assert.EqualValues
3 participants