[Bug]: Appender-tracing crashes uptrace opentelemetry logs collector with logs produced during log transport #1684

Closed
QAston opened this issue Apr 28, 2024 · 4 comments
Labels
bug Something isn't working triage:todo Needs to be triaged.

Comments


QAston commented Apr 28, 2024

What happened?

Not sure if this is a bug in Uptrace or in the opentelemetry-rust tracing appender.

When the tracing appender is used with the gRPC transport and the debug log level is enabled, hyper, tonic, and the libraries they use produce their own debug logs. If opentelemetry-appender-tracing is enabled and configured to send to Uptrace (which is built on the OpenTelemetry Go SDK), these logs crash the Uptrace OpenTelemetry collector. Removing the log appender fixes the issue, as does filtering out the transport debug logs with this env filter (see the sketch below): "debug,hyper=info,h2=info,tower=info,tonic=info".

Normal application debug logs work just fine, and there are no issues when using tracing-opentelemetry to upload traces.
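For reference, a minimal sketch of the env-filter workaround described above. It assumes the 0.22-era opentelemetry-appender-tracing API (OpenTelemetryTracingBridge) together with tracing-subscriber 0.3 with the env-filter feature; exact type and constructor names may differ between versions. The filter keeps application DEBUG events but caps the transport crates at INFO, so their per-frame events are never fed back into the OTLP log exporter.

use opentelemetry_appender_tracing::layer::OpenTelemetryTracingBridge;
use opentelemetry_sdk::logs::LoggerProvider;
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt, EnvFilter};

fn init_logging(logger_provider: &LoggerProvider) {
    // Keep DEBUG for the application, but cap the HTTP/gRPC transport crates
    // at INFO so their events do not re-enter the gRPC export path.
    let filter = EnvFilter::new("debug,hyper=info,h2=info,tower=info,tonic=info");

    // Bridge `tracing` events into the OpenTelemetry logs pipeline.
    let otel_layer = OpenTelemetryTracingBridge::new(logger_provider);

    tracing_subscriber::registry()
        .with(filter)
        .with(otel_layer)
        .init();
}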

API Version

0.22

SDK Version

0.22

What Exporters are you seeing the problem on?

OTLP

Relevant log output

Log output on the opentelemetry-rust side:


  2024-04-28T04:12:57.190704Z DEBUG h2::codec::framed_write: send, frame: Ping { ack: true, payload: [2, 4, 16, 16, 9, 14, 7, 7] }
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/h2-0.3.26/src/codec/framed_write.rs:213
    in h2::proto::connection::Connection with peer: Client

  2024-04-28T04:12:57.190751Z DEBUG h2::codec::framed_write: send, frame: Data { stream_id: StreamId(1), flags: (0x1: END_STREAM) }
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/h2-0.3.26/src/codec/framed_write.rs:213
    in h2::proto::connection::Connection with peer: Client

  2024-04-28T04:12:57.190870Z DEBUG h2::codec::framed_read: received, frame: WindowUpdate { stream_id: StreamId(0), size_increment: 11910 }
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/h2-0.3.26/src/codec/framed_read.rs:405
    in h2::proto::connection::Connection with peer: Client

  2024-04-28T04:12:57.190926Z DEBUG h2::codec::framed_read: received, frame: Ping { ack: false, payload: [2, 4, 16, 16, 9, 14, 7, 7] }
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/h2-0.3.26/src/codec/framed_read.rs:405
    in h2::proto::connection::Connection with peer: Client

  2024-04-28T04:12:57.190964Z DEBUG h2::codec::framed_write: send, frame: Ping { ack: true, payload: [2, 4, 16, 16, 9, 14, 7, 7] }
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/h2-0.3.26/src/codec/framed_write.rs:213
    in h2::proto::connection::Connection with peer: Client

OpenTelemetry log error occurred. Exporter otlp encountered the following errors: the grpc server returns error (Unknown error): , detailed error message: transport error
  2024-04-28T04:12:57.197313Z DEBUG hyper::client::service: connection error: hyper::Error(Io, Custom { kind: BrokenPipe, error: "connection closed because of a broken pipe" })
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/hyper-0.14.28/src/client/service.rs:79

  2024-04-28T04:12:57.197313Z DEBUG hyper::proto::h2::client: client response error: stream closed because of a broken pipe
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/hyper-0.14.28/src/proto/h2/client.rs:326

  2024-04-28T04:12:57.197616Z DEBUG hyper::client::service: connection error: hyper::Error(Io, Custom { kind: BrokenPipe, error: "connection closed because of a broken pipe" })
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/hyper-0.14.28/src/client/service.rs:79

  2024-04-28T04:12:57.205515Z DEBUG hyper_util::client::legacy::pool: pooling idle connection for ("https", hopper.prod01.europe-west4.prod.hopper.chorus1.net)
    at /usr/local/cargo/registry/src/index.crates.io-6f17d22bba15001f/hyper-util-0.1.3/src/client/legacy/pool.rs:397

Failing log on the uptrace side:

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x28 pc=0xfbbdd0]

goroutine 135 [running]:
github.com/uptrace/uptrace/pkg/tracing.(*otlpLogProcessor).processLogRecord(0xc003c51568, 0xc003c51538, 0xc003c31c30)
	github.com/uptrace/uptrace/pkg/tracing/otlp_logs.go:204 +0x3f0
github.com/uptrace/uptrace/pkg/tracing.(*LogsServiceServer).export(0xc000892280, {0x3610a20, 0xc003c05ad0}, {0xc003c6c700, 0x1b, 0xc00068ccd0?}, 0xc003d82e10)
	github.com/uptrace/uptrace/pkg/tracing/otlp_logs.go:167 +0x45b
github.com/uptrace/uptrace/pkg/tracing.(*LogsServiceServer).Export(0xc000892280, {0x3610a20, 0xc003c05ad0}, 0xc003c2a0c0)
	github.com/uptrace/uptrace/pkg/tracing/otlp_logs.go:136 +0xaf
go.opentelemetry.io/proto/otlp/collector/logs/v1._LogsService_Export_Handler.func1({0x3610a20?, 0xc003c05ad0?}, {0x1893e60?, 0xc003c2a0c0?})
	go.opentelemetry.io/proto/otlp@v1.1.0/collector/logs/v1/logs_service_grpc.pb.go:90 +0xcb
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1({0x3610a20, 0xc003c04540}, {0x1893e60, 0xc003c2a0c0}, 0xc003c0cb80, 0xc003c4a948)
	go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.48.0/interceptor.go:326 +0x5a4
go.opentelemetry.io/proto/otlp/collector/logs/v1._LogsService_Export_Handler({0x19bb080, 0xc000892280}, {0x3610a20, 0xc003c04540}, 0xc003c02100, 0xc000786820)
	go.opentelemetry.io/proto/otlp@v1.1.0/collector/logs/v1/logs_service_grpc.pb.go:92 +0x143
google.golang.org/grpc.(*Server).processUnaryRPC(0xc00063ec00, {0x3610a20, 0xc003a98840}, {0x361a880, 0xc003ac4680}, 0xc003aae360, 0xc000896720, 0x42e85c0, 0x0)
	google.golang.org/grpc@v1.61.1/server.go:1385 +0xdd1
google.golang.org/grpc.(*Server).handleStream(0xc00063ec00, {0x361a880, 0xc003ac4680}, 0xc003aae360)
	google.golang.org/grpc@v1.61.1/server.go:1796 +0xfb8
google.golang.org/grpc.(*Server).serveStreams.func2.1()
	google.golang.org/grpc@v1.61.1/server.go:1029 +0x8b
created by google.golang.org/grpc.(*Server).serveStreams.func2 in goroutine 134
	google.golang.org/grpc@v1.61.1/server.go:1040 +0x125
+ '[' 0 -eq 0 ]
QAston added the bug and triage:todo labels on Apr 28, 2024
@cijothomas
Member

Is this the same issue as #1171? Suppressing the events is the only option, sacrificing visibility into them :(

@QAston
Author

QAston commented Apr 28, 2024

The linked issues are likely related to this one, yes.

@lalitb
Member

lalitb commented Apr 29, 2024

Should we close this issue in favor of #1171? Not sure if we need multiple issues for the same problem.

@cijothomas
Member

> Should we close this issue in favor of #1171? Not sure if we need multiple issues for the same problem.

Agree!
