$ kubectl logs kuma-tcp-echo-b968575b8-dkn5x -n kuma-app -c kuma-sidecar
2019-09-25T02:13:23.670Z INFO Skipping reading config from file
2019-09-25T02:13:23.671Z INFO kuma-dp.run effective configuration {"config": "controlPlane:\n bootstrapServer:\n url: http://kuma-control-plane.kuma-system:5682\ndataplane:\n mesh: default\n name: kuma-tcp-echo-b968575b8-dkn5x.kuma-app\n adminPort: 9901\ndataplaneRuntime:\n binaryPath: envoy\n configDir: /tmp/kuma.io/envoy\n"}
2019-09-25T02:13:23.671Z INFO kuma-dp.run starting Dataplane (Envoy) ...
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:238] initializing epoch 0 (hot restart version=11.104)
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:240] statically linked extensions:
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:242] access_loggers: envoy.file_access_log,envoy.http_grpc_access_log
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:245] filters.http: envoy.buffer,envoy.cors,envoy.csrf,envoy.ext_authz,envoy.fault,envoy.filters.http.dynamic_forward_proxy,envoy.filters.http.grpc_http1_reverse_bridge,envoy.filters.http.header_to_metadata,envoy.filters.http.jwt_authn,envoy.filters.http.original_src,envoy.filters.http.rbac,envoy.filters.http.tap,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:248] filters.listener: envoy.listener.original_dst,envoy.listener.original_src,envoy.listener.proxy_protocol,envoy.listener.tls_inspector
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:251] filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.filters.network.dubbo_proxy,envoy.filters.network.mysql_proxy,envoy.filters.network.rbac,envoy.filters.network.sni_cluster,envoy.filters.network.thrift_proxy,envoy.filters.network.zookeeper_proxy,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:253] stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.stat_sinks.hystrix,envoy.statsd
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:255] tracers: envoy.dynamic.ot,envoy.lightstep,envoy.tracers.datadog,envoy.tracers.opencensus,envoy.zipkin
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:258] transport_sockets.downstream: envoy.transport_sockets.alts,envoy.transport_sockets.tap,raw_buffer,tls
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:261] transport_sockets.upstream: envoy.transport_sockets.alts,envoy.transport_sockets.tap,raw_buffer,tls
[2019-09-25 02:13:25.071][15][info][main] [source/server/server.cc:267] buffer implementation: old (libevent)
[2019-09-25 02:13:25.271][15][info][main] [source/server/server.cc:322] admin address: 127.0.0.1:9901
[2019-09-25 02:13:25.409][15][info][main] [source/server/server.cc:432] runtime: layers:
static_layer:
{}
admin_layer:
{}
[2019-09-25 02:13:25.409][15][warning][runtime] [source/common/runtime/runtime_impl.cc:497] Skipping unsupported runtime layer: name: "base"
static_layer {
}
[2019-09-25 02:13:25.468][15][info][config] [source/server/configuration_impl.cc:61] loading 0 static secret(s)
[2019-09-25 02:13:25.468][15][info][config] [source/server/configuration_impl.cc:67] loading 1 cluster(s)
[2019-09-25 02:13:25.471][15][warning][config] [bazel-out/k8-opt/bin/source/common/config/_virtual_includes/grpc_stream_lib/common/config/grpc_stream.h:87] gRPC config stream closed: 14, no healthy upstream
[2019-09-25 02:13:25.471][15][warning][config] [bazel-out/k8-opt/bin/source/common/config/_virtual_includes/grpc_stream_lib/common/config/grpc_stream.h:50] Unable to establish new stream
[2019-09-25 02:13:25.471][15][info][config] [source/server/configuration_impl.cc:71] loading 0 listener(s)
[2019-09-25 02:13:25.471][15][info][config] [source/server/configuration_impl.cc:96] loading tracing configuration
[2019-09-25 02:13:25.471][15][info][config] [source/server/configuration_impl.cc:116] loading stats sink configuration
[2019-09-25 02:13:25.471][15][info][main] [source/server/server.cc:516] starting main dispatch loop
[2019-09-25 02:13:25.571][15][info][upstream] [source/common/upstream/cluster_manager_impl.cc:144] cm init: initializing cds
[2019-09-25 02:13:25.618][15][info][upstream] [source/common/upstream/cluster_manager_impl.cc:489] add/update cluster localhost:8000 during init
[2019-09-25 02:13:25.670][15][info][upstream] [source/common/upstream/cluster_manager_impl.cc:489] add/update cluster kuma-tcp-echo.kuma-app.svc:8000 during init
[2019-09-25 02:13:25.673][15][info][upstream] [source/common/upstream/cluster_manager_impl.cc:489] add/update cluster theshark.kuma-app.svc:3000 during init
[2019-09-25 02:13:25.771][15][info][upstream] [source/common/upstream/cluster_manager_impl.cc:489] add/update cluster thewhale.kuma-app.svc:3000 during init
[2019-09-25 02:13:25.772][15][info][upstream] [source/common/upstream/cluster_manager_impl.cc:489] add/update cluster pass_through during init
[2019-09-25 02:13:25.772][15][info][upstream] [source/common/upstream/cluster_manager_impl.cc:124] cm init: initializing secondary clusters
[2019-09-25 02:13:26.670][55][critical][main] [source/exe/terminate_handler.cc:13] std::terminate called! (possible uncaught exception, see trace)
[2019-09-25 02:13:26.670][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:69] Backtrace (use tools/stack_decode.py to get line numbers):
[2019-09-25 02:13:26.671][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:75] #0: [0x1192ec8]
[2019-09-25 02:13:26.671][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:75] #1: [0x1192dd9]
[2019-09-25 02:13:26.671][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:75] #2: [0x1309fa6]
[2019-09-25 02:13:26.769][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:81] Caught Aborted, suspect faulting address 0x162e0000000f
[2019-09-25 02:13:26.769][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:69] Backtrace (use tools/stack_decode.py to get line numbers):
[2019-09-25 02:13:26.769][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:75] #0: [0x7f4715a384b0]
[2019-09-25 02:13:26.769][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:75] #1: [0x1192dd9]
[2019-09-25 02:13:26.769][55][critical][backtrace] [bazel-out/k8-opt/bin/source/server/_virtual_includes/backtrace_lib/server/backtrace.h:75] #2: [0x1309fa6]
2019-09-25T02:13:27.873Z ERROR kuma-dp.run.envoy Envoy terminated with an error {"error": "signal: aborted"}
2019-09-25T02:13:27.873Z ERROR kuma-dp.run problem running Dataplane (Envoy) {"error": "signal: aborted"}