
Migrate filelog operators to follow opentelemetry-log-collection v0.29.0 changes #436

Merged — 9 commits, merged Apr 28, 2022
9 changes: 9 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,15 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

## Unreleased

### Changed

- Migrate filelog operators to follow opentelemetry-log-collection v0.29.0 changes ([#436](https://github.com/signalfx/splunk-otel-collector-chart/pull/436))
- [BREAKING CHANGE] Several breaking changes were made that affect the
  filelog, syslog, tcplog, udplog, and journald receivers. Any use of the
  [extraFileLogs](https://github.com/signalfx/splunk-otel-collector-chart/blob/941ad7f255cce585f4c06dd46c0cd63ef57d9903/helm-charts/splunk-otel-collector/values.yaml#L488) config, [logsCollection.containers.extraOperators](https://github.com/signalfx/splunk-otel-collector-chart/blob/941ad7f255cce585f4c06dd46c0cd63ef57d9903/helm-charts/splunk-otel-collector/values.yaml#L431) config,
  and affected receivers in a custom manner should be reviewed. See the
  [upgrading guidelines](https://github.com/open-telemetry/opentelemetry-log-collection/blob/v0.29.0/CHANGELOG.md#upgrading-to-v0290).

[Contributor comment] It may be a good time to add an unreleased section to https://github.com/signalfx/splunk-otel-collector-chart/blob/main/UPGRADING.md so we can associate some updated examples with these changes.

[Contributor comment] To move forward, I'd like to merge this PR as is. I'll open a new PR today to update UPGRADING.md.

[Contributor comment] A lot of this looks good; I'd suggest making these updates. We should also update the title of this PR to match (include opentelemetry-log-collection v0.29.0).

[Contributor comment] Today my team merged the code for the splunk-otel-collector v0.49.0 release. Let's update this, I'll approve it, and I'll merge it.

## [0.48.0] - 2022-04-13

### Changed
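For anyone carrying custom operators via `logsCollection.containers.extraOperators` or `extraFileLogs`, the field-reference migration looks roughly like this. This is an illustrative sketch: the attribute name `my.attr` is invented, and the shape follows the v0.29.0 upgrading guidelines rather than any config in this chart.

```yaml
# Before v0.29.0: entry fields were addressed with the $$ prefix
extraOperators:
  - type: add
    field: $$resource["my.attr"]          # hypothetical attribute
    value: EXPR($$.container_name)

# With v0.29.0: the entry is split into body, attributes, and resource,
# each referenced directly by name
extraOperators:
  - type: add
    field: resource["my.attr"]            # hypothetical attribute
    value: EXPR(body.container_name)
```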
6 changes: 3 additions & 3 deletions helm-charts/splunk-otel-collector/templates/_helpers.tpl
@@ -250,15 +250,15 @@ Create a filter expression for multiline logs configuration.
{{- $expr := "" }}
{{- if .namespaceName }}
{{- $useRegexp := eq (toString .namespaceName.useRegexp | default "false") "true" }}
{{- $expr = cat "($$resource[\"k8s.namespace.name\"])" (ternary "matches" "==" $useRegexp) (quote .namespaceName.value) "&&" }}
{{- $expr = cat "(resource[\"k8s.namespace.name\"])" (ternary "matches" "==" $useRegexp) (quote .namespaceName.value) "&&" }}
{{- end }}
{{- if .podName }}
{{- $useRegexp := eq (toString .podName.useRegexp | default "false") "true" }}
{{- $expr = cat $expr "($$resource[\"k8s.pod.name\"])" (ternary "matches" "==" $useRegexp) (quote .podName.value) "&&" }}
{{- $expr = cat $expr "(resource[\"k8s.pod.name\"])" (ternary "matches" "==" $useRegexp) (quote .podName.value) "&&" }}
{{- end }}
{{- if .containerName }}
{{- $useRegexp := eq (toString .containerName.useRegexp | default "false") "true" }}
{{- $expr = cat $expr "($$resource[\"k8s.container.name\"])" (ternary "matches" "==" $useRegexp) (quote .containerName.value) "&&" }}
{{- $expr = cat $expr "(resource[\"k8s.container.name\"])" (ternary "matches" "==" $useRegexp) (quote .containerName.value) "&&" }}
{{- end }}
{{- $expr | trimSuffix "&&" | trim }}
{{- end -}}
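To illustrate the helper's output after this change, here is a sketch with hypothetical values; the rendered expression simply drops the old `$$` prefix from the `resource[...]` references.

```yaml
# Hypothetical values.yaml entry feeding the helper above:
logsCollection:
  containers:
    multilineConfigs:
      - namespaceName:
          value: default
        containerName:
          value: my-app
          useRegexp: true
        firstEntryRegex: '^\d{4}-\d{2}-\d{2}'

# The helper would now render a filter expression like:
#   (resource["k8s.namespace.name"]) == "default" && (resource["k8s.container.name"]) matches "my-app"
```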
@@ -261,30 +261,30 @@ receivers:
id: get-format
routes:
- output: parser-docker
expr: '$$body matches "^\\{"'
expr: 'body matches "^\\{"'
- output: parser-crio
expr: '$$body matches "^[^ Z]+ "'
expr: 'body matches "^[^ Z]+ "'
- output: parser-containerd
expr: '$$body matches "^[^ Z]+Z"'
expr: 'body matches "^[^ Z]+Z"'
{{- end }}
{{- if or (not .Values.logsCollection.containers.containerRuntime) (eq .Values.logsCollection.containers.containerRuntime "cri-o") }}
# Parse CRI-O format
- type: regex_parser
id: parser-crio
regex: '^(?P<time>[^ Z]+) (?P<stream>stdout|stderr) (?P<logtag>[^ ]*) (?P<log>.*)$'
timestamp:
parse_from: time
parse_from: attributes.time
layout_type: gotime
layout: '2006-01-02T15:04:05.000000000-07:00'
- type: recombine
id: crio-recombine
combine_field: log
is_last_entry: "($$.logtag) == 'F'"
combine_field: body.log
is_last_entry: "(body.logtag) == 'F'"
- type: add
id: crio-handle_empty_log
output: filename
if: $$.log == nil
field: $$body.log
if: body.log == nil
field: body.log
value: ""
{{- end }}
{{- if or (not .Values.logsCollection.containers.containerRuntime) (eq .Values.logsCollection.containers.containerRuntime "containerd") }}
@@ -293,31 +293,32 @@ receivers:
id: parser-containerd
regex: '^(?P<time>[^ ^Z]+Z) (?P<stream>stdout|stderr) (?P<logtag>[^ ]*) (?P<log>.*)$'
timestamp:
parse_from: time
parse_from: attributes.time
layout: '%Y-%m-%dT%H:%M:%S.%LZ'
- type: recombine
id: containerd-recombine
combine_field: log
is_last_entry: "($$.logtag) == 'F'"
combine_field: body.log
is_last_entry: "(body.logtag) == 'F'"
- type: add
id: containerd-handle_empty_log
output: filename
if: $$.log == nil
field: $$body.log
if: body.log == nil
field: body.log
value: ""
{{- end }}
{{- if or (not .Values.logsCollection.containers.containerRuntime) (eq .Values.logsCollection.containers.containerRuntime "docker") }}
# Parse Docker format
- type: json_parser
id: parser-docker
parse_to: body
[jvoravong marked this conversation as resolved.]
timestamp:
parse_from: time
parse_from: body.time
layout: '%Y-%m-%dT%H:%M:%S.%LZ'
{{- end }}
- type: add
id: filename
field: $$resource["com.splunk.source"]
value: EXPR($$attributes["file.path"])
field: resource["com.splunk.source"]
value: EXPR(attributes["log.file.path"])
# Extract metadata from file path
- type: regex_parser
id: extract_metadata_from_filepath
@@ -326,29 +327,30 @@
{{- else }}
regex: '^\/var\/log\/pods\/(?P<namespace>[^_]+)_(?P<pod_name>[^_]+)_(?P<uid>[^\/]+)\/(?P<container_name>[^\._]+)\/(?P<restart_count>\d+)\.log$'
{{- end }}
parse_from: $$attributes["file.path"]
parse_to: body
parse_from: attributes["log.file.path"]
# Move out attributes to Attributes
- type: add
field: $$resource["k8s.pod.uid"]
value: EXPR($$.uid)
field: resource["k8s.pod.uid"]
value: EXPR(body.uid)
- type: add
field: $$resource["k8s.container.restart_count"]
value: EXPR($$.restart_count)
field: resource["k8s.container.restart_count"]
value: EXPR(body.restart_count)
- type: add
field: $$resource["k8s.container.name"]
value: EXPR($$.container_name)
field: resource["k8s.container.name"]
value: EXPR(body.container_name)
- type: add
field: $$resource["k8s.namespace.name"]
value: EXPR($$.namespace)
field: resource["k8s.namespace.name"]
value: EXPR(body.namespace)
- type: add
field: $$resource["k8s.pod.name"]
value: EXPR($$.pod_name)
field: resource["k8s.pod.name"]
value: EXPR(body.pod_name)
- type: add
field: $$resource["com.splunk.sourcetype"]
value: EXPR("kube:container:"+$$.container_name)
field: resource["com.splunk.sourcetype"]
value: EXPR("kube:container:"+body.container_name)
- type: add
field: $$attributes["log.iostream"]
value: EXPR($$.stream)
field: attributes["log.iostream"]
value: EXPR(body.stream)
{{- if .Values.logsCollection.containers.multilineConfigs }}
- type: router
routes:
@@ -361,9 +363,9 @@ receivers:
- type: recombine
id: {{ include "splunk-otel-collector.newlineKey" . | quote}}
output: clean-up-log-record
source_identifier: $$resource["com.splunk.source"]
combine_field: log
is_first_entry: '($$.log) matches {{ .firstEntryRegex | quote }}'
source_identifier: resource["com.splunk.source"]
combine_field: body.log
is_first_entry: '(body.log) matches {{ .firstEntryRegex | quote }}'
{{- end }}
{{- end }}
{{- with .Values.logsCollection.containers.extraOperators }}
@@ -372,8 +374,8 @@
# Clean up log record
- type: move
id: clean-up-log-record
from: $$body.log
to: $$
from: body.log
to: body
{{- end }}
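As a concrete illustration of the new `body`/`attributes` addressing, a containerd log line would flow through the pipeline above roughly like this. This is a sketch: the sample line, pod name, and path are invented, and intermediate shapes are approximate.

```yaml
# Raw line read from /var/log/pods/default_my-pod_<uid>/app/0.log (invented path):
#   2022-04-28T10:00:00.000000000Z stdout F hello world
#
# After parser-containerd, the captured groups land under body
# (previously addressed as $$.time, $$.stream, ...):
#   body:       { time: ..., stream: stdout, logtag: F, log: "hello world" }
#   attributes: { log.file.path: /var/log/pods/... }   # renamed from file.path
#
# extract_metadata_from_filepath then parses attributes["log.file.path"],
# and the add operators copy the captured values into resource attributes
# such as k8s.pod.name and com.splunk.sourcetype.
#
# Finally, the clean-up move operator promotes body.log to be the whole body:
#   body: "hello world"
```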

{{- if .Values.logsCollection.extraFileLogs }}
@@ -389,29 +391,29 @@
priority: {{ $unit.priority }}
operators:
- type: add
field: $$resource["com.splunk.source"]
field: resource["com.splunk.source"]
value: {{ $.Values.logsCollection.journald.directory }}
- type: add
field: $$resource["com.splunk.sourcetype"]
value: 'EXPR("kube:journald:"+$$._SYSTEMD_UNIT)'
field: resource["com.splunk.sourcetype"]
value: 'EXPR("kube:journald:"+body._SYSTEMD_UNIT)'
- type: add
field: $$resource["com.splunk.index"]
field: resource["com.splunk.index"]
value: {{ $.Values.logsCollection.journald.index | default $.Values.splunkPlatform.index }}
- type: add
field: $$resource["host.name"]
field: resource["host.name"]
value: 'EXPR(env("K8S_NODE_NAME"))'
- type: add
field: $$resource["journald.priority.number"]
value: 'EXPR($$.PRIORITY)'
field: resource["journald.priority.number"]
value: 'EXPR(body.PRIORITY)'
- type: add
field: $$resource["journald.unit.name"]
value: 'EXPR($$._SYSTEMD_UNIT)'
field: resource["journald.unit.name"]
value: 'EXPR(body._SYSTEMD_UNIT)'

# extract MESSAGE field into the log body and discard rest of the fields
- type: move
id: set-body
from: $$body.MESSAGE
to: $$
from: body.MESSAGE
to: body
{{- end }}
{{- end }}
{{- end }}
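The journald entries follow the same pattern; the net effect of the operators above is roughly as sketched below, with an invented unit and message.

```yaml
# Entry produced by the journald receiver; systemd fields sit in body
# (previously addressed as $$.PRIORITY, $$._SYSTEMD_UNIT, ...):
#   body: { MESSAGE: "Started kubelet.", PRIORITY: "6", _SYSTEMD_UNIT: "kubelet.service", ... }
#
# The add operators copy selected fields into resource attributes, e.g.:
#   resource["journald.unit.name"]       = "kubelet.service"
#   resource["journald.priority.number"] = "6"
#   resource["com.splunk.sourcetype"]    = "kube:journald:kubelet.service"
#
# The final move keeps only MESSAGE as the log body:
#   body: "Started kubelet."
```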
68 changes: 35 additions & 33 deletions rendered/manifests/otel-logs/configmap-agent.yaml
@@ -140,83 +140,85 @@ data:
operators:
- id: get-format
routes:
- expr: $$body matches "^\\{"
- expr: body matches "^\\{"
output: parser-docker
- expr: $$body matches "^[^ Z]+ "
- expr: body matches "^[^ Z]+ "
output: parser-crio
- expr: $$body matches "^[^ Z]+Z"
- expr: body matches "^[^ Z]+Z"
output: parser-containerd
type: router
- id: parser-crio
regex: ^(?P<time>[^ Z]+) (?P<stream>stdout|stderr) (?P<logtag>[^ ]*) (?P<log>.*)$
timestamp:
layout: "2006-01-02T15:04:05.000000000-07:00"
layout_type: gotime
parse_from: time
parse_from: attributes.time
type: regex_parser
- combine_field: log
- combine_field: body.log
id: crio-recombine
is_last_entry: ($$.logtag) == 'F'
is_last_entry: (body.logtag) == 'F'
type: recombine
- field: $$body.log
- field: body.log
id: crio-handle_empty_log
if: $$.log == nil
if: body.log == nil
output: filename
type: add
value: ""
- id: parser-containerd
regex: ^(?P<time>[^ ^Z]+Z) (?P<stream>stdout|stderr) (?P<logtag>[^ ]*) (?P<log>.*)$
timestamp:
layout: '%Y-%m-%dT%H:%M:%S.%LZ'
parse_from: time
parse_from: attributes.time
type: regex_parser
- combine_field: log
- combine_field: body.log
id: containerd-recombine
is_last_entry: ($$.logtag) == 'F'
is_last_entry: (body.logtag) == 'F'
type: recombine
- field: $$body.log
- field: body.log
id: containerd-handle_empty_log
if: $$.log == nil
if: body.log == nil
output: filename
type: add
value: ""
- id: parser-docker
parse_to: body
timestamp:
layout: '%Y-%m-%dT%H:%M:%S.%LZ'
parse_from: time
parse_from: body.time
type: json_parser
- field: $$resource["com.splunk.source"]
- field: resource["com.splunk.source"]
id: filename
type: add
value: EXPR($$attributes["file.path"])
value: EXPR(attributes["log.file.path"])
- id: extract_metadata_from_filepath
parse_from: $$attributes["file.path"]
parse_from: attributes["log.file.path"]
parse_to: body
regex: ^\/var\/log\/pods\/(?P<namespace>[^_]+)_(?P<pod_name>[^_]+)_(?P<uid>[^\/]+)\/(?P<container_name>[^\._]+)\/(?P<restart_count>\d+)\.log$
type: regex_parser
- field: $$resource["k8s.pod.uid"]
- field: resource["k8s.pod.uid"]
type: add
value: EXPR($$.uid)
- field: $$resource["k8s.container.restart_count"]
value: EXPR(body.uid)
- field: resource["k8s.container.restart_count"]
type: add
value: EXPR($$.restart_count)
- field: $$resource["k8s.container.name"]
value: EXPR(body.restart_count)
- field: resource["k8s.container.name"]
type: add
value: EXPR($$.container_name)
- field: $$resource["k8s.namespace.name"]
value: EXPR(body.container_name)
- field: resource["k8s.namespace.name"]
type: add
value: EXPR($$.namespace)
- field: $$resource["k8s.pod.name"]
value: EXPR(body.namespace)
- field: resource["k8s.pod.name"]
type: add
value: EXPR($$.pod_name)
- field: $$resource["com.splunk.sourcetype"]
value: EXPR(body.pod_name)
- field: resource["com.splunk.sourcetype"]
type: add
value: EXPR("kube:container:"+$$.container_name)
- field: $$attributes["log.iostream"]
value: EXPR("kube:container:"+body.container_name)
- field: attributes["log.iostream"]
type: add
value: EXPR($$.stream)
- from: $$body.log
value: EXPR(body.stream)
- from: body.log
id: clean-up-log-record
to: $$
to: body
type: move
poll_interval: 200ms
start_at: beginning
2 changes: 1 addition & 1 deletion rendered/manifests/otel-logs/daemonset.yaml
@@ -29,7 +29,7 @@ spec:
app: splunk-otel-collector
release: default
annotations:
checksum/config: fa0d199a23871cace4f204a78d4801fe5062eefe4ba026add5e0c5940006e1f0
checksum/config: 83cf3114f436ca2f3ed5cdc2d3da8423aba575de223a6fa0b7d8ba1a420958c9
kubectl.kubernetes.io/default-container: otel-collector
spec:
hostNetwork: true