This repository has been archived by the owner on Jul 19, 2023. It is now read-only.

AWS CloudWatch Logs events are limited to 256 KB, so big messages are split #67

smuryginim opened this issue Dec 26, 2018 · 0 comments

Comments


smuryginim commented Dec 26, 2018

Hello everybody.
I'm currently migrating to CloudWatch Logs and want to use this plugin to stream logs to ES.
Everything works fine, but when a message in my scenario exceeds 256 KB, it is broken into parts.
The documentation is here: https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/CalculatePutEventsEntrySize.html

In my scenario, I publish logs in JSON format to CloudWatch with some additional metadata, like "app", "profile", etc.

<appender name="json-output" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp/>
                <mdc/>
                <logLevel/>
                <loggerName/>
                <pattern>
                    <pattern>
                        {
                        "profile": "${PROFILE}",
                        "app": "${APPNAME}"
                        }
                    </pattern>
                </pattern>
                <threadName/>
                <message/>
            </providers>
        </encoder>
    </appender>
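The effect of the size limit can be illustrated with a short sketch. This assumes CloudWatch splits purely at the byte limit with no regard for JSON structure; the 300 KB payload and the field values are made up for the demonstration:

```python
import json

MAX_EVENT_SIZE = 256 * 1024  # approximate CloudWatch Logs event size limit

# A JSON log line larger than the limit (hypothetical payload).
big_message = json.dumps({"profile": "prod", "app": "demo",
                          "message": "x" * (300 * 1024)})

# Approximate the split: cut at the byte limit, ignoring JSON structure.
parts = [big_message[i:i + MAX_EVENT_SIZE]
         for i in range(0, len(big_message), MAX_EVENT_SIZE)]

# Neither fragment is valid JSON on its own: the first is truncated
# mid-string, the second lacks the opening brace.
for part in parts:
    try:
        json.loads(part)
        print("valid JSON")
    except json.JSONDecodeError:
        print("broken JSON fragment")
```

Each fragment then arrives at Logstash as a separate event, and the `json` codec fails on every one of them.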

In Logstash:

  cloudwatch_logs {
        start_position => "end"
        log_group => ["/myGroupPrefix"]
        codec => json
        log_group_prefix => "true"
  }

So when a message is broken into parts, CloudWatch produces broken JSON.
Has anyone handled such cases with the CloudWatch plugin?
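One possible workaround on the consumer side is to buffer consecutive fragments until their concatenation parses as valid JSON. This is a hypothetical sketch, not a feature of the plugin, and it assumes fragments arrive in order without interleaving from other log streams:

```python
import json

def reassemble(fragments):
    """Accumulate consecutive fragments; emit an event once the
    buffer parses as JSON. Hypothetical workaround sketch."""
    events, buf = [], ""
    for frag in fragments:
        buf += frag
        try:
            events.append(json.loads(buf))
            buf = ""  # complete event recovered, reset the buffer
        except json.JSONDecodeError:
            continue  # still incomplete, wait for the next fragment
    return events

# Usage: a log line split into two arbitrary pieces is recovered whole.
doc = json.dumps({"app": "demo", "message": "hello"})
print(reassemble([doc[:10], doc[10:]]))
```

The same buffering idea could live in a Logstash `ruby` filter, at the cost of ordering guarantees when multiple streams share one group.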
