
InvalidParameterException: Log event too large #6164

Closed · 3 tasks done
hajinsuha1 opened this issue Jun 5, 2024 · 4 comments
Labels: bug (This issue is a bug.)

hajinsuha1 commented Jun 5, 2024

Checkboxes for prior research

Describe the bug

Hi,

we are using the PutLogEventsCommand from @aws-sdk/client-cloudwatch-logs in a Node.js application and sometimes get the following error:

InvalidParameterException: Log event too large: 315563 bytes exceeds limit of 262144
    at de_InvalidParameterExceptionRes (/var/runtime/node_modules/@aws-sdk/client-cloudwatch-logs/dist-cjs/index.js:2241:21)
    at de_CommandError (/var/runtime/node_modules/@aws-sdk/client-cloudwatch-logs/dist-cjs/index.js:2144:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20
    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/core/dist-cjs/index.js:165:18
    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38
    at async /var/runtime/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:33:22
    at async putLogEvents (/opt/ssr-function-log-enhancer/cloudwatch-logs-api.js:82:9)

I know there is a limit of 256 KB per log event.

Is it possible to handle this via truncation, as in aws/amazon-cloudwatch-logs-for-fluent-bit#98?

SDK version number

@aws-sdk/client-cloudwatch-logs@3.552.0

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

v18.20.2

Reproduction Steps

import {CloudWatchLogsClient, PutLogEventsCommand} from '@aws-sdk/client-cloudwatch-logs';

export const handler = async (event) => {
    const cloudWatchLogsClient = new CloudWatchLogsClient();
    const logGroupName = "LOG_GROUP_NAME";
    const logStreamName = "LOG_STREAM_NAME";
    const logEvents = [{
        timestamp: Date.now(),
        message: 'A'.repeat(300 * 1024) // 300 KB message, above the 256 KB per-event limit
    }];
    const logEventParams = {
        logEvents,
        logGroupName,
        logStreamName
    };
    await cloudWatchLogsClient.send(new PutLogEventsCommand(logEventParams));
    return true;
};

Observed Behavior

InvalidParameterException: Log event too large: 315563 bytes exceeds limit of 262144
    at de_InvalidParameterExceptionRes (/var/runtime/node_modules/@aws-sdk/client-cloudwatch-logs/dist-cjs/index.js:2241:21)
    at de_CommandError (/var/runtime/node_modules/@aws-sdk/client-cloudwatch-logs/dist-cjs/index.js:2144:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-serde/dist-cjs/index.js:35:20
    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/core/dist-cjs/index.js:165:18
    at async /var/runtime/node_modules/@aws-sdk/node_modules/@smithy/middleware-retry/dist-cjs/index.js:320:38
    at async /var/runtime/node_modules/@aws-sdk/middleware-logger/dist-cjs/index.js:33:22
    at async putLogEvents (/opt/ssr-function-log-enhancer/cloudwatch-logs-api.js:82:9)

Expected Behavior

Log message is truncated to the limit and successfully forwarded

Possible Solution

Truncate the log instead of raising an error and failing the request
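
For illustration, a rough client-side sketch of that behaviour, assuming the 262144-byte event quota and the 26-byte per-event overhead documented for CloudWatch Logs; the helper name and the small safety margin are made up for this sketch and are not part of the SDK:

// Hypothetical client-side truncation helper - not part of the SDK.
// CloudWatch Logs counts the UTF-8 size of the message plus 26 bytes of
// per-event overhead against the 262144-byte per-event quota.
const MAX_EVENT_BYTES = 262144;
const PER_EVENT_OVERHEAD = 26;
const SAFETY_MARGIN = 4; // decoding a cut-off multi-byte character can add a replacement character
const MAX_MESSAGE_BYTES = MAX_EVENT_BYTES - PER_EVENT_OVERHEAD - SAFETY_MARGIN;

function truncateLogMessage(message) {
  const bytes = Buffer.from(message, 'utf8');
  if (bytes.length <= MAX_MESSAGE_BYTES) return message;
  // Cut on a byte boundary; toString() replaces any trailing partial character.
  return bytes.subarray(0, MAX_MESSAGE_BYTES).toString('utf8');
}

// Usage: truncate each message before building the PutLogEvents input, e.g.
// const logEvents = [{ timestamp: Date.now(), message: truncateLogMessage(bigMessage) }];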

Additional Information/Context

No response

@hajinsuha1 hajinsuha1 added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Jun 5, 2024
@aBurmeseDev (Member) commented

Hi @hajinsuha1 - thanks for reaching out.

This is actually more service-related than SDK-related. I would reach out on https://github.com/aws/amazon-cloudwatch-logs-for-fluent-bit or refer to the comments on this issue: aws/amazon-cloudwatch-logs-for-fluent-bit#85

@aBurmeseDev aBurmeseDev self-assigned this Jun 7, 2024
@aBurmeseDev aBurmeseDev added response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. closing-soon This issue will automatically close in 4 days unless further comments are made. and removed needs-triage This issue or PR still needs to be triaged. labels Jun 7, 2024
hajinsuha1 (Author) commented Jun 7, 2024

Thanks for the response @aBurmeseDev. I was wondering whether the same approach of truncating rather than erroring could be used in the SDK. In our use case we are using aws-sdk-js-v3, and we're finding logs being dropped because the log size exceeds the limit. It would be great if the message were truncated instead, so that logs don't get dropped.
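
One way to approximate this today, without waiting for SDK support, is to register initialize-step middleware on the client so every PutLogEvents input is rewritten before the request is built. This is only a sketch under the same assumed limits as above; the middleware name and margin are illustrative, not an official SDK feature:

import { CloudWatchLogsClient } from '@aws-sdk/client-cloudwatch-logs';

const client = new CloudWatchLogsClient();

// Assumed limits: 262144-byte event quota minus 26 bytes of per-event
// overhead, minus a small margin for a replacement character after the cut.
const MAX_MESSAGE_BYTES = 262144 - 26 - 4;

// Initialize-step middleware: rewrite oversized messages before the request is built.
client.middlewareStack.add(
  (next) => async (args) => {
    const input = args.input;
    if (Array.isArray(input?.logEvents)) {
      input.logEvents = input.logEvents.map((e) => {
        const bytes = Buffer.from(e.message ?? '', 'utf8');
        return bytes.length <= MAX_MESSAGE_BYTES
          ? e
          : { ...e, message: bytes.subarray(0, MAX_MESSAGE_BYTES).toString('utf8') };
      });
    }
    return next(args);
  },
  { step: 'initialize', name: 'truncateOversizedLogEvents' }
);

// client.send(new PutLogEventsCommand({ logGroupName, logStreamName, logEvents }))
// now trims any oversized message instead of failing with InvalidParameterException.

Because the middleware only touches inputs that carry a logEvents array, other commands sent through the same client pass through unchanged.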

@github-actions github-actions bot removed closing-soon This issue will automatically close in 4 days unless further comments are made. response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. labels Jun 8, 2024
@aBurmeseDev (Member) commented

@hajinsuha1 - thanks for your patience. This was discussed by the team and, unfortunately, it won't be considered at this time. Please reach out to https://github.com/aws/amazon-cloudwatch-logs-for-fluent-bit or refer to aws/amazon-cloudwatch-logs-for-fluent-bit#85.

Closing the issue - please reach out again if you have any other SDK-related questions!

@aBurmeseDev aBurmeseDev closed this as not planned Jun 11, 2024

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Jun 26, 2024