
Sumo extension ships logs to HTTP endpoint only after receiving SHUTDOWN event #3

Open
Seiji-U opened this issue Nov 23, 2020 · 8 comments

Comments


Seiji-U commented Nov 23, 2020

Description

I have a Lambda function with the Sumologic extension layer, and it is invoked every minute.
The extension does not send any logs to the HTTP endpoint while the same runtime environment is kept alive.
When I stopped the invocations, a SHUTDOWN event was sent to the extension after about 7 minutes.
Only at that point did the extension ship the logs to the HTTP endpoint.

Steps to Reproduce

  1. Create a Lambda function with the Sumologic extension layer.
  2. Set the environment variables (listed under "Your Environment" below).
  3. Invoke the function every minute (a minimal handler sketch follows this list).
  4. Check that nothing appears under the source category.
  5. Stop the invocations and wait about 7 minutes.
  6. A SHUTDOWN event is sent to the extension.
  7. Check the source category again and see that the logs are there now.
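
For reference, a minimal sketch of the kind of short-running handler that reproduces this, assuming Python 3.8; the handler name and log message are illustrative, not taken from the original report:

```python
# handler.py - hypothetical minimal function used to reproduce the issue.
# The function finishes in a few milliseconds, so the Sumo Logic extension
# gets almost no time to ship logs during the invoke phase.
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    # One log line per invocation; CloudWatch receives it immediately,
    # but the Sumo HTTP endpoint only receives it at SHUTDOWN.
    logger.info("invoked with event: %s", json.dumps(event))
    return {"statusCode": 200}
```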

Expected Behavior

I want the logs to be shipped on every invocation.

Relevant Logs / Console output

CloudWatch records an error on each invocation:

level=error msg="Not able to post statuscode: Post "https://collectors.au.sumologic.com/receiver/v1/http/ZaVnC4dxxxxxxxxxxx\": context deadline exceeded (Client.Timeout exceeded while awaiting headers) \n" Name=sumologic-extension

Your Environment

  • Layer version: 1
  • Runtime: Python 3.8
  • Environment variables:
    • SUMO_HTTP_ENDPOINT=my endpoint
    • SUMO_LOG_TYPES=function
@SumoSourabh (Contributor)

@Seiji-U Can you please confirm the execution time of your Lambda function?


Seiji-U commented Nov 23, 2020

@SumoSourabh I tested at 9:25 AM AEDT on 24 Nov 2020.
Attached are the Lambda logs and the Sumo results:
lambda_logs.xlsx
sumo-results.xlsx

@SumoSourabh (Contributor)

@Seiji-U From the logs it looks like the Lambda only runs for ~10ms. During that window the extension is not able to send logs to Sumologic, so it keeps retrying during the next invocation and either fails or times out. We are looking into this issue of sending logs when the Lambda duration is less than 500ms (the time available to the extension is governed by the function's run duration).

Would it be possible for you to increase the Lambda execution time to, let's say, 500ms and try the extension again?
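
A minimal sketch of that workaround, assuming a Python 3.8 handler; padding the invocation to roughly 500ms with a sleep is an illustrative approach, and the do_real_work helper is hypothetical:

```python
import time

# Hypothetical workaround: keep the invocation alive for ~500ms so the
# extension has time to flush its log buffer during the invoke phase
# instead of waiting for the SHUTDOWN event.
MIN_DURATION_SECONDS = 0.5

def do_real_work(event):
    # Placeholder for the actual function body.
    return {"statusCode": 200}

def lambda_handler(event, context):
    start = time.monotonic()
    result = do_real_work(event)
    elapsed = time.monotonic() - start
    if elapsed < MIN_DURATION_SECONDS:
        time.sleep(MIN_DURATION_SECONDS - elapsed)
    return result
```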

SumoSourabh added the bug label on Nov 24, 2020

jm15kim commented Feb 18, 2021

@SumoSourabh I have a similar issue.
Do you plan to fix this, or do I need to increase the execution time?

@SumoSourabh (Contributor)

For now, I would suggest increasing the execution time. We are working with AWS on a fix.

I have kept the issue open until we push a fix.

@srikanthm-1

Hi,
We are facing the same issue as well. Do you have an ETA for the fix, @SumoSourabh?

Thanks


soleares commented May 24, 2021

Does this AWS update fix the issue? https://aws.amazon.com/about-aws/whats-new/2021/05/aws-lambda-extensions-now-generally-available/

It mentions that extensions can keep running after the function has returned its response.

@SumoSourabh (Contributor)

Hi all,

We have updated the README.md to explain how to work around the above issue. No code fix has been made for it.

As explained there, short AWS Lambda executions do not give the extension enough time to send logs to Sumo Logic, and retries happen during the next invocation.

Any logs that are not sent during the invoke/execution phase are sent in the shutdown phase.
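
For context, here is a minimal sketch of the generic Lambda Extensions API lifecycle that underlies this behavior. This is not the Sumo Logic extension's actual code; the extension name and the ship_logs helper are hypothetical placeholders:

```python
import json
import os
import urllib.request

# An extension is only scheduled around INVOKE events and during the single
# SHUTDOWN event, which is why logs buffered during very short invocations
# may only be flushed at shutdown.
BASE = f"http://{os.environ['AWS_LAMBDA_RUNTIME_API']}/2020-01-01/extension"

def register(name="example-extension"):
    req = urllib.request.Request(
        f"{BASE}/register",
        data=json.dumps({"events": ["INVOKE", "SHUTDOWN"]}).encode(),
        headers={"Lambda-Extension-Name": name, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.headers["Lambda-Extension-Identifier"]

def ship_logs(note):
    # Hypothetical placeholder for posting buffered logs to SUMO_HTTP_ENDPOINT.
    print(f"shipping buffered logs ({note})")

def event_loop(extension_id):
    while True:
        req = urllib.request.Request(
            f"{BASE}/event/next",
            headers={"Lambda-Extension-Identifier": extension_id},
        )
        with urllib.request.urlopen(req) as resp:
            event = json.load(resp)
        if event["eventType"] == "SHUTDOWN":
            ship_logs("shutdown phase: last chance to flush")
            break
        # INVOKE: time available is bounded by how long the function itself runs.
        ship_logs("invoke phase: limited by function duration")

if __name__ == "__main__":
    event_loop(register())
```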

Let us know if we can close the issue.

SumoSourabh removed the bug label on May 25, 2021