
[receiver/awscloudwatch] Missing log stream events #32231

Open
AllanOricil opened this issue Apr 9, 2024 · 24 comments
Labels
bug (Something isn't working), needs triage (New item requiring triage), receiver/awscloudwatch

Comments


AllanOricil commented Apr 9, 2024

Component(s)

receiver/awscloudwatch

What happened?

Description

1 - Past log streams aren't exported
2 - Log streams are incomplete

Steps to Reproduce

1 - create a Lambda function that logs more than 15 lines to CloudWatch
2 - run this function several times and ensure a single log stream ends up with more than 15 log events
3 - on an EC2 instance, run the otel-collector with this receiver

WARNING: don't forget to replace NAME_OF_YOUR_LAMBDA_FUNCTION_LOG_GROUP with the name of your Lambda function's log group

  awscloudwatch:
    region: us-east-2
    logs:
      poll_interval: 10s
      max_events_per_request: 1000
      groups:
        autodiscover:
          limit: 100
          prefix: /aws/lambda/NAME_OF_YOUR_LAMBDA_FUNCTION_LOG_GROUP

Don't forget to register the receiver in the logs pipeline

service:
  telemetry:
    logs:
      level: debug
    metrics:
      address: 0.0.0.0:8888
  extensions: [health_check, zpages]
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
    metrics/internal:
      receivers: [prometheus, hostmetrics]
      processors: [resourcedetection, batch]
      exporters: [otlp]
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
    logs:
      receivers: [otlp, awscloudwatch]
      processors: [batch]
      exporters: [otlp, debug]

4 - verify that logs are processed with no errors. You should see output similar to the following

2024-04-09T01:32:48.362Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:32:48.382Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:32:48.468Z	info	LogsExporter	{"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 15}
2024-04-09T01:32:48.469Z	info	ResourceLog #0
Resource SchemaURL: 
Resource attributes:
     -> aws.region: Str(us-east-2)
     -> cloudwatch.log.group.name: Str(/aws/lambda/get-instances-api-function)
     -> cloudwatch.log.stream: Str(2024/04/09/[$LATEST]9a79cb34998a4037bf1e3ff5df35fea5)
ScopeLogs #0
ScopeLogs SchemaURL: 
InstrumentationScope  
LogRecord #0
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:41.354 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(INIT_START Runtime Version: nodejs:18.v26	Runtime Version ARN: arn:aws:lambda:us-east-2::runtime:0cdcfbdefbc5e7d3343f73c2e2dd3cba17d61dea0686b404502a0c9ce83931b9
)
Attributes:
     -> id: Str(38192844104842187460300983962019760926010078374443876352)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #1
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.498 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(START RequestId: 9f7b878f-a62a-47d1-be33-9526adf64f30 Version: $LATEST
)
Attributes:
     -> id: Str(38192844130354239967420016835936622629919803937285472257)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #2
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.571 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"body":null,"headers":{"Accept":"application/json, text/plain, */*","Accept-Encoding":"gzip, deflate, br, zstd","Accept-Language":"en-GB,en-US;q=0.9,en;q=0.8,pt;q=0.7","Authorization":"eyJraWQiOiJvSUdwUUVhaFIxOWlMc05GbWhpVHpsRkNYUEx3eDcxNDN0S3hGaEFUNEJBPSIsImFsZyI6IlJTMjU2In0.eyJhdF9oYXNoIjoiUTZHdUZyLXAtREJvd3ZacElwZW81QSIsInN1YiI6IjgxNGIxNTYwLWIwNTEtNzA0MC1iNThiLWIyMThiNDJjZTU5YiIsImN1c3RvbTpzdWJkb21haW4iOiJhbGxhbm9yaWNpbCIsImNvZ25pdG86Z3JvdXBzIjpbInVzLWVhc3QtMl84QmVYSm42M2RfR29vZ2xlIiwiZDFkNDI5NTAtZGFiYi00M2VmLWI3Y2ItMDY3ODRiOGNmNDViIl0sImlzcyI6Imh0dHBzOlwvXC9jb2duaXRvLWlkcC51cy1lYXN0LTIuYW1hem9uYXdzLmNvbVwvdXMtZWFzdC0yXzhCZVhKbjYzZCIsImlkZW50aXRpZXMiOlt7ImRhdGVDcmVhdGVkIjoiMTcxMTExNjcwOTcwNyIsInVzZXJJZCI6IjEwNDExNjkxMTE3NDI0NjE1NzU5NSIsInByb3ZpZGVyTmFtZSI6Ikdvb2dsZSIsInByb3ZpZGVyVHlwZSI6Ikdvb2dsZSIsImlzc3VlciI6bnVsbCwicHJpbWFyeSI6InRydWUifV0sImF1dGhfdGltZSI6MTcxMjQzNTE1MywiZXhwIjoxNzEyNzEyNzYwLCJpYXQiOjE3MTI2MjYzNjAsImp0aSI6ImZjNWE3NDkzLTFjOWQtNGUyMC1hMGViLThjYzMzYzlmMTdmMiIsImVtYWlsIjoiYWxsbGFub3JpY2lsQGdtYWlsLmNvbSIsImN1c3RvbTpwc19zdWJzY3JpcHRpb25faWQiOiJzdWJfMU94OHFWSFl3TUVKQjQ2OFFGWlNMeThHIiwiZW1haWxfdmVyaWZpZWQiOmZhbHNlLCJjdXN0b206cHNfY3VzdG9tZXJfaWQiOiJjdXNfUG1pSFowanV4RnAzTHYiLCJjdXN0b206b3JnYW5pemF0aW9uX2lkIjoiZDFkNDI5NTAtZGFiYi00M2VmLWI3Y2ItMDY3ODRiOGNmNDViIiwiY3VzdG9tOnBzX3JlZGlyZWN0X3VybCI6Imh0dHA6XC9cL25vZGUtcmVhZHkuY29tXC9pbnN0YW5jZS1wcm92aXNpb25pbmctam9ic1wvY2FsbGJhY2siLCJjdXN0b206b3JnYW5pemF0aW9uX25hbWUiOiJBbGxhbiBPcmljaWwiLCJjb2duaXRvOnVzZXJuYW1lIjoiR29vZ2xlXzEwNDExNjkxMTE3NDI0NjE1NzU5NSIsImN1c3RvbTpzZXR1cCI6InRydWUiLCJnaXZlbl9uYW1lIjoiQWxsYW4gR29uw6dhbHZlcyBHb21lcyIsIm9yaWdpbl9qdGkiOiIwMzI5NTkwOC1hYjIzLTQ2NzAtODVjNS1mOTQ3NjhiNmQ3OGYiLCJhdWQiOiIxazMxY2Q4NzRoMTRkNHU4aTcyNGlkNDJwbiIsImN1c3RvbTpjcmVhdGVfaW5zdGFuY2UiOiJ0cnVlIiwidG9rZW5fdXNlIjoiaWQiLCJmYW1pbHlfbmFtZSI6Ik9yaWNpbCJ9.Srb-vdOkRbkNcq0ZvHUDYT1cl9GJ-F_XM8est0yzehoOKsfD72e4_qhWcwCRGCp1ZRd9KmFxp7rhQiqzTjccXjNNSORl823c8fM3sDLSpUF6-llLvdf9zDaH5JZxOwMX-wIfJmtRCgWjYmCEJpZatPHG2M3e8jVvAFz
B-EAaEs9AfR6Re5YhVtzy5aUcHUxkMw_mCtrEmg2V-zY2SEyIg5m6EMQ6J5AfKsDe3nteM28Sa0IoWOxGf5KhBySUC49RawQCfO0vYNYXC3PVJNfmTaKJ8P2g4kXtCpi11RS7ONMth2ynrPg-eWTiI_I6g9pQdvsAruomontiYrvH4kknNw","CloudFront-Forwarded-Proto":"https","CloudFront-Is-Desktop-Viewer":"true","CloudFront-Is-Mobile-Viewer":"false","CloudFront-Is-SmartTV-Viewer":"false","CloudFront-Is-Tablet-Viewer":"false","CloudFront-Viewer-ASN":"269073","CloudFront-Viewer-Country":"BR","Host":"vjydkvkskl.execute-api.us-east-2.amazonaws.com","Referer":"https://node-ready.com/","User-Agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36","Via":"2.0 6fe8e2d5db6a80353eb675f61c249810.cloudfront.net (CloudFront)","X-Amz-Cf-Id":"kgg1BBFHKR4861YdGx-Kcf1-149AGIp-EUSUm9_J4MFT5WvC0cH3PA==","X-Amzn-Trace-Id":"Root=1-66149ab8-0a47130e4ff50bbf715025c0","X-Forwarded-For":"138.94.127.146, 15.158.19.244","X-Forwarded-Port":"443","X-Forwarded-Proto":"https","origin":"https://node-ready.com","sec-ch-ua":"\"Brave\";v=\"123\", \"Not:A-Brand\";v=\"8\", \"Chromium\";v=\"123\"","sec-ch-ua-mobile":"?0","sec-ch-ua-platform":"\"macOS\"","sec-fetch-dest":"empty","sec-fetch-mode":"cors","sec-fetch-site":"cross-site","sec-gpc":"1"},"httpMethod":"GET","isBase64Encoded":false,"level":"info","message":"event","multiValueHeaders":{"Accept":["application/json, text/plain, */*"],"Accept-Encoding":["gzip, deflate, br, 
zstd"],"Accept-Language":["en-GB,en-US;q=0.9,en;q=0.8,pt;q=0.7"],"Authorization":["eyJraWQiOiJvSUdwUUVhaFIxOWlMc05GbWhpVHpsRkNYUEx3eDcxNDN0S3hGaEFUNEJBPSIsImFsZyI6IlJTMjU2In0.eyJhdF9oYXNoIjoiUTZHdUZyLXAtREJvd3ZacElwZW81QSIsInN1YiI6IjgxNGIxNTYwLWIwNTEtNzA0MC1iNThiLWIyMThiNDJjZTU5YiIsImN1c3RvbTpzdWJkb21haW4iOiJhbGxhbm9yaWNpbCIsImNvZ25pdG86Z3JvdXBzIjpbInVzLWVhc3QtMl84QmVYSm42M2RfR29vZ2xlIiwiZDFkNDI5NTAtZGFiYi00M2VmLWI3Y2ItMDY3ODRiOGNmNDViIl0sImlzcyI6Imh0dHBzOlwvXC9jb2duaXRvLWlkcC51cy1lYXN0LTIuYW1hem9uYXdzLmNvbVwvdXMtZWFzdC0yXzhCZVhKbjYzZCIsImlkZW50aXRpZXMiOlt7ImRhdGVDcmVhdGVkIjoiMTcxMTExNjcwOTcwNyIsInVzZXJJZCI6IjEwNDExNjkxMTE3NDI0NjE1NzU5NSIsInByb3ZpZGVyTmFtZSI6Ikdvb2dsZSIsInByb3ZpZGVyVHlwZSI6Ikdvb2dsZSIsImlzc3VlciI6bnVsbCwicHJpbWFyeSI6InRydWUifV0sImF1dGhfdGltZSI6MTcxMjQzNTE1MywiZXhwIjoxNzEyNzEyNzYwLCJpYXQiOjE3MTI2MjYzNjAsImp0aSI6ImZjNWE3NDkzLTFjOWQtNGUyMC1hMGViLThjYzMzYzlmMTdmMiIsImVtYWlsIjoiYWxsbGFub3JpY2lsQGdtYWlsLmNvbSIsImN1c3RvbTpwc19zdWJzY3JpcHRpb25faWQiOiJzdWJfMU94OHFWSFl3TUVKQjQ2OFFGWlNMeThHIiwiZW1haWxfdmVyaWZpZWQiOmZhbHNlLCJjdXN0b206cHNfY3VzdG9tZXJfaWQiOiJjdXNfUG1pSFowanV4RnAzTHYiLCJjdXN0b206b3JnYW5pemF0aW9uX2lkIjoiZDFkNDI5NTAtZGFiYi00M2VmLWI3Y2ItMDY3ODRiOGNmNDViIiwiY3VzdG9tOnBzX3JlZGlyZWN0X3VybCI6Imh0dHA6XC9cL25vZGUtcmVhZHkuY29tXC9pbnN0YW5jZS1wcm92aXNpb25pbmctam9ic1wvY2FsbGJhY2siLCJjdXN0b206b3JnYW5pemF0aW9uX25hbWUiOiJBbGxhbiBPcmljaWwiLCJjb2duaXRvOnVzZXJuYW1lIjoiR29vZ2xlXzEwNDExNjkxMTE3NDI0NjE1NzU5NSIsImN1c3RvbTpzZXR1cCI6InRydWUiLCJnaXZlbl9uYW1lIjoiQWxsYW4gR29uw6dhbHZlcyBHb21lcyIsIm9yaWdpbl9qdGkiOiIwMzI5NTkwOC1hYjIzLTQ2NzAtODVjNS1mOTQ3NjhiNmQ3OGYiLCJhdWQiOiIxazMxY2Q4NzRoMTRkNHU4aTcyNGlkNDJwbiIsImN1c3RvbTpjcmVhdGVfaW5zdGFuY2UiOiJ0cnVlIiwidG9rZW5fdXNlIjoiaWQiLCJmYW1pbHlfbmFtZSI6Ik9yaWNpbCJ9.Srb-vdOkRbkNcq0ZvHUDYT1cl9GJ-F_XM8est0yzehoOKsfD72e4_qhWcwCRGCp1ZRd9KmFxp7rhQiqzTjccXjNNSORl823c8fM3sDLSpUF6-llLvdf9zDaH5JZxOwMX-wIfJmtRCgWjYmCEJpZatPHG2M3e8jVvAFzB-EAaEs9AfR6Re5YhVtzy5aUcHUxkMw_mCtrEmg2V-zY2SEyIg5m6EMQ6J5AfKsDe3nteM28Sa0IoWOxGf5KhBySUC49RawQCfO0vYNYXC3PVJNfm
TaKJ8P2g4kXtCpi11RS7ONMth2ynrPg-eWTiI_I6g9pQdvsAruomontiYrvH4kknNw"],"CloudFront-Forwarded-Proto":["https"],"CloudFront-Is-Desktop-Viewer":["true"],"CloudFront-Is-Mobile-Viewer":["false"],"CloudFront-Is-SmartTV-Viewer":["false"],"CloudFront-Is-Tablet-Viewer":["false"],"CloudFront-Viewer-ASN":["269073"],"CloudFront-Viewer-Country":["BR"],"Host":["vjydkvkskl.execute-api.us-east-2.amazonaws.com"],"Referer":["https://node-ready.com/"],"User-Agent":["Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36"],"Via":["2.0 6fe8e2d5db6a80353eb675f61c249810.cloudfront.net (CloudFront)"],"X-Amz-Cf-Id":["kgg1BBFHKR4861YdGx-Kcf1-149AGIp-EUSUm9_J4MFT5WvC0cH3PA=="],"X-Amzn-Trace-Id":["Root=1-66149ab8-0a47130e4ff50bbf715025c0"],"X-Forwarded-For":["138.94.127.146, 15.158.19.244"],"X-Forwarded-Port":["443"],"X-Forwarded-Proto":["https"],"origin":["https://node-ready.com"],"sec-ch-ua":["\"Brave\";v=\"123\", \"Not:A-Brand\";v=\"8\", \"Chromium\";v=\"123\""],"sec-ch-ua-mobile":["?0"],"sec-ch-ua-platform":["\"macOS\""],"sec-fetch-dest":["empty"],"sec-fetch-mode":["cors"],"sec-fetch-site":["cross-site"],"sec-gpc":["1"]},"multiValueQueryStringParameters":null,"path":"/instances","pathParameters":null,"queryStringParameters":null,"requestContext":{"accountId":"845044614340","apiId":"vjydkvkskl","authorizer":{"claims":{"at_hash":"Q6GuFr-p-DBowvZpIpeo5A","aud":"1k31cd874h14d4u8i724id42pn","auth_time":"1712435153","cognito:groups":"us-east-2_8BeXJn63d_Google,d1d42950-dabb-43ef-b7cb-06784b8cf45b","cognito:username":"Google_104116911174246157595","custom:create_instance":"true","custom:organization_id":"d1d42950-dabb-43ef-b7cb-06784b8cf45b","custom:organization_name":"Allan 
Oricil","custom:ps_customer_id":"cus_PmiHZ0juxFp3Lv","custom:ps_redirect_url":"http://node-ready.com/instance-provisioning-jobs/callback","custom:ps_subscription_id":"sub_1Ox8qVHYwMEJB468QFZSLy8G","custom:setup":"true","custom:subdomain":"allanoricil","email":"alllanoricil@gmail.com","email_verified":"false","exp":"Wed Apr 10 01:32:40 UTC 2024","family_name":"Oricil","given_name":"Allan Gonçalves Gomes","iat":"Tue Apr 09 01:32:40 UTC 2024","identities":"{\"dateCreated\":\"1711116709707\",\"userId\":\"104116911174246157595\",\"providerName\":\"Google\",\"providerType\":\"Google\",\"issuer\":null,\"primary\":\"true\"}","iss":"https://cognito-idp.us-east-2.amazonaws.com/us-east-2_8BeXJn63d","jti":"fc5a7493-1c9d-4e20-a0eb-8cc33c9f17f2","origin_jti":"03295908-ab23-4670-85c5-f94768b6d78f","sub":"814b1560-b051-7040-b58b-b218b42ce59b","token_use":"id"}},"deploymentId":"ube8ur","domainName":"vjydkvkskl.execute-api.us-east-2.amazonaws.com","domainPrefix":"vjydkvkskl","extendedRequestId":"V70c7FCmiYcED6Q=","httpMethod":"GET","identity":{"accessKey":null,"accountId":null,"caller":null,"cognitoAuthenticationProvider":null,"cognitoAuthenticationType":null,"cognitoIdentityId":null,"cognitoIdentityPoolId":null,"principalOrgId":null,"sourceIp":"138.94.127.146","user":null,"userAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36","userArn":null},"operationName":"This method can be used to list all instances from an account","path":"/v1/instances","protocol":"HTTP/1.1","requestId":"4b2c442e-a405-4659-8e62-6a58bb578993","requestTime":"09/Apr/2024:01:32:40 +0000","requestTimeEpoch":1712626360898,"resourceId":"t4baq8","resourcePath":"/instances","stage":"v1"},"requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","resource":"/instances","stageVariables":null,"timestamp":"2024-04-09T01:32:42.513Z"}
)
Attributes:
     -> id: Str(38192844131982194366912752325268730063823134327222042626)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #3
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.572 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"------------- headers -------------","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.572Z"}
)
Attributes:
     -> id: Str(38192844132004495112111282948410265782095782688728023043)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #4
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.572 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"origin","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.572Z"}
)
Attributes:
     -> id: Str(38192844132004495112111282948410265782095782688728023044)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #5
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.573 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"0":"http://localhost:8080","1":"https://node-ready.com","2":"https://instance-provisioning-admin-api-doc-bucket.s3.us-east-2.amazonaws.com","3":"https://instance-provisioning-api-doc-bucket.s3.us-east-2.amazonaws.com","level":"debug","message":"allowedOrigins","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.572Z"}
)
Attributes:
     -> id: Str(38192844132026795857309813571551801500368431050234003461)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #6
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.573 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":{"Access-Control-Allow-Credentials":true,"Access-Control-Allow-Headers":"Content-Type,Authorization,X-Amz-Date,X-Api-Key,X-Amz-Security-Token,X-Amz-User-Agent","Access-Control-Allow-Methods":"OPTIONS,GET","Access-Control-Allow-Origin":"https://node-ready.com","Content-Security-Policy":"default-src 'none'; connect-src 'self'; img-src 'none'; script-src 'self'; style-src 'none'; object-src 'none'; frame-src 'none'","Content-Type":"application/json","Cross-Origin-Embedder-Policy":"require-corp","Cross-Origin-Opener-Policy":"same-origin","Referrer-Policy":"no-referrer","Strict-Transport-Security":"max-age=63072000; includeSubdomains; preload","X-Content-Type-Options":"nosniff","X-Frame-Options":"DENY","X-Permitted-Cross-Domain-Policies":"none","X-XSS-Protection":"1; mode=block"},"requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.573Z"}
)
Attributes:
     -> id: Str(38192844132026795857309813571551801500368431050234003462)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #7
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.573 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"------------- headers -------------","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.573Z"}
)
Attributes:
     -> id: Str(38192844132026795857309813571551801500368431050234003463)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #8
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.573 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"fetching secrets from secrets manager","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.573Z"}
)
Attributes:
     -> id: Str(38192844132026795857309813571551801500368431050234003464)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #9
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.433 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"*****","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.433Z"}
)
Attributes:
     -> id: Str(38192844151205436728046149473272519214846021945377161225)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #10
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.433 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"creating new prisma client","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.433Z"}
)
Attributes:
     -> id: Str(38192844151205436728046149473272519214846021945377161226)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #11
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.434 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"*****","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.433Z"}
)
Attributes:
     -> id: Str(38192844151227737473244680096414054933118670306883141643)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #12
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.434 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.434Z"}
)
Attributes:
     -> id: Str(38192844151227737473244680096414054933118670306883141644)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #13
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.594 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"setting up sentry","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.591Z"}
)
Attributes:
     -> id: Str(38192844154795856705009579799059769856742408147840008205)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #14
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.752 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"finding all instances for: 814b1560-b051-7040-b58b-b218b42ce59b","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.752Z"}
)
Attributes:
     -> id: Str(38192844158319374446377418255422413343820849265784913934)
Trace ID: 
Span ID: 
Flags: 0
	{"kind": "exporter", "data_type": "logs", "name": "debug"}
2024-04-09T01:32:58.361Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:32:58.381Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}

Expected Result

  • All log events from a log stream should be exported
  • All log streams of a log group that still exist in AWS should be exported

Actual Result

  • Only a few of the log events in a log stream are exported
  • Only the most recent log stream of a log group is exported

Collector version

0.97.0

Environment information

Environment

NAME="Amazon Linux"
VERSION="2023"
ID="amzn"
ID_LIKE="fedora"
VERSION_ID="2023"
PLATFORM_ID="platform:al2023"
PRETTY_NAME="Amazon Linux 2023"
ANSI_COLOR="0;33"
CPE_NAME="cpe:2.3:o:amazon:amazon_linux:2023"
HOME_URL="https://aws.amazon.com/linux/"
BUG_REPORT_URL="https://github.com/amazonlinux/amazon-linux-2023"
SUPPORT_END="2028-03-15"

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318
  hostmetrics:
    scrapers:
      cpu:
      disk:
      filesystem:
      load:
      memory:
  prometheus:
    config:
      scrape_configs:
        - job_name: otel-collector
          scrape_interval: 5s
          static_configs:
            - targets: [localhost:8888]
  awscloudwatch:
    region: us-east-2
    logs:
      poll_interval: 10s
      max_events_per_request: 1000
      groups:
        autodiscover:
          limit: 100
          prefix: /aws/lambda/get-instances-api-function
processors:
  batch:
    send_batch_size: 10
    timeout: 1s
  # Ref: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/processor/resourcedetectionprocessor/README.md
  resourcedetection:
    detectors: [env, system] # include ec2 for AWS, gcp for GCP and azure for Azure.
    # Using OTEL_RESOURCE_ATTRIBUTES envvar, env detector adds custom labels.
    timeout: 2s
    system:
      hostname_sources: [os] # alternatively, use [dns,os] for setting FQDN as host.name and os as fallback
extensions:
  health_check: {}
  zpages: {}
exporters:
  otlp:
    endpoint: "MY_OTLP_GATEWAY_IP_ADDRESS:4317"
    tls:
      insecure: true
  debug:
    # verbosity of the logging export: detailed, normal, basic
    verbosity: detailed
service:
  telemetry:
    logs:
      level: debug
    metrics:
      address: 0.0.0.0:8888
  extensions: [health_check, zpages]
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
    metrics/internal:
      receivers: [prometheus, hostmetrics]
      processors: [resourcedetection, batch]
      exporters: [otlp]
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
    logs:
      receivers: [otlp, awscloudwatch]
      processors: [batch]
      exporters: [otlp, debug]

Log output

2024-04-09T01:32:28.424Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:32:38.361Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:32:38.386Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:32:48.362Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:32:48.382Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:32:48.468Z	info	LogsExporter	{"kind": "exporter", "data_type": "logs", "name": "debug", "resource logs": 1, "log records": 15}
2024-04-09T01:32:48.469Z	info	ResourceLog #0
Resource SchemaURL: 
Resource attributes:
     -> aws.region: Str(us-east-2)
     -> cloudwatch.log.group.name: Str(/aws/lambda/get-instances-api-function)
     -> cloudwatch.log.stream: Str(2024/04/09/[$LATEST]9a79cb34998a4037bf1e3ff5df35fea5)
ScopeLogs #0
ScopeLogs SchemaURL: 
InstrumentationScope  
LogRecord #0
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:41.354 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(INIT_START Runtime Version: nodejs:18.v26	Runtime Version ARN: arn:aws:lambda:us-east-2::runtime:0cdcfbdefbc5e7d3343f73c2e2dd3cba17d61dea0686b404502a0c9ce83931b9
)
Attributes:
     -> id: Str(38192844104842187460300983962019760926010078374443876352)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #1
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.498 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str(START RequestId: 9f7b878f-a62a-47d1-be33-9526adf64f30 Version: $LATEST
)
Attributes:
     -> id: Str(38192844130354239967420016835936622629919803937285472257)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #2
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.571 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"body":null,"headers":{"Accept":"application/json, text/plain, */*","Accept-Encoding":"gzip, deflate, br, zstd","Accept-Language":"en-GB,en-US;q=0.9,en;q=0.8,pt;q=0.7","Authorization":"eyJraWQiOiJvSUdwUUVhaFIxOWlMc05GbWhpVHpsRkNYUEx3eDcxNDN0S3hGaEFUNEJBPSIsImFsZyI6IlJTMjU2In0.eyJhdF9oYXNoIjoiUTZHdUZyLXAtREJvd3ZacElwZW81QSIsInN1YiI6IjgxNGIxNTYwLWIwNTEtNzA0MC1iNThiLWIyMThiNDJjZTU5YiIsImN1c3RvbTpzdWJkb21haW4iOiJhbGxhbm9yaWNpbCIsImNvZ25pdG86Z3JvdXBzIjpbInVzLWVhc3QtMl84QmVYSm42M2RfR29vZ2xlIiwiZDFkNDI5NTAtZGFiYi00M2VmLWI3Y2ItMDY3ODRiOGNmNDViIl0sImlzcyI6Imh0dHBzOlwvXC9jb2duaXRvLWlkcC51cy1lYXN0LTIuYW1hem9uYXdzLmNvbVwvdXMtZWFzdC0yXzhCZVhKbjYzZCIsImlkZW50aXRpZXMiOlt7ImRhdGVDcmVhdGVkIjoiMTcxMTExNjcwOTcwNyIsInVzZXJJZCI6IjEwNDExNjkxMTE3NDI0NjE1NzU5NSIsInByb3ZpZGVyTmFtZSI6Ikdvb2dsZSIsInByb3ZpZGVyVHlwZSI6Ikdvb2dsZSIsImlzc3VlciI6bnVsbCwicHJpbWFyeSI6InRydWUifV0sImF1dGhfdGltZSI6MTcxMjQzNTE1MywiZXhwIjoxNzEyNzEyNzYwLCJpYXQiOjE3MTI2MjYzNjAsImp0aSI6ImZjNWE3NDkzLTFjOWQtNGUyMC1hMGViLThjYzMzYzlmMTdmMiIsImVtYWlsIjoiYWxsbGFub3JpY2lsQGdtYWlsLmNvbSIsImN1c3RvbTpwc19zdWJzY3JpcHRpb25faWQiOiJzdWJfMU94OHFWSFl3TUVKQjQ2OFFGWlNMeThHIiwiZW1haWxfdmVyaWZpZWQiOmZhbHNlLCJjdXN0b206cHNfY3VzdG9tZXJfaWQiOiJjdXNfUG1pSFowanV4RnAzTHYiLCJjdXN0b206b3JnYW5pemF0aW9uX2lkIjoiZDFkNDI5NTAtZGFiYi00M2VmLWI3Y2ItMDY3ODRiOGNmNDViIiwiY3VzdG9tOnBzX3JlZGlyZWN0X3VybCI6Imh0dHA6XC9cL25vZGUtcmVhZHkuY29tXC9pbnN0YW5jZS1wcm92aXNpb25pbmctam9ic1wvY2FsbGJhY2siLCJjdXN0b206b3JnYW5pemF0aW9uX25hbWUiOiJBbGxhbiBPcmljaWwiLCJjb2duaXRvOnVzZXJuYW1lIjoiR29vZ2xlXzEwNDExNjkxMTE3NDI0NjE1NzU5NSIsImN1c3RvbTpzZXR1cCI6InRydWUiLCJnaXZlbl9uYW1lIjoiQWxsYW4gR29uw6dhbHZlcyBHb21lcyIsIm9yaWdpbl9qdGkiOiIwMzI5NTkwOC1hYjIzLTQ2NzAtODVjNS1mOTQ3NjhiNmQ3OGYiLCJhdWQiOiIxazMxY2Q4NzRoMTRkNHU4aTcyNGlkNDJwbiIsImN1c3RvbTpjcmVhdGVfaW5zdGFuY2UiOiJ0cnVlIiwidG9rZW5fdXNlIjoiaWQiLCJmYW1pbHlfbmFtZSI6Ik9yaWNpbCJ9.Srb-vdOkRbkNcq0ZvHUDYT1cl9GJ-F_XM8est0yzehoOKsfD72e4_qhWcwCRGCp1ZRd9KmFxp7rhQiqzTjccXjNNSORl823c8fM3sDLSpUF6-llLvdf9zDaH5JZxOwMX-wIfJmtRCgWjYmCEJpZatPHG2M3e8jVvAFz
B-EAaEs9AfR6Re5YhVtzy5aUcHUxkMw_mCtrEmg2V-zY2SEyIg5m6EMQ6J5AfKsDe3nteM28Sa0IoWOxGf5KhBySUC49RawQCfO0vYNYXC3PVJNfmTaKJ8P2g4kXtCpi11RS7ONMth2ynrPg-eWTiI_I6g9pQdvsAruomontiYrvH4kknNw","CloudFront-Forwarded-Proto":"https","CloudFront-Is-Desktop-Viewer":"true","CloudFront-Is-Mobile-Viewer":"false","CloudFront-Is-SmartTV-Viewer":"false","CloudFront-Is-Tablet-Viewer":"false","CloudFront-Viewer-ASN":"269073","CloudFront-Viewer-Country":"BR","Host":"vjydkvkskl.execute-api.us-east-2.amazonaws.com","Referer":"https://node-ready.com/","User-Agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36","Via":"2.0 6fe8e2d5db6a80353eb675f61c249810.cloudfront.net (CloudFront)","X-Amz-Cf-Id":"kgg1BBFHKR4861YdGx-Kcf1-149AGIp-EUSUm9_J4MFT5WvC0cH3PA==","X-Amzn-Trace-Id":"Root=1-66149ab8-0a47130e4ff50bbf715025c0","X-Forwarded-For":"138.94.127.146, 15.158.19.244","X-Forwarded-Port":"443","X-Forwarded-Proto":"https","origin":"https://node-ready.com","sec-ch-ua":"\"Brave\";v=\"123\", \"Not:A-Brand\";v=\"8\", \"Chromium\";v=\"123\"","sec-ch-ua-mobile":"?0","sec-ch-ua-platform":"\"macOS\"","sec-fetch-dest":"empty","sec-fetch-mode":"cors","sec-fetch-site":"cross-site","sec-gpc":"1"},"httpMethod":"GET","isBase64Encoded":false,"level":"info","message":"event","multiValueHeaders":{"Accept":["application/json, text/plain, */*"],"Accept-Encoding":["gzip, deflate, br, 
zstd"],"Accept-Language":["en-GB,en-US;q=0.9,en;q=0.8,pt;q=0.7"],"Authorization":["eyJraWQiOiJvSUdwUUVhaFIxOWlMc05GbWhpVHpsRkNYUEx3eDcxNDN0S3hGaEFUNEJBPSIsImFsZyI6IlJTMjU2In0.eyJhdF9oYXNoIjoiUTZHdUZyLXAtREJvd3ZacElwZW81QSIsInN1YiI6IjgxNGIxNTYwLWIwNTEtNzA0MC1iNThiLWIyMThiNDJjZTU5YiIsImN1c3RvbTpzdWJkb21haW4iOiJhbGxhbm9yaWNpbCIsImNvZ25pdG86Z3JvdXBzIjpbInVzLWVhc3QtMl84QmVYSm42M2RfR29vZ2xlIiwiZDFkNDI5NTAtZGFiYi00M2VmLWI3Y2ItMDY3ODRiOGNmNDViIl0sImlzcyI6Imh0dHBzOlwvXC9jb2duaXRvLWlkcC51cy1lYXN0LTIuYW1hem9uYXdzLmNvbVwvdXMtZWFzdC0yXzhCZVhKbjYzZCIsImlkZW50aXRpZXMiOlt7ImRhdGVDcmVhdGVkIjoiMTcxMTExNjcwOTcwNyIsInVzZXJJZCI6IjEwNDExNjkxMTE3NDI0NjE1NzU5NSIsInByb3ZpZGVyTmFtZSI6Ikdvb2dsZSIsInByb3ZpZGVyVHlwZSI6Ikdvb2dsZSIsImlzc3VlciI6bnVsbCwicHJpbWFyeSI6InRydWUifV0sImF1dGhfdGltZSI6MTcxMjQzNTE1MywiZXhwIjoxNzEyNzEyNzYwLCJpYXQiOjE3MTI2MjYzNjAsImp0aSI6ImZjNWE3NDkzLTFjOWQtNGUyMC1hMGViLThjYzMzYzlmMTdmMiIsImVtYWlsIjoiYWxsbGFub3JpY2lsQGdtYWlsLmNvbSIsImN1c3RvbTpwc19zdWJzY3JpcHRpb25faWQiOiJzdWJfMU94OHFWSFl3TUVKQjQ2OFFGWlNMeThHIiwiZW1haWxfdmVyaWZpZWQiOmZhbHNlLCJjdXN0b206cHNfY3VzdG9tZXJfaWQiOiJjdXNfUG1pSFowanV4RnAzTHYiLCJjdXN0b206b3JnYW5pemF0aW9uX2lkIjoiZDFkNDI5NTAtZGFiYi00M2VmLWI3Y2ItMDY3ODRiOGNmNDViIiwiY3VzdG9tOnBzX3JlZGlyZWN0X3VybCI6Imh0dHA6XC9cL25vZGUtcmVhZHkuY29tXC9pbnN0YW5jZS1wcm92aXNpb25pbmctam9ic1wvY2FsbGJhY2siLCJjdXN0b206b3JnYW5pemF0aW9uX25hbWUiOiJBbGxhbiBPcmljaWwiLCJjb2duaXRvOnVzZXJuYW1lIjoiR29vZ2xlXzEwNDExNjkxMTE3NDI0NjE1NzU5NSIsImN1c3RvbTpzZXR1cCI6InRydWUiLCJnaXZlbl9uYW1lIjoiQWxsYW4gR29uw6dhbHZlcyBHb21lcyIsIm9yaWdpbl9qdGkiOiIwMzI5NTkwOC1hYjIzLTQ2NzAtODVjNS1mOTQ3NjhiNmQ3OGYiLCJhdWQiOiIxazMxY2Q4NzRoMTRkNHU4aTcyNGlkNDJwbiIsImN1c3RvbTpjcmVhdGVfaW5zdGFuY2UiOiJ0cnVlIiwidG9rZW5fdXNlIjoiaWQiLCJmYW1pbHlfbmFtZSI6Ik9yaWNpbCJ9.Srb-vdOkRbkNcq0ZvHUDYT1cl9GJ-F_XM8est0yzehoOKsfD72e4_qhWcwCRGCp1ZRd9KmFxp7rhQiqzTjccXjNNSORl823c8fM3sDLSpUF6-llLvdf9zDaH5JZxOwMX-wIfJmtRCgWjYmCEJpZatPHG2M3e8jVvAFzB-EAaEs9AfR6Re5YhVtzy5aUcHUxkMw_mCtrEmg2V-zY2SEyIg5m6EMQ6J5AfKsDe3nteM28Sa0IoWOxGf5KhBySUC49RawQCfO0vYNYXC3PVJNfm
TaKJ8P2g4kXtCpi11RS7ONMth2ynrPg-eWTiI_I6g9pQdvsAruomontiYrvH4kknNw"],"CloudFront-Forwarded-Proto":["https"],"CloudFront-Is-Desktop-Viewer":["true"],"CloudFront-Is-Mobile-Viewer":["false"],"CloudFront-Is-SmartTV-Viewer":["false"],"CloudFront-Is-Tablet-Viewer":["false"],"CloudFront-Viewer-ASN":["269073"],"CloudFront-Viewer-Country":["BR"],"Host":["vjydkvkskl.execute-api.us-east-2.amazonaws.com"],"Referer":["https://node-ready.com/"],"User-Agent":["Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36"],"Via":["2.0 6fe8e2d5db6a80353eb675f61c249810.cloudfront.net (CloudFront)"],"X-Amz-Cf-Id":["kgg1BBFHKR4861YdGx-Kcf1-149AGIp-EUSUm9_J4MFT5WvC0cH3PA=="],"X-Amzn-Trace-Id":["Root=1-66149ab8-0a47130e4ff50bbf715025c0"],"X-Forwarded-For":["138.94.127.146, 15.158.19.244"],"X-Forwarded-Port":["443"],"X-Forwarded-Proto":["https"],"origin":["https://node-ready.com"],"sec-ch-ua":["\"Brave\";v=\"123\", \"Not:A-Brand\";v=\"8\", \"Chromium\";v=\"123\""],"sec-ch-ua-mobile":["?0"],"sec-ch-ua-platform":["\"macOS\""],"sec-fetch-dest":["empty"],"sec-fetch-mode":["cors"],"sec-fetch-site":["cross-site"],"sec-gpc":["1"]},"multiValueQueryStringParameters":null,"path":"/instances","pathParameters":null,"queryStringParameters":null,"requestContext":{"accountId":"845044614340","apiId":"vjydkvkskl","authorizer":{"claims":{"at_hash":"Q6GuFr-p-DBowvZpIpeo5A","aud":"1k31cd874h14d4u8i724id42pn","auth_time":"1712435153","cognito:groups":"us-east-2_8BeXJn63d_Google,d1d42950-dabb-43ef-b7cb-06784b8cf45b","cognito:username":"Google_104116911174246157595","custom:create_instance":"true","custom:organization_id":"d1d42950-dabb-43ef-b7cb-06784b8cf45b","custom:organization_name":"Allan 
Oricil","custom:ps_customer_id":"cus_PmiHZ0juxFp3Lv","custom:ps_redirect_url":"http://node-ready.com/instance-provisioning-jobs/callback","custom:ps_subscription_id":"sub_1Ox8qVHYwMEJB468QFZSLy8G","custom:setup":"true","custom:subdomain":"allanoricil","email":"alllanoricil@gmail.com","email_verified":"false","exp":"Wed Apr 10 01:32:40 UTC 2024","family_name":"Oricil","given_name":"Allan Gonçalves Gomes","iat":"Tue Apr 09 01:32:40 UTC 2024","identities":"{\"dateCreated\":\"1711116709707\",\"userId\":\"104116911174246157595\",\"providerName\":\"Google\",\"providerType\":\"Google\",\"issuer\":null,\"primary\":\"true\"}","iss":"https://cognito-idp.us-east-2.amazonaws.com/us-east-2_8BeXJn63d","jti":"fc5a7493-1c9d-4e20-a0eb-8cc33c9f17f2","origin_jti":"03295908-ab23-4670-85c5-f94768b6d78f","sub":"814b1560-b051-7040-b58b-b218b42ce59b","token_use":"id"}},"deploymentId":"ube8ur","domainName":"vjydkvkskl.execute-api.us-east-2.amazonaws.com","domainPrefix":"vjydkvkskl","extendedRequestId":"V70c7FCmiYcED6Q=","httpMethod":"GET","identity":{"accessKey":null,"accountId":null,"caller":null,"cognitoAuthenticationProvider":null,"cognitoAuthenticationType":null,"cognitoIdentityId":null,"cognitoIdentityPoolId":null,"principalOrgId":null,"sourceIp":"138.94.127.146","user":null,"userAgent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.0.0 Safari/537.36","userArn":null},"operationName":"This method can be used to list all instances from an account","path":"/v1/instances","protocol":"HTTP/1.1","requestId":"4b2c442e-a405-4659-8e62-6a58bb578993","requestTime":"09/Apr/2024:01:32:40 +0000","requestTimeEpoch":1712626360898,"resourceId":"t4baq8","resourcePath":"/instances","stage":"v1"},"requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","resource":"/instances","stageVariables":null,"timestamp":"2024-04-09T01:32:42.513Z"}
)
Attributes:
     -> id: Str(38192844131982194366912752325268730063823134327222042626)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #3
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.572 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"------------- headers -------------","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.572Z"}
)
Attributes:
     -> id: Str(38192844132004495112111282948410265782095782688728023043)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #4
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.572 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"origin","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.572Z"}
)
Attributes:
     -> id: Str(38192844132004495112111282948410265782095782688728023044)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #5
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.573 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"0":"http://localhost:8080","1":"https://node-ready.com","2":"https://instance-provisioning-admin-api-doc-bucket.s3.us-east-2.amazonaws.com","3":"https://instance-provisioning-api-doc-bucket.s3.us-east-2.amazonaws.com","level":"debug","message":"allowedOrigins","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.572Z"}
)
Attributes:
     -> id: Str(38192844132026795857309813571551801500368431050234003461)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #6
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.573 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":{"Access-Control-Allow-Credentials":true,"Access-Control-Allow-Headers":"Content-Type,Authorization,X-Amz-Date,X-Api-Key,X-Amz-Security-Token,X-Amz-User-Agent","Access-Control-Allow-Methods":"OPTIONS,GET","Access-Control-Allow-Origin":"https://node-ready.com","Content-Security-Policy":"default-src 'none'; connect-src 'self'; img-src 'none'; script-src 'self'; style-src 'none'; object-src 'none'; frame-src 'none'","Content-Type":"application/json","Cross-Origin-Embedder-Policy":"require-corp","Cross-Origin-Opener-Policy":"same-origin","Referrer-Policy":"no-referrer","Strict-Transport-Security":"max-age=63072000; includeSubdomains; preload","X-Content-Type-Options":"nosniff","X-Frame-Options":"DENY","X-Permitted-Cross-Domain-Policies":"none","X-XSS-Protection":"1; mode=block"},"requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.573Z"}
)
Attributes:
     -> id: Str(38192844132026795857309813571551801500368431050234003462)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #7
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.573 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"------------- headers -------------","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.573Z"}
)
Attributes:
     -> id: Str(38192844132026795857309813571551801500368431050234003463)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #8
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:42.573 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"fetching secrets from secrets manager","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:42.573Z"}
)
Attributes:
     -> id: Str(38192844132026795857309813571551801500368431050234003464)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #9
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.433 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"*****","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.433Z"}
)
Attributes:
     -> id: Str(38192844151205436728046149473272519214846021945377161225)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #10
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.433 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"creating new prisma client","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.433Z"}
)
Attributes:
     -> id: Str(38192844151205436728046149473272519214846021945377161226)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #11
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.434 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"*****","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.433Z"}
)
Attributes:
     -> id: Str(38192844151227737473244680096414054933118670306883141643)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #12
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.434 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"debug","message":"","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.434Z"}
)
Attributes:
     -> id: Str(38192844151227737473244680096414054933118670306883141644)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #13
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.594 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"setting up sentry","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.591Z"}
)
Attributes:
     -> id: Str(38192844154795856705009579799059769856742408147840008205)
Trace ID: 
Span ID: 
Flags: 0
LogRecord #14
ObservedTimestamp: 2024-04-09 01:32:48.468672478 +0000 UTC
Timestamp: 2024-04-09 01:32:43.752 +0000 UTC
SeverityText: 
SeverityNumber: Unspecified(0)
Body: Str({"level":"info","message":"finding all instances for: 814b1560-b051-7040-b58b-b218b42ce59b","requestId":"9f7b878f-a62a-47d1-be33-9526adf64f30","timestamp":"2024-04-09T01:32:43.752Z"}
)
Attributes:
     -> id: Str(38192844158319374446377418255422413343820849265784913934)
Trace ID: 
Span ID: 
Flags: 0
	{"kind": "exporter", "data_type": "logs", "name": "debug"}
2024-04-09T01:32:58.361Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:32:58.381Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:33:08.362Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:33:08.382Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:33:18.362Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:33:18.380Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:33:28.362Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:33:28.380Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:33:38.362Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:33:38.379Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:33:48.361Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:33:48.382Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}
2024-04-09T01:33:58.362Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-09T01:33:58.385Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 113613\n}"}

Additional context

I'm using SigNoz to view my logs.

Compare the images below to verify that log events are missing for my log stream named 2024/04/09/[$LATEST]9a79cb34998a4037bf1e3ff5df35fea5

image

In SigNoz, I filtered the logs by log stream name
image

@AllanOricil AllanOricil added bug Something isn't working needs triage New item requiring triage labels Apr 9, 2024

github-actions bot commented Apr 9, 2024

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@AllanOricil

AllanOricil commented Apr 9, 2024

It also isn't iterating over all the log groups in my AWS account. It polls the same log groups over and over, even ones that no longer have any log streams, and it never tries any log groups other than the ones shown below.

image
2024-04-09T17:01:47.395Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/apigateway/w4r1tjdtbj/v1:*\",\n  CreationTime: 1704593418144,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/apigateway/w4r1tjdtbj/v1\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/apigateway/w4r1tjdtbj/v1\",\n  MetricFilterCount: 0,\n  StoredBytes: 33873236\n}"}
2024-04-09T17:01:47.395Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/apigateway/welcome:*\",\n  CreationTime: 1686097976313,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/apigateway/welcome\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/apigateway/welcome\",\n  MetricFilterCount: 0,\n  StoredBytes: 110\n}"}
2024-04-09T17:01:47.395Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/DockerBuild:*\",\n  CreationTime: 1690469165284,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/DockerBuild\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/DockerBuild\",\n  MetricFilterCount: 0,\n  StoredBytes: 61572\n}"}
2024-04-09T17:01:47.395Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/LLEProjectE0AA5ECD-AICRGHcCk5hn:*\",\n  CreationTime: 1711131852056,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/LLEProjectE0AA5ECD-AICRGHcCk5hn\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/LLEProjectE0AA5ECD-AICRGHcCk5hn\",\n  MetricFilterCount: 0,\n  StoredBytes: 2808\n}"}
2024-04-09T17:01:47.395Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelineLintPipelin-ZfBXxNdpfAzC:*\",\n  CreationTime: 1705729018015,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelineLintPipelin-ZfBXxNdpfAzC\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyBucketPipelineLintPipelin-ZfBXxNdpfAzC\",\n  MetricFilterCount: 0,\n  StoredBytes: 13570\n}"}
2024-04-09T17:01:47.395Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelinePublishPipe-kmL0P9FPfuzQ:*\",\n  CreationTime: 1705730226062,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelinePublishPipe-kmL0P9FPfuzQ\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyBucketPipelinePublishPipe-kmL0P9FPfuzQ\",\n  MetricFilterCount: 0,\n  StoredBytes: 43284\n}"}
2024-04-09T17:01:47.396Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelineTestPipelin-UHa4TfoMAlV3:*\",\n  CreationTime: 1705729172049,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelineTestPipelin-UHa4TfoMAlV3\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyBucketPipelineTestPipelin-UHa4TfoMAlV3\",\n  MetricFilterCount: 0,\n  StoredBytes: 367152\n}"}
2024-04-09T17:01:47.396Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyPipeline-selfupdate:*\",\n  CreationTime: 1690237246405,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyPipeline-selfupdate\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyPipeline-selfupdate\",\n  MetricFilterCount: 0,\n  StoredBytes: 17016\n}"}
2024-04-09T17:01:47.396Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo:*\",\n  CreationTime: 1686462158018,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo\",\n  MetricFilterCount: 0,\n  StoredBytes: 11720\n}"}
2024-04-09T17:01:47.396Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyProject39F7B0AE-8K3Qo5kQ14Sd:*\",\n  CreationTime: 1686767847677,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyProject39F7B0AE-8K3Qo5kQ14Sd\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyProject39F7B0AE-8K3Qo5kQ14Sd\",\n  MetricFilterCount: 0,\n  StoredBytes: 2253\n}"}
2024-04-09T17:01:56.753Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo:*\",\n  CreationTime: 1686462158018,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo\",\n  MetricFilterCount: 0,\n  StoredBytes: 11720\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/apigateway/w4r1tjdtbj/v1:*\",\n  CreationTime: 1704593418144,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/apigateway/w4r1tjdtbj/v1\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/apigateway/w4r1tjdtbj/v1\",\n  MetricFilterCount: 0,\n  StoredBytes: 33873236\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/apigateway/welcome:*\",\n  CreationTime: 1686097976313,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/apigateway/welcome\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/apigateway/welcome\",\n  MetricFilterCount: 0,\n  StoredBytes: 110\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/DockerBuild:*\",\n  CreationTime: 1690469165284,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/DockerBuild\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/DockerBuild\",\n  MetricFilterCount: 0,\n  StoredBytes: 61572\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/LLEProjectE0AA5ECD-AICRGHcCk5hn:*\",\n  CreationTime: 1711131852056,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/LLEProjectE0AA5ECD-AICRGHcCk5hn\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/LLEProjectE0AA5ECD-AICRGHcCk5hn\",\n  MetricFilterCount: 0,\n  StoredBytes: 2808\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelineLintPipelin-ZfBXxNdpfAzC:*\",\n  CreationTime: 1705729018015,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelineLintPipelin-ZfBXxNdpfAzC\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyBucketPipelineLintPipelin-ZfBXxNdpfAzC\",\n  MetricFilterCount: 0,\n  StoredBytes: 13570\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelinePublishPipe-kmL0P9FPfuzQ:*\",\n  CreationTime: 1705730226062,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelinePublishPipe-kmL0P9FPfuzQ\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyBucketPipelinePublishPipe-kmL0P9FPfuzQ\",\n  MetricFilterCount: 0,\n  StoredBytes: 43284\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelineTestPipelin-UHa4TfoMAlV3:*\",\n  CreationTime: 1705729172049,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyBucketPipelineTestPipelin-UHa4TfoMAlV3\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyBucketPipelineTestPipelin-UHa4TfoMAlV3\",\n  MetricFilterCount: 0,\n  StoredBytes: 367152\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyPipeline-selfupdate:*\",\n  CreationTime: 1690237246405,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyPipeline-selfupdate\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyPipeline-selfupdate\",\n  MetricFilterCount: 0,\n  StoredBytes: 17016\n}"}
2024-04-09T17:02:08.930Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch/2", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo:*\",\n  CreationTime: 1686462158018,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/codebuild/MyProject39F7B0AE-1WinSXoew9Qo\",\n  MetricFilterCount: 0,\n  StoredBytes: 11720\n}"}
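A note on the repeated first page above: DescribeLogGroups is paginated, and each response's nextToken has to be fed into the following request, otherwise discovery sees the same first page forever. A minimal, self-contained sketch of the loop (the API stand-ins below are hypothetical, not the receiver's real code):

```go
package main

import "fmt"

// page stands in for a DescribeLogGroups response.
type page struct {
	Groups    []string
	NextToken *string
}

// fakeDescribeLogGroups returns two groups per call and a continuation
// token until the list is exhausted (a stand-in for the AWS API).
func fakeDescribeLogGroups(all []string, token *string) page {
	start := 0
	if token != nil {
		fmt.Sscanf(*token, "%d", &start)
	}
	end := start + 2
	if end >= len(all) {
		return page{Groups: all[start:]} // last page: no NextToken
	}
	next := fmt.Sprintf("%d", end)
	return page{Groups: all[start:end], NextToken: &next}
}

// discoverAll follows NextToken until the API stops returning one.
func discoverAll(all []string) []string {
	var out []string
	var token *string
	for {
		p := fakeDescribeLogGroups(all, token)
		out = append(out, p.Groups...)
		if p.NextToken == nil { // last page reached
			break
		}
		token = p.NextToken // pass the token forward, never an empty string
	}
	return out
}

func main() {
	groups := []string{"a", "b", "c", "d", "e"}
	fmt.Println(discoverAll(groups))
}
```

If the loop instead restarted with a nil (or empty) token on every poll, only the first page of groups would ever be discovered, which matches the behavior described in this comment.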

@AllanOricil

@djaglowski @schmikei could you help me set up my dev environment so that I can fix this?

@AllanOricil

Is #32057 the fix for this?

@schmikei

I think it might be related. If you'd be willing to try out the PR, feel free to see if it resolves your issue. We'd very much appreciate it!

@AllanOricil

@schmikei how can I do it? I'm using the otel-collector binary.

@schmikei

@schmikei how can I do it? I'm using the otel-collector binary.

You can check out the fork and run make otelcontribcol to build a binary!

@AllanOricil

@schmikei can I build the Linux dist on a Mac?

@AllanOricil

AllanOricil commented Apr 10, 2024

Never mind, @schmikei.
I asked because my t2.micro died while building it. I created a t3.medium and I'm now trying to build it again. Let's hope it is enough.

@AllanOricil

@schmikei did you get this error while building? I checked out your branch and then ran make otelcontribcol
image

@AllanOricil

google/cadvisor#3508

@AllanOricil

AllanOricil commented Apr 11, 2024

@schmikei I spent the day building this, but it did not work. :/

I used the same config.yaml with the binary I built from your branch, and the two binaries produced different results.

Below is the output when using 0.97.0

2024-04-11T04:59:12.112Z	info	prometheusreceiver@v0.97.0/metrics_receiver.go:299	Starting scrape manager	{"kind": "receiver", "name": "prometheus", "data_type": "metrics"}
2024-04-11T04:59:21.971Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-11T04:59:22.003Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 137470\n}"}
2024-04-11T04:59:31.970Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-11T04:59:31.994Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 137470\n}"}
2024-04-11T04:59:41.971Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-11T04:59:41.997Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:323	discovered log group	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "log group": "{\n  Arn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function:*\",\n  CreationTime: 1701809123724,\n  LogGroupArn: \"arn:aws:logs:us-east-2:845044614340:log-group:/aws/lambda/get-instances-api-function\",\n  LogGroupClass: \"STANDARD\",\n  LogGroupName: \"/aws/lambda/get-instances-api-function\",\n  MetricFilterCount: 0,\n  RetentionInDays: 7,\n  StoredBytes: 137470\n}"}

Below you can see the result I got from the new binary

2024-04-11T04:58:36.200Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-11T04:58:36.200Z	error	awscloudwatchreceiver@v0.97.0/logs.go:165	unable to perform discovery of log groups	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "error": "unable to list log groups: InvalidParameter: 1 validation error(s) found.\n- minimum field size of 1, DescribeLogGroupsInput.NextToken.\n"}
github.com/open-telemetry/opentelemetry-collector-contrib/receiver/awscloudwatchreceiver.(*logsReceiver).startPolling
	github.com/open-telemetry/opentelemetry-collector-contrib/receiver/awscloudwatchreceiver@v0.97.0/logs.go:165
2024-04-11T04:58:46.201Z	debug	awscloudwatchreceiver@v0.97.0/logs.go:287	attempting to discover log groups.	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "limit": 100}
2024-04-11T04:58:46.201Z	error	awscloudwatchreceiver@v0.97.0/logs.go:165	unable to perform discovery of log groups	{"kind": "receiver", "name": "awscloudwatch", "data_type": "logs", "error": "unable to list log groups: InvalidParameter: 1 validation error(s) found.\n- minimum field size of 1, DescribeLogGroupsInput.NextToken.\n"}
github.com/open-telemetry/opentelemetry-collector-contrib/receiver/awscloudwatchreceiver.(*logsReceiver).startPolling
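The validation error above (`minimum field size of 1, DescribeLogGroupsInput.NextToken`) suggests an empty continuation token is being passed back to `DescribeLogGroups`. This is not the receiver's actual code, just a minimal sketch (with a hypothetical `pager` interface and fake client standing in for the AWS SDK) of a pagination loop that stops on a nil or empty token instead of sending it back:

```go
package main

import "fmt"

// page mimics one DescribeLogGroups response: a batch of group
// names plus an optional continuation token.
type page struct {
	groups    []string
	nextToken *string
}

// pager is a stand-in for the CloudWatch Logs client.
type pager interface {
	describe(token *string) page
}

// collectGroups walks all pages, stopping when the token is nil
// or empty rather than passing an invalid empty token back.
func collectGroups(p pager) []string {
	var all []string
	var token *string
	for {
		resp := p.describe(token)
		all = append(all, resp.groups...)
		if resp.nextToken == nil || *resp.nextToken == "" {
			break // an empty token would fail AWS input validation
		}
		token = resp.nextToken
	}
	return all
}

// fakeClient serves two pages; the second carries an empty token,
// mimicking the condition behind the error above.
type fakeClient struct{ calls int }

func (f *fakeClient) describe(token *string) page {
	f.calls++
	if f.calls == 1 {
		t := "abc"
		return page{groups: []string{"/aws/lambda/a"}, nextToken: &t}
	}
	t := ""
	return page{groups: []string{"/aws/lambda/b"}, nextToken: &t}
}

func main() {
	fmt.Println(collectGroups(&fakeClient{}))
}
```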

@AllanOricil
Author

@djaglowski @schmikei can you help here?

Contributor

This issue has been inactive for 60 days. It will be closed in 60 days if there is no activity. To ping code owners by adding a component label, see Adding Labels via Comments, or if you are unsure of which component this issue relates to, please ping @open-telemetry/collector-contrib-triagers. If this issue is still relevant, please ping the code owners or leave a comment explaining why it is still relevant. Otherwise, please close it.

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@github-actions github-actions bot added the Stale label Jun 24, 2024
@AllanOricil
Author

AllanOricil commented Jun 24, 2024

@schmikei do you intend to work on this?

@github-actions github-actions bot removed the Stale label Jun 25, 2024
@schmikei
Contributor

@AllanOricil did you end up trying the latest release? I've validated in my lab that the receiver is working as expected, and I'm currently unable to replicate your specific behavior.

If you want, you could start narrowing down your config to target specific log groups, like in our examples: https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/awscloudwatchreceiver#sample-configs

Apologies that I cannot be more helpful; if you do find something, I'm always happy to review a PR 😄

@Jessimon

I have a similar issue.
When sending 2000 logs to a log stream, only about 80% of them are processed by the receiver.
I am using version 0.103.0 in Docker.
Neither the system nor the Docker container hits CPU, memory, or any other limit that I can see.

My receiver config:

  awscloudwatch:
    region: eu-central-1
    logs:
      poll_interval: 1m

Thanks

@Jessimon

Aah, I see this:
`max_events_per_request` | default=50
I am probably hitting that limit.
I added `max_events_per_request: 5000` to my config and will test again.
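For reference, based on the config shown earlier in this thread, the setting sits under `logs` alongside `poll_interval` (a sketch, not a verified full config):

```yaml
receivers:
  awscloudwatch:
    region: eu-central-1
    logs:
      poll_interval: 1m
      max_events_per_request: 5000
```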

@Jessimon

OK, so I did some tests with AWS API Gateway.
I sent 1000 API requests, one request every 250 ms.
It took 6m 40s to send them using Postman.

All 1000 requests were processed by AWS API Gateway.
There are 1000 log lines containing "AWS Integration Endpoint RequestId" in CloudWatch.
The receiver only processes 792 lines containing the same text.
I export the logs to Loki and I see no message failures there (monitored with Prometheus).

I'm attaching the timestamps of the CloudWatch logs and the Loki logs.
Loki-logs.txt
Cloudwatch-logs.txt

When you compare the two files, you see a gap in the logs about every minute (my poll interval).
Hope this helps to find a solution.

@Jessimon

Increasing or decreasing the poll interval does not help.

@schmikei
Contributor

I'm starting to think CloudWatch may take a moment to serve the log entries on the API (just a suspicion). I created a spike PR in #33809 that will set the next end time...

I will try to dedicate some time to replicating the specific behavior, but if you'd like to test out the PR, that would help as I try to reproduce your test case.

@AllanOricil
Author

@schmikei once I have time I will test it again

@Jessimon

I did another test, this time with only the API Gateway log groups; that is 4 log groups out of the 38 I have in total.
But that does not make a difference.

The gaps in the logs that I see are between 14 and 18 seconds with a 1 min poll interval.

@schmikei I don't have the skills to do this myself, but if you can provide a Docker image, I can test.
Could it be that the receiver does not generate metrics that are exported to Prometheus?

@Jessimon

Jessimon commented Jul 1, 2024

@schmikei
There was something bugging me: if this is an authentication issue, I should get a better log retrieval percentage with a longer poll interval.
That is the case, but the gain is marginal: from around 79% with a 1 min poll to 83% with a 5 min poll.

Here is a graph of the logs I get. This is done with a 5 min poll, and you see a big gap every 5 min. When you look closely, you also see small gaps every couple of seconds.
[image: graph of retrieved logs over time]

It turns out that when CloudWatch creates a new log stream, the first logs (the API Gateway call) are not picked up.
CloudWatch is creating a new log stream every 3-10 seconds.

A possible fix would be to work with a time offset equal to the poll interval, so with `poll_interval: 1m` the time offset would also be 1m.
This could also solve the slow authentication issue, as the slow authentication would not matter as long as it is shorter than the poll interval. You would have to keep track of the last log timestamp you retrieved.
The offset could also be made optional, so that users can choose whether to use it.
