
lambda-promtail: Add support for Kinesis data stream events #5977

Merged
merged 3 commits into grafana:main on Aug 26, 2022

Conversation

juissi-t
Contributor

@juissi-t juissi-t commented Apr 21, 2022

What this PR does / why we need it:

This PR adds support for sending Kinesis data stream events to lambda-promtail. One use case is sending CloudFront real-time logs to Loki.
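
For context, here is a minimal sketch of how a Go Lambda handler can consume Kinesis data stream records via the aws-lambda-go SDK and hand the decoded payloads off for pushing to Loki. This is not the PR's actual code: the handler shape and the `sendToPromtail` helper are hypothetical stand-ins for lambda-promtail's real batching and push logic.

```go
package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// handler is invoked with a batch of Kinesis records delivered to the Lambda.
func handler(ctx context.Context, ev events.KinesisEvent) error {
	for _, record := range ev.Records {
		// record.Kinesis.Data is base64-decoded by the SDK's JSON unmarshalling,
		// so it already holds the raw payload bytes (e.g. a CloudFront
		// real-time log line).
		line := string(record.Kinesis.Data)

		// Hypothetical forwarding step: push the line to Loki, labelled with
		// the source stream ARN.
		if err := sendToPromtail(ctx, record.EventSourceArn, line); err != nil {
			return fmt.Errorf("forwarding record %s: %w", record.EventID, err)
		}
	}
	return nil
}

// sendToPromtail stands in for lambda-promtail's real batching/push logic.
func sendToPromtail(ctx context.Context, streamARN, line string) error {
	// ... build a Loki push request and POST it to the configured write address ...
	return nil
}

func main() {
	lambda.Start(handler)
}
```

Since the Kinesis event source mapping delivers records in batches, a single invocation may carry many log lines, so batching them into one push request is the natural design.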

Which issue(s) this PR fixes:

Fixes #5978

Special notes for your reviewer:

n/a

Checklist

  • Documentation added
  • Tests updated
  • Is this an important fix or new feature? Add an entry in the CHANGELOG.md.
  • Changes that require user attention or interaction to upgrade are documented in docs/sources/upgrading/_index.md

@juissi-t
Contributor Author

juissi-t commented May 3, 2022

@grafana/loki-team: could someone review this PR, please?

@juissi-t
Contributor Author

@KMiller-Grafana this PR has been open for about three weeks now. Any chance you could take a look at it?

@slim-bean
Collaborator

Apologies @juissi-t, it's been busy trying to finish up projects at the end of our quarter and this week most of the team is at an office.

Three weeks is a long time :( very sorry about that. We will take a look at this as soon as we can, but it may not be until next week.

In the meantime, if you could fix the changelog conflict, that will help us get this merged as soon as we can.

@juissi-t
Contributor Author

Thanks, it would be great if you could review this in the near future!

In the meantime, if you could fix the changelog conflict, that will help us get this merged as soon as we can.

Rebased the PR.

@grafanabot
Collaborator

./tools/diff_coverage.sh ../loki-main/test_results.txt test_results.txt ingester,distributor,querier,querier/queryrange,iter,storage,chunkenc,logql,loki

Change in test coverage per package. Green indicates 0 or positive change, red indicates that test coverage for a package fell.

+           ingester	0%
+        distributor	0%
+            querier	0%
+ querier/queryrange	0%
+               iter	0%
+            storage	0%
+           chunkenc	0%
+              logql	0%
+               loki	0%

Contributor

@KMiller-Grafana KMiller-Grafana left a comment


I approve of the docs portion of this PR.

@chaudum chaudum removed the type/docs label Jun 13, 2022
@juissi-t
Contributor Author

@slim-bean any chance you could take a look at this, now that it's been over a month since I last reached out? 🙏

@cstyan cstyan self-assigned this Jun 22, 2022
@juissi-t juissi-t force-pushed the main branch 3 times, most recently from 6e0b1db to 5f21387 Compare June 23, 2022 07:32
@grafanabot
Collaborator

./tools/diff_coverage.sh ../loki-main/test_results.txt test_results.txt ingester,distributor,querier,querier/queryrange,iter,storage,chunkenc,logql,loki

Change in test coverage per package. Green indicates 0 or positive change, red indicates that test coverage for a package fell.

+           ingester	0%
+        distributor	0%
+            querier	0%
+ querier/queryrange	0%
+               iter	0%
+            storage	0%
+           chunkenc	0%
+              logql	0%
+               loki	0%

1 similar comment
@juissi-t
Contributor Author

juissi-t commented Aug 1, 2022

@cstyan I rebased the PR. One of the tests seems to fail, but to me it looks unrelated to this change. Could you take a look at this PR?

@luva03

luva03 commented Aug 8, 2022

@juissi-t @slim-bean @cstyan

Is there any progress on this PR? We are very interested in using this feature. 🙏 TY

@juissi-t
Contributor Author

Is there any progress on this PR? We are very interested in using this feature. 🙏 TY

I have been trying to ping the people who have commented on this or been assigned to it, but no luck yet. I'll keep trying.

@slim-bean @cstyan could someone take a look? This has been open for over three months now.

@github-actions github-actions bot added the type/docs label Aug 17, 2022
@grafanabot
Collaborator

./tools/diff_coverage.sh ../loki-main/test_results.txt test_results.txt ingester,distributor,querier,querier/queryrange,iter,storage,chunkenc,logql,loki

Change in test coverage per package. Green indicates 0 or positive change, red indicates that test coverage for a package fell.

+           ingester	0%
+        distributor	0%
+            querier	0%
+ querier/queryrange	0%
+               iter	0%
+            storage	0%
+           chunkenc	0%
+              logql	0%
+               loki	0%

@juissi-t
Contributor Author

Pinging recent contributors to lambda-promtail: @dannykopping @jonathanmorais @simonswine. Could one of you maybe take a look at this pull request? Or find someone who could do that?

This PR has been open for over four months now, and I find it quite disheartening for a new contributor to an open source project backed by a company to have to wait so long for a review of a fairly trivial new feature. I know the Loki team has its priorities and you are surely busy, but still... If you don't want this feature, that's fine too, but could someone at least say something?

Contributor

@dannykopping dannykopping left a comment


@juissi-t big apologies from the Loki team - we've dropped the ball here.

We are quite swamped, as you can imagine, but in the future please feel free to reach out in the public Slack if you're not getting any response. GitHub notifications are notoriously unreliable; sometimes we miss things, and sometimes we simply don't have the time.

All that being said, thank you for your contribution. I have reviewed it and left one comment; I'd also ask you to please add a test for this.
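
For illustration only, here is a hedged sketch of what such a test could look like. The parseKinesisEvent helper is a hypothetical stand-in defined inline so the example compiles; the PR's actual function names and tests may differ.

```go
package main

import (
	"testing"

	"github.com/aws/aws-lambda-go/events"
)

// parseKinesisEvent is a stand-in for the function under test in lambda-promtail;
// here it simply extracts each record's raw payload as a string.
func parseKinesisEvent(ev events.KinesisEvent) ([]string, error) {
	lines := make([]string, 0, len(ev.Records))
	for _, r := range ev.Records {
		lines = append(lines, string(r.Kinesis.Data))
	}
	return lines, nil
}

// TestParseKinesisEvent builds a fake Kinesis event and checks that the
// record payload is decoded into a log line.
func TestParseKinesisEvent(t *testing.T) {
	ev := events.KinesisEvent{
		Records: []events.KinesisEventRecord{
			{
				EventSourceArn: "arn:aws:kinesis:us-east-1:123456789012:stream/example",
				Kinesis: events.KinesisRecord{
					PartitionKey: "pk-1",
					Data:         []byte(`{"message":"hello from cloudfront"}`),
				},
			},
		},
	}

	lines, err := parseKinesisEvent(ev)
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}
	if len(lines) != 1 {
		t.Fatalf("expected 1 line, got %d", len(lines))
	}
	if lines[0] != `{"message":"hello from cloudfront"}` {
		t.Errorf("unexpected line: %q", lines[0])
	}
}
```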

@grafanabot
Collaborator

./tools/diff_coverage.sh ../loki-main/test_results.txt test_results.txt ingester,distributor,querier,querier/queryrange,iter,storage,chunkenc,logql,loki

Change in test coverage per package. Green indicates 0 or positive change, red indicates that test coverage for a package fell.

+           ingester	0%
+        distributor	0%
+            querier	0%
+ querier/queryrange	0%
+               iter	0%
+            storage	0%
+           chunkenc	0%
+              logql	0%
+               loki	0%

Contributor

@dannykopping dannykopping left a comment


Thanks for the changes. I've added another comment.

Sidenote: please don't rebase once a review has begun; it rewrites history, which makes incremental reviews in GitHub difficult.

Contributor

@dannykopping dannykopping left a comment


LGTM, thanks for the contribution @juissi-t
Apologies again for the delays here, I hope it doesn't deter you from contributing again

@grafanabot
Collaborator

./tools/diff_coverage.sh ../loki-main/test_results.txt test_results.txt ingester,distributor,querier,querier/queryrange,iter,storage,chunkenc,logql,loki

Change in test coverage per package. Green indicates 0 or positive change, red indicates that test coverage for a package fell.

+           ingester	0%
+        distributor	0%
+            querier	0%
+ querier/queryrange	0%
+               iter	0%
+            storage	0%
+           chunkenc	0%
+              logql	0%
+               loki	0%

@juissi-t
Contributor Author

Apologies again for the delays here, I hope it doesn't deter you from contributing again

Thanks a lot for your help in getting this reviewed, @dannykopping! I'll keep in mind the option of asking in Slack if I find another itch I need to scratch and the PR review seems to be taking too long. 😄

@dannykopping dannykopping merged commit 073c620 into grafana:main Aug 26, 2022
lxwzy pushed a commit to lxwzy/loki that referenced this pull request Nov 7, 2022
MichelHollands pushed a commit that referenced this pull request Nov 9, 2022
**What this PR does / why we need it**:
#5977
With the addition of the Kinesis data stream function, add Kinesis data stream support for use in Terraform.
Abuelodelanada pushed a commit to canonical/loki that referenced this pull request Dec 1, 2022
Labels
size/L, type/docs
Projects
None yet
Development

Successfully merging this pull request may close these issues.

lambda-promtail: Allow ingesting real-time logs from Cloudfront
9 participants