
[loki-distributed] Fix compactor address without protocol scheme #2121

Merged · 2 commits · Jan 27, 2023

Conversation

zaldnoay
Contributor

The previous version of the chart rendered the compactor address without a protocol scheme, which caused errors like the following in other components:

level=error ts=2023-01-13T09:47:29.06565832Z caller=http.go:131 msg="error getting cache gen numbers from the store" err="Get \"loki-distributed-qa-compactor:3100\": unsupported protocol scheme \"loki-distributed-qa-compactor\""

This change fixes the compactor address by adding the http:// protocol scheme.

Related comment: grafana/loki#3109 (comment)
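
For context, a minimal before/after sketch of the rendered Loki config. This assumes the chart wires the address through Loki's `common.compactor_address` setting; the hostname and port are taken from the error log above, and the exact template output depends on the chart values:

```yaml
# Before: the rendered address has no scheme, so Go's HTTP client
# rejects it with "unsupported protocol scheme".
common:
  compactor_address: loki-distributed-qa-compactor:3100
---
# After: prefixing http:// yields a valid URL for cache generation
# number lookups and delete requests.
common:
  compactor_address: http://loki-distributed-qa-compactor:3100
```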

Signed-off-by: junwei.liang junwei.liang@mintegral.com

@CLAassistant

CLAassistant commented Jan 13, 2023

CLA assistant check
All committers have signed the CLA.

@sgrzemski

Amen to that.

@zaldnoay force-pushed the fix_loki_compactor_address branch 3 times, most recently from 3c03203 to d0e960f on January 16, 2023 03:01
@mateuszdrab
Copy link

mateuszdrab commented Jan 24, 2023

Hey guys, it's a small tweak, can we get this merged? 🙏
@Whyeasy @unguiculus

@patsevanton
Contributor

zaldnoay and others added 2 commits January 25, 2023 00:33
Signed-off-by: junwei.liang <junwei.liang@mintegral.com>
Signed-off-by: junwei.liang <zaldnoay@users.noreply.github.com>
@zaldnoay
Contributor Author

@patsevanton
Done

@patsevanton
Contributor

@zanhsieh Could you merge this pull request, please?
