
Sharing a method for storing log data in Elasticsearch after it is pushed to Kafka #22

Closed
chinarenliwei opened this issue Sep 30, 2022 · 1 comment


@chinarenliwei

Basic idea:
Use Kafka Connect together with the kafka-connect-elasticsearch component developed by confluentinc, so that log data is automatically stored from Kafka into Elasticsearch.

1. Kafka Connect ships with a JSON parser, but by default know-agent pushes logs to Kafka in list form: [{log 1},{log 2},{log n}], which that parser cannot handle. When creating a collection task, in step 4 (advanced configuration), add the following setting:
{"transFormate":1}

The default value in the source code is 0, which means List:

```java
/**
 * Transport format. 0: List, 1: MqLogEvent, 2: raw type (String)
 */
private Integer transFormate = 0;
```
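To illustrate the difference, here is a sketch of the two payload shapes; the field names are hypothetical, as the actual log fields depend on the collection task. With transFormate = 0 (the default), one Kafka message carries a list of log events:

```json
[{"hostName":"host-1","content":"log line 1"},{"hostName":"host-1","content":"log line 2"}]
```

With transFormate = 1, each Kafka message carries a single event object, which the Kafka Connect JSON parser can consume directly:

```json
{"hostName":"host-1","content":"log line 1"}
```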

2. For the detailed configuration and usage of kafka-connect-elasticsearch, see:
https://blog.csdn.net/Jerry_wo/article/details/107937500
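As a starting point, a minimal sink-connector configuration might look like the sketch below. The connector name, topic, and Elasticsearch URL are placeholders, and the converter settings assume the logs are schemaless JSON; adjust them to your deployment.

```properties
name=log-es-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
tasks.max=1
# topic that the know-agent collection task writes to (placeholder)
topics=know-agent-logs
connection.url=http://localhost:9200
type.name=_doc
# logs are plain JSON without an embedded schema
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
# derive document ids from Kafka topic+partition+offset instead of record keys
key.ignore=true
schema.ignore=true
```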

@huqidong
Collaborator

huqidong commented Oct 8, 2022

Thank you for the valuable suggestion!
In the next release we will spell out the meaning and usage of each option in the advanced-configuration box; after that we plan to provide a built-in log2elasticsearch capability. Contributions are welcome.

@huqidong huqidong closed this as completed Oct 8, 2022