Logstash Support #933

Closed
jamalzkhan opened this issue Jun 11, 2023 · 6 comments
@jamalzkhan

Which OpenObserve functionalities are relevant/related to the feature request?

No response

Description

We currently send logs directly using Logstash. Any future plans for support?

Proposed solution

An adapter for Logstash.

Alternatives considered

No alternatives

@hengfeiyang
Contributor

We have tested Filebeat: https://openobserve.ai/docs/ingestion/logs/filebeat/

We haven't tested Logstash yet, but you can give it a try; we support Elasticsearch-compatible ingestion.
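
For anyone trying this, here is a minimal sketch of a Logstash pipeline stage using the standard elasticsearch output pointed at OpenObserve's Elasticsearch-compatible API. The host, port, organization ("default"), stream name, and credentials below are assumptions; adjust them to your deployment:

output {
    elasticsearch {
        # OpenObserve's Elasticsearch-compatible endpoint (host, port, and org path are assumptions)
        hosts    => ["http://localhost:5080/api/default"]
        index    => "logstash"                  # becomes the stream name in OpenObserve
        user     => "root@example.com"          # placeholder credentials
        password => "Complexpass#123"
    }
}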

@ximply

ximply commented Jul 5, 2023

I tried it, but it does not work.

Using logstash-7.15.2 and elasticsearch-7.15.2, I get this error:
[ERROR][logstash.outputs.elasticsearch][main] Could not connect to a compatible version of Elasticsearch

@ximply

ximply commented Jul 5, 2023

filebeat + kafka + logstash is a widely used pipeline; if it were supported, migrating from Elasticsearch to OpenObserve would be very easy!
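
For that pipeline, the Logstash side only needs a kafka input in front of whatever output works against OpenObserve. A rough sketch, where the broker address, topic name, and codec are assumptions:

input {
    kafka {
        bootstrap_servers => "localhost:9092"   # Kafka broker address (assumption)
        topics            => ["filebeat"]       # topic that Filebeat's Kafka output publishes to (assumption)
        codec             => "json"             # Filebeat ships events to Kafka as JSON
    }
}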

@hengfeiyang
Contributor

Try using a version greater than 7.16.x; it should work. @ximply

@Axm11

Axm11 commented Aug 29, 2023

> Try using a version greater than 7.16.x; it should work. @ximply

Could you share your Logstash-to-OpenObserve configuration? Thanks a lot. @hengfeiyang

@e8tg001

e8tg001 commented Aug 30, 2023

Logstash can work! Here is my configuration:
-------------------------- filebeat-6.6.2 config ------------------------

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /data/log/info.log
  exclude_lines: ['^$']
  exclude_files: ['.gz$']
  fields:
    system: 'crm'
    service: 'crm-web'
    logType: 'log4j'
  tags: ["resin"]
  multiline.type: pattern
  multiline.pattern: '^\d{4}-\d{2}-\d{2}'
  multiline.negate: true
  multiline.match: after

output.logstash:
  hosts: ["127.0.0.1:5044"]

-------------------------- logstash-6.6.2 config ------------------------

input {
    beats {
        port => 5044
    }
}
filter {
    grok {
        match => [
            # message = 2023-04-20 14:42:56,019 INFO [Thread-51] [com.xxx.xx.DataManager:61] refresh data end UdbData
            "message", "(?<logdate>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{TIME})\s%{LOGLEVEL:level}\s\[%{DATA:thread}\]\s\[%{DATA:class}[./:]%{INT:line}\]\s(?<msg>.+)$",
            "message", "(?<msg>.+)$"
        ]
        remove_field => "message"
    }
    # log timezone format => @timestamp
    date {
        timezone => "Asia/Shanghai"
        # note: the sample log line above separates milliseconds with a comma, so include that pattern too
        match =>  [ "logdate", "yyyy-MM-dd HH:mm:ss,SSS", "YYYY-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd'T'HH:mm:ss.SSSZZZ", "ISO8601" ]
        target => "@timestamp"
    }
}
output {
    http {
        url => ["http://xxx.xxx.xxx.xxx:5601/api/default/%{[fields][service]}/_multi"]
        format => "json"
        http_method => "post"
        content_type => "application/json"
        headers => ["Authorization", "Basic xxxxxxxxxxxxxxxxxxxxxxxxxxxx="]
        mapping => {
            "@timestamp" => "%{[@timestamp]}"
            "source" => "%{[source]}"
            "tags" => "%{[tags]}"
            "logdate" => "%{[logdate]}"
            "level" => "%{[level]}"
            "thread" => "%{[thread]}"
            "class" => "%{[class]}"
            "line" => "%{[line]}"
            "msg" => "%{[msg]}"
            "server" => "%{[fields][service]}"
            "log_type" => "%{[fields][logType]}"
            "host_name" => "%{[host][name]}"
        }
    }
}
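
Note that this configuration bypasses the elasticsearch output entirely: the http output posts each event as JSON to OpenObserve's _multi ingestion endpoint, with the stream name taken from the Filebeat fields.service value, so it does not depend on the Elasticsearch version check that caused the error reported earlier in this thread.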
