I found an error. When there are many indices, logstash writes errors like this to its logs:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"%{id}", :_index=>"_integration_ms", :_type=>"integr_sybase", :routing=>nil}, #LogStash::Event:0x44b7f68], :response=>{"index"=>{"_index"=>"_integration_ms", "_type"=>"integr_syb", "_id"=>"%{id}", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id '%{id}'. Preview of field's value: '{name=mail1.domain.com}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:299"}}}}}

That is, filebeat tries to write data to every index.
The logstash output is configured as:
output {
  if "postfix" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      index => "postfix-%{+YYYY.MM.dd}"
    }
  }
}
Thanks. On https://discuss.elastic.co they answered:
"
That is, there was a document with id mgaz20Bp3jq-MOqGvqp, and it had a host field that Elasticsearch tried to interpret as text but could not. Apparently the document contained something like {..... "host": {"name": "elk.domain.com"} ...}, and judging by
"reason"=>"Can't get text on a START_OBJECT at 1:357"
it happened at character 357.
Conclusion: the mapping of the host field in the index does not match the data being sent.
"
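A common workaround for this kind of mapping conflict (not taken from the thread; the field names come from the error message above, and host_name is a hypothetical target field) is to flatten or drop the conflicting host object in logstash before the event reaches Elasticsearch:

```
filter {
  mutate {
    # Keep the useful value under a simple scalar field...
    rename => { "[host][name]" => "host_name" }
    # ...and drop the object that clashes with the existing [text] mapping.
    remove_field => [ "host" ]
  }
}
```

Note that a filter only helps for new events; documents already rejected are lost, and an index whose mapping is wrong for your data generally has to be deleted or reindexed with a corrected template.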
I found an error. When there are many indices, logstash writes errors like this to its logs:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"%{id}", :_index=>"_integration_ms", :_type=>"integr_sybase", :routing=>nil}, #LogStash::Event:0x44b7f68], :response=>{"index"=>{"_index"=>"_integration_ms", "_type"=>"integr_syb", "_id"=>"%{id}", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id '%{id}'. Preview of field's value: '{name=mail1.domain.com}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:299"}}}}}

That is, filebeat tries to write data to every index.
With logstash configured as
output {
  if "postfix" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      index => "postfix-%{+YYYY.MM.dd}"
    }
  }
}
and filebeat.yml set to
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/maillog*
  exclude_files: [".gz$"]
  tags: ["postfix"]
output.logstash:
  hosts: ["10.50.11.8:5044"]
the logs show the same errors and the index is not created.
Can you help?
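Editorial note, not an answer from the thread: since filebeat ships host as an object (host.name) while the existing index maps host as text, one sketch of a fix is to strip the object inside the postfix pipeline before the conditional output shown above (host_name is an assumed field name):

```
filter {
  if "postfix" in [tags] {
    mutate {
      rename => { "[host][name]" => "host_name" }
      remove_field => [ "host" ]
    }
  }
}
output {
  if "postfix" in [tags] {
    elasticsearch {
      hosts => "localhost:9200"
      index => "postfix-%{+YYYY.MM.dd}"
    }
  }
}
```

Existing postfix-* indices created with the conflicting text mapping for host would still need to be deleted or reindexed before the daily index can be created cleanly.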