@timestamp erroneously added to events in pipeline-to-pipeline communication #13333
Polite ping!
Thank you for the report and the minimal reproduction. The timestamp is added as part of event instantiation, and because we do not have copy-on-write events, the entire event is cloned when being sent to the other pipeline (this prevents mutation issues where we either have multiple downstream pipelines, or we have multiple outputs and a downstream pipeline begins work on an event before it has finished its way through all of the upstream pipelines). I don't see an immediate way around this issue.
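To illustrate the behaviour described above, here is a minimal Ruby sketch. The `Event` class is a hypothetical stand-in, not Logstash's actual implementation; it only models the two properties at play: the constructor backfills `@timestamp`, and cross-pipeline hand-off clones by re-instantiating.

```ruby
require 'json'
require 'time'

# Hypothetical stand-in for an event class whose constructor
# always backfills '@timestamp' when the field is absent.
class Event
  attr_reader :data

  def initialize(data = {})
    @data = data.dup
    # Constructor unconditionally stamps the event if needed.
    @data['@timestamp'] ||= Time.now.utc.iso8601
  end

  # "Clone" by serializing and re-instantiating, as when an event
  # is handed to another pipeline: the constructor runs again.
  def clone_for_pipeline
    Event.new(JSON.parse(JSON.generate(@data)))
  end

  def remove(field)
    @data.delete(field)
  end
end

event = Event.new('message' => 'hello')
event.remove('@timestamp')           # the user removes the field...
cloned = event.clone_for_pipeline    # ...but the clone regains it

puts event.data.key?('@timestamp')   # => false
puts cloned.data.key?('@timestamp')  # => true
```

The sketch shows why the clone is not a faithful copy: the removal survives in the source event but is undone in the re-instantiated one.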
@yaauie Thanks for your reply. The part about
got me thinking, because one could argue that this is not really true: one does not get a real clone of the source event, because whenever a new event is created, it always gets a timestamp. I also tried the
I see the following options:
To support this argument, consider the following test case, in which an event that already has the timestamp field is cloned, but due to the
produces the following output:
Polite ping.
Logstash information:

- Logstash version: 7.15 from the official Docker image (`FROM docker.elastic.co/logstash/logstash:7.15.0`)
- Plugins installed (`bin/logstash-plugin list --verbose`): only default plugins
- JVM (e.g. `java -version`): JVM from the official Docker image (`FROM docker.elastic.co/logstash/logstash:7.15.0`)

Description of the problem including expected versus actual behavior:
Given a Logstash configuration with two connected pipelines, where the `@timestamp` field is removed in the first pipeline (e.g. with `filter { mutate { remove_field => [ "@timestamp" ] } }`), the field is re-added when the event is passed to the second pipeline.

Steps to reproduce:
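For reference, a pipeline-to-pipeline setup of this shape can be sketched as follows. The pipeline IDs and the `stdin`/`stdout` plugins here are illustrative placeholders; the actual reproduction uses the Dockerfile below.

```yaml
# pipelines.yml (sketch)
- pipeline.id: upstream
  config.string: |
    input { stdin {} }
    filter { mutate { remove_field => [ "@timestamp" ] } }
    output { pipeline { send_to => ["downstream"] } }
- pipeline.id: downstream
  config.string: |
    input { pipeline { address => "downstream" } }
    output { stdout { codec => rubydebug } }
```

With this wiring, events leaving `upstream` without `@timestamp` arrive in `downstream` with the field re-added.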
Dockerfile to reproduce:
Build and run the above Dockerfile with:
For the above test case, the expected output would be:
The actual output is:
See also: magnusbaeck/logstash-filter-verifier#151