DNS Processor for Ingest Pipelines #91624
Labels: Data Management/Ingest Node, >enhancement, Team:Data Management
Description
TL;DR - There needs to be a DNS processor for Ingest Pipelines.

We have been using Elastic as a SIEM for nearly four years, and we have spent the last ~15 months rolling it out as a data lake and infrastructure monitoring tool for the rest of our company. Over the years, we have come to rely heavily on Logstash for parsing and enriching all ingested data (including data from Beats); nothing is written directly to Elasticsearch. The key enrichments we rely on are geo location, DNS lookups, and appending to the `related.*` fields.

Now that Elastic Agent has been around for a while, I have been trying it on a few test devices as a centrally managed alternative to standalone Beats agents, mainly Winlogbeat, which is currently deployed to 5,000+ devices. Winlogbeat has always parsed each event at the agent (creating ECS fields such as `source.ip`, `user.name`, etc.), so our Logstash enrichment on those ECS fields worked perfectly. Now that Elastic Agent uses Filebeat for Windows Event Logs, parsing does not occur until the event reaches the Ingest Pipelines in Elasticsearch. This unfortunately means the enrichment we have always done in Logstash is no longer possible.

I recently realized that each Integration's Ingest Pipelines also run a `*@custom` pipeline after the pre-built pipeline. I thought this might be our way to translate our Logstash pipelines into Ingest Pipelines and start using Elastic Agent with the same enrichment we have always done. Unfortunately, I am finding there is no DNS processor for Ingest Pipelines, and as mentioned above, this is one thing we have come to rely on heavily.

I would like to request that a DNS processor be created for Ingest Pipelines. Its absence is currently preventing us from adopting Elastic Agent.
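For context, the kind of Logstash DNS enrichment we would need to reproduce can be sketched roughly as follows, using the stock `dns` filter plugin (the field names and cache sizes here are illustrative, not our exact configuration):

```ruby
filter {
  # Copy the IP into a separate field so the original value is preserved,
  # then reverse-resolve that field in place.
  mutate {
    copy => { "[source][ip]" => "[source][domain]" }
  }
  dns {
    reverse => ["[source][domain]"]
    action  => "replace"
    # Cache hits and failures so repeated lookups don't hammer the resolver.
    hit_cache_size    => 4096
    failed_cache_size => 1024
  }
}
```

The caching options matter at SIEM volumes: without them, every event triggers a live lookup against the nameserver.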
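Purely as a hypothetical illustration of the request (no such processor exists today, and the processor name and every option below are invented for this sketch), a DNS processor dropped into an integration's `@custom` pipeline might look something like:

```json
PUT _ingest/pipeline/logs-system.security@custom
{
  "processors": [
    {
      "dns": {
        "field": "source.ip",
        "target_field": "source.domain",
        "type": "reverse",
        "ignore_missing": true
      }
    }
  ]
}
```

Something shaped like the existing `geoip` processor (a source field, a target field, and graceful handling of missing or unresolvable values) would cover our use case.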
Eric