NetFlow Collector Installation

NetFlow Collector is built on the Elastic Stack: Elasticsearch, Kibana, and the ElastiFlow flow collector. To install and configure NetFlow Collector, you must first have a working Elastic Stack environment.

Requirements

Please be aware that in production environments the volume of data generated by many network flow sources can be considerable. It is not uncommon for a core router or firewall to produce thousands of flow records per second.

It is my experience that most people underestimate the volume of flow data their network will produce. Save yourself the headache and don't start too small. Use the following table as a guide (a rough storage estimate follows the table):

| flows/sec | vCPUs | Memory | Disk (30 days) | ES JVM Heap | LS JVM Heap |
|-----------|-------|--------|----------------|-------------|-------------|
| 250       | 4     | 32 GB  | 512 GB         | 12 GB       | 4 GB        |
| 500       | 6     | 48 GB  | 1 TB           | 16 GB       | 4 GB        |
| 1000      | 8     | 64 GB  | 2 TB           | 24 GB       | 6 GB        |
| 1500      | 12    | 96 GB  | 3 TB           | 31 GB       | 6 GB        |
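
The disk column can be sanity-checked with a back-of-the-envelope calculation. The figure of roughly 1 KB per indexed flow record used below is an assumption; actual size depends on enrichment, mappings and the index codec:

# Rough 30-day storage estimate, assuming ~1 KB per indexed flow record
FLOWS_PER_SEC=500
BYTES_PER_FLOW=1024
DAYS=30
echo "$(( FLOWS_PER_SEC * BYTES_PER_FLOW * 86400 * DAYS / (1024 * 1024 * 1024) )) GiB over ${DAYS} days"

At 500 flows/sec this works out to roughly 1.2 TiB over 30 days, which lines up with the 1 TB row in the table.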

Setting up NetFlow Collector on Docker

The easiest way to get everything up and running quickly is to use Docker and docker-compose. The following instructions will walk you through setting up a single node installation of NetFlow Collector on Docker.

NOTE: These instructions assume that you have a server available with a recent Linux distribution and both Docker and docker-compose installed.
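
To confirm the prerequisites before continuing, the following checks should work on most installs (exact version output will differ):

docker --version
docker-compose --version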

Prepare the Data Path

Data written within a container's file system is ephemeral and is lost when the container is removed. For the data to persist, it must be written to the local host's file system using a bind mount. You must create a path on the local host and set the permissions needed for the processes within the container to write to it.

sudo mkdir /var/lib/netflow_es
sudo chown -R 1000:1000 /var/lib/netflow_es
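
The chown to 1000:1000 matches the UID/GID that the Elasticsearch process typically runs as inside the container. A quick check that the path exists with the expected owner:

ls -ld /var/lib/netflow_es
# expect output similar to: drwxr-xr-x 2 1000 1000 4096 ... /var/lib/netflow_es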

Docker Compose

docker-compose.yml

#------------------------------------------------------------------------------
# Portions of this file are Copyright (C)2021 Levan Sopromadze
#------------------------------------------------------------------------------

version: '3'

services:
  netflow-elasticsearch:
    image: docker.io/lsopromadze/netflow-elasticsearch
    container_name: netflow-elasticsearch
    restart: 'no'
    ulimits:
      memlock:
        soft: -1
        hard: -1
      nofile:
        soft: 131072
        hard: 131072
      nproc: 8192
      fsize: -1
    network_mode: host
    volumes:
      # You will need to create the path and permissions on the local file system where Elasticsearch will store data.
      #   mkdir /var/lib/netflow_es && chown -R 1000:1000 /var/lib/netflow_es
      - /var/lib/netflow_es:/usr/share/elasticsearch/data
    environment:
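      # NOTE: the 4 GB heap below is a starting point; per the sizing table above,
      # increase -Xms/-Xmx (and host memory) as the flow rate grows. Keep -Xms equal to -Xmx.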
      ES_JAVA_OPTS: '-Xms4g -Xmx4g'
      cluster.name: netflow
      bootstrap.memory_lock: 'true'
      network.host: 0.0.0.0
      http.port: 9200
      discovery.type: 'single-node'
      indices.query.bool.max_clause_count: 8192
      search.max_buckets: 250000
      action.destructive_requires_name: 'true'

  netflow-kibana:
    image: docker.io/lsopromadze/netflow-kibana
    container_name: netflow-kibana
    restart: 'no'
    depends_on:
      - netflow-elasticsearch
    network_mode: host
    environment:
      SERVER_HOST: 0.0.0.0
      SERVER_PORT: 5601
      SERVER_MAXPAYLOADBYTES: 8388608

      ELASTICSEARCH_HOSTS: "http://127.0.0.1:9200"
      ELASTICSEARCH_REQUESTTIMEOUT: 132000
      ELASTICSEARCH_SHARDTIMEOUT: 120000

      KIBANA_DEFAULTAPPID: "dashboard/653cf1e0-2fd2-11e7-99ed-49759aed30f5"
      KIBANA_AUTOCOMPLETETIMEOUT: 3000
      KIBANA_AUTOCOMPLETETERMINATEAFTER: 2500000

      LOGGING_DEST: stdout
      LOGGING_QUIET: 'false'

  netflow-flowcollector:
    image: docker.io/lsopromadze/netflow-flowcollector
    container_name: netflow-flowcollector
    restart: 'unless-stopped'
    network_mode: 'host'
    depends_on:
      - netflow-elasticsearch
    #volumes:
    #  - /etc/elastiflow:/etc/elastiflow
    environment:
      LS_JAVA_OPTS: '-Xms4g -Xmx4g'
      EF_FLOW_SERVER_UDP_IP: '0.0.0.0'
      EF_FLOW_SERVER_UDP_PORT: 5678
      #EF_FLOW_DECODER_SETTINGS_PATH: '/etc/elastiflow'
      EF_FLOW_DECODER_NETFLOW9_ENABLE: 'true'
        
      EF_FLOW_DECODER_ENRICH_IPADDR_METADATA_ENABLE: 'false'
      #EF_FLOW_DECODER_ENRICH_IPADDR_METADATA_USERDEF_PATH: 'metadata/ipaddrs.yml'
      #EF_FLOW_DECODER_ENRICH_IPADDR_METADATA_REFRESH_RATE: 15

      EF_FLOW_DECODER_ENRICH_DNS_ENABLE: 'false'
      EF_FLOW_DECODER_ENRICH_DNS_NAMESERVER_IP: ''
      EF_FLOW_DECODER_ENRICH_DNS_NAMESERVER_TIMEOUT: 3000

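      # NOTE: the MaxMind enrichments below expect GeoLite2-ASN.mmdb and GeoLite2-City.mmdb
      # under /etc/elastiflow. If these databases are not already bundled in the image,
      # uncomment the /etc/elastiflow volume mount above and place the files there, or set
      # the two *_ENABLE options to 'false'.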
      EF_FLOW_DECODER_ENRICH_MAXMIND_ASN_ENABLE: 'true'
      EF_FLOW_DECODER_ENRICH_MAXMIND_ASN_PATH: '/etc/elastiflow/GeoLite2-ASN.mmdb'

      EF_FLOW_DECODER_ENRICH_MAXMIND_GEOIP_ENABLE: 'true'
      EF_FLOW_DECODER_ENRICH_MAXMIND_GEOIP_PATH: '/etc/elastiflow/GeoLite2-City.mmdb'
      #EF_FLOW_DECODER_ENRICH_MAXMIND_GEOIP_VALUES: 'city,country,country_code,location,timezone'
      #EF_FLOW_DECODER_ENRICH_MAXMIND_GEOIP_LANG: 'en'
      #EF_FLOW_DECODER_ENRICH_MAXMIND_GEOIP_INCLEXCL_PATH: 'maxmind/incl_excl.yml'
      #EF_FLOW_DECODER_ENRICH_MAXMIND_GEOIP_INCLEXCL_REFRESH_RATE: 15

      EF_FLOW_DECODER_ENRICH_RISKIQ_ASN_ENABLE: 'false'
      EF_FLOW_DECODER_ENRICH_RISKIQ_THREAT_ENABLE: 'false'

      #EF_FLOW_DECODER_ENRICH_SAMPLERATE_CACHE_SIZE: 32768
      #EF_FLOW_DECODER_ENRICH_SAMPLERATE_USERDEF_ENABLE: 'false'
      #EF_FLOW_DECODER_ENRICH_SAMPLERATE_USERDEF_PATH: 'settings/sample_rate.yml'

      #EF_FLOW_DECODER_ENRICH_COMMUNITYID_ENABLE: 'true'
      #EF_FLOW_DECODER_ENRICH_COMMUNITYID_SEED: 0
      #EF_FLOW_DECODER_ENRICH_CONVERSATIONID_ENABLE: 'true'
      #EF_FLOW_DECODER_ENRICH_CONVERSATIONID_SEED: 0

      EF_FLOW_DECODER_ENRICH_JOIN_ASN: 'true'
      EF_FLOW_DECODER_ENRICH_JOIN_GEOIP: 'true'
      #EF_FLOW_DECODER_ENRICH_JOIN_SEC: 'true'
      #EF_FLOW_DECODER_ENRICH_JOIN_NETATTR: 'true'
      #EF_FLOW_DECODER_ENRICH_JOIN_SUBNETATTR: 'true'

      #EF_FLOW_DECODER_DURATION_PRECISION: 'ms'
      #EF_FLOW_DECODER_TIMESTAMP_PRECISION: 'ms'
      #EF_FLOW_DECODER_PERCENT_NORM: 100
      #EF_FLOW_DECODER_ENRICH_EXPAND_CLISRV: 'true'
      #EF_FLOW_DECODER_ENRICH_KEEP_CPU_TICKS: 'false'

      # Elasticsearch
      EF_FLOW_OUTPUT_ELASTICSEARCH_ENABLE: 'true'
      EF_FLOW_OUTPUT_ELASTICSEARCH_ECS_ENABLE: 'false'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_BATCH_DEADLINE: 2000
      #EF_FLOW_OUTPUT_ELASTICSEARCH_BATCH_MAX_BYTES: 8388608
      #EF_FLOW_OUTPUT_ELASTICSEARCH_TIMESTAMP_SOURCE: 'end'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_PERIOD: 'daily'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_SUFFIX: ''

      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_ENABLE: 'true'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_OVERWRITE: 'true'
      EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_SHARDS: 1
      EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_REPLICAS: 0
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_REFRESH_INTERVAL: '10s'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_CODEC: 'best_compression'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_ILM_LIFECYCLE: ''
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_ILM_ROLLOVER_ALIAS: ''
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_ISM_POLICY: ''
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_PIPELINE_DEFAULT: '_none'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_INDEX_TEMPLATE_PIPELINE_FINAL: '_none'

      # A comma separated list of Elasticsearch nodes to use. DO NOT include "http://" or "https://"
      EF_FLOW_OUTPUT_ELASTICSEARCH_ADDRESSES: '127.0.0.1:9200'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_USERNAME: 'elastic'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_PASSWORD: 'changeme'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_CLOUD_ID: ''
      #EF_FLOW_OUTPUT_ELASTICSEARCH_API_KEY: ''

      EF_FLOW_OUTPUT_ELASTICSEARCH_TLS_ENABLE: 'false'
      EF_FLOW_OUTPUT_ELASTICSEARCH_TLS_SKIP_VERIFICATION: 'false'
      EF_FLOW_OUTPUT_ELASTICSEARCH_TLS_CA_CERT_FILEPATH: ''

      #EF_FLOW_OUTPUT_ELASTICSEARCH_RETRY_ENABLE: 'true'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_RETRY_ON_TIMEOUT_ENABLE: 'true'
      #EF_FLOW_OUTPUT_ELASTICSEARCH_MAX_RETRIES: 3
      #EF_FLOW_OUTPUT_ELASTICSEARCH_RETRY_BACKOFF: 1000
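
With docker-compose.yml in place, the stack can be started and checked with the usual Compose commands, run from the directory containing the file:

docker-compose up -d                            # start Elasticsearch, Kibana and the flow collector
docker-compose ps                               # confirm all three containers are running
docker-compose logs -f netflow-flowcollector    # watch the collector for incoming flows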

NetFlow Port

The flow collector listens for NetFlow on UDP port 5678, as set by EF_FLOW_SERVER_UDP_PORT above. Configure your network devices to export flows to this port.
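
To verify that the collector is bound to the port on the host (the containers use host networking), a check with ss should do:

sudo ss -lun | grep 5678
# a line showing 0.0.0.0:5678 means the collector is listening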

Importing Kibana Dashboards

In Kibana, navigate to Stack Management - Kibana - Saved Objects - Import and upload the following file:

filename: kibana-7.14.x-codex-dark.ndjson
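
If you prefer the command line, the Kibana Saved Objects import API can be used instead. This is a sketch assuming Kibana is reachable on localhost:5601 without authentication:

curl -X POST "http://localhost:5601/api/saved_objects/_import?overwrite=true" \
  -H "kbn-xsrf: true" \
  --form file=@kibana-7.14.x-codex-dark.ndjson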

Autostart on Linux with systemd:

Create /etc/systemd/system/docker-compose-app.service with the following content:

[Unit]
Description=Docker Compose Application Service
Requires=docker.service
After=docker.service

[Service]
Type=oneshot
RemainAfterExit=yes
# Change this to the directory containing your docker-compose.yml
WorkingDirectory=/opt/netflowcollector-elk/
ExecStart=/usr/local/bin/docker-compose up -d
ExecStop=/usr/local/bin/docker-compose down
TimeoutStartSec=0

[Install]
WantedBy=multi-user.target

sudo systemctl daemon-reload
sudo systemctl enable docker-compose-app
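
The unit can then be started and checked like any other service:

sudo systemctl start docker-compose-app
systemctl status docker-compose-app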

Network device configuration:

If you are interested in how to configure a network device to send NetFlow, you can read a good article by my friend, who helped me with building and testing this setup: https://ccie37726.blogspot.com/2022/02/howto-flexible-netflow-with-ios-xe.html
