Offensive ELK: Elasticsearch for Offensive Security

Traditional “defensive” tools can be used effectively for offensive security data analysis, helping your team collaborate and triage scan results.

In particular, Elasticsearch offers the chance to aggregate a multitude of disparate data sources and query them with a unified interface, with the aim of extracting actionable knowledge from a huge amount of unclassified data.

A full walkthrough that led me to this setup can be found at: https://www.marcolancini.it/2018/blog-elk-for-nmap/.

Usage

  1. Clone this repository:
❯ git clone https://github.com/marco-lancini/docker_offensive_elk.git
  2. Create the _data folder and ensure it is owned by your own user:
❯ cd docker_offensive_elk/
❯ mkdir ./_data/
❯ sudo chown -R <user>:<user> ./_data/
  3. Start the stack using docker-compose:
❯ docker-compose up -d
  4. Give Kibana a few seconds to initialize, then access the Kibana web UI running at http://localhost:5601 (a quick way to verify the stack is up is shown after this list).
  5. During the first run, create an index (see "Create an Index" below).
  6. Ingest your Nmap results (see "Ingest Nmap Results" below).
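
Once the containers are running, you can optionally check that Elasticsearch is responding before moving on (a quick sanity check, not part of the original steps):

❯ curl 'localhost:9200/_cluster/health?pretty'

A cluster status of "yellow" or "green" means the node is ready to accept data.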

Create an Index

  1. Create the nmap-vuln-to-es index using curl (a quick check that the index exists is shown after this list):
❯ curl -XPUT 'localhost:9200/nmap-vuln-to-es'
  2. Open Kibana in your browser (http://localhost:5601) and you should be presented with the screen below:

  3. Insert nmap* as the index pattern and press "Next Step":

  4. Choose "I don't want to use the Time Filter", then click on "Create Index Pattern":

  5. If everything goes well, you should be presented with a page that lists every field in the nmap* index and each field's associated core type as recorded by Elasticsearch.
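
To confirm the index was created (an optional sanity check, not part of the original walkthrough), list the indices known to Elasticsearch and look for nmap-vuln-to-es in the output:

❯ curl 'localhost:9200/_cat/indices?v'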

Ingest Nmap Results

In order to ingest our Nmap scans, we have to output the results as an XML-formatted report (-oX) that can be parsed by Elasticsearch (an example scan command is shown at the end of this section). Once done with the scans, place the reports in the ./_data/nmap/ folder and run the ingestor:

❯ docker-compose run ingestor ingest
Starting elk_elasticsearch ... done
Processing /data/scan_192.168.1.0_24.xml file...
Sending Nmap data to Elasticsearch
Processing /data/scan_192.168.2.0_24.xml file...
Sending Nmap data to Elasticsearch
Processing /data/scan_192.168.3.0_24.xml file...
Sending Nmap data to Elasticsearch
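
For reference, a scan that produces a compatible XML report might look like the following (the service-detection flag and target range are illustrative; adjust them to your environment):

❯ nmap -sV -oX ./_data/nmap/scan_192.168.1.0_24.xml 192.168.1.0/24

Any report dropped into ./_data/nmap/ will be picked up the next time the ingestor is run.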