
add initial plaso supertimeline support! Fixes #162

philhagen committed Jun 15, 2019
1 parent 69cea93 commit 917f54417a43964cb68a55b7e24ed580db1abb5b
@@ -27,11 +27,12 @@ All parsers and dashboards for this VM are now maintained in this Github reposit
* has sudo access to run ALL commands
* Logstash will ingest all files from the following filesystem locations:
* `/logstash/syslog/`: Syslog-formatted data
* NOTICE: Remember that syslog DOES NOT reflect the year of a log entry! Therefore, Logstash has been configured to look for a year value in the path to a file. For example: `/logstash/syslog/2015/var/log/messages` will assign all entries from that file to the year 2015. If no year is present, the current year will be assumed. This is enabled only for the `/logstash/syslog/` directory.
* `/logstash/nfarch/`: Archived NetFlow output, formatted as described below
* `/logstash/httpd/`: Apache logs in common, combined, or vhost-combined formats
* `/logstash/passivedns/`: Logs from the passivedns utility
* `/logstash/kape/`: JSON-format files generated by the [KAPE](https://learn.duffandphelps.com/kape) triage collection tool. ([See this document](doc/kape_support.md) for details on which specific output files are currently supported and their required file naming structure.)
* `/logstash/plaso/`: CSV bodyfile-format files generated by the [Plaso](https://github.com/log2timeline/plaso) tool from the [log2timeline](https://github.com/log2timeline) framework. ([See this document](doc/log2timeline-plaso.md) for details on creating CSV files in a supported format.)
* Commands to be familiar with:
* `/usr/local/sbin/sof-elk_clear.py`: DESTROY contents of the Elasticsearch database. Most frequently used with an index name base (e.g. `sof-elk_clear.py -i logstash` will delete all data from the Elasticsearch `logstash-*` indexes). Other options are detailed with the `-h` flag.
* `/usr/local/sbin/sof-elk_update.sh`: Update the SOF-ELK® configuration files from the Github repository. (Requires sudo.)
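The year-from-path convention described in the NOTICE above can be illustrated with a short sketch. This is only an illustration of the convention (the actual pipeline does this with Logstash path matching, and the function name here is hypothetical):

```python
import re
from datetime import datetime

def year_for_syslog_path(path: str) -> int:
    """Follow the convention described above: use a four-digit year found
    in the path under /logstash/syslog/, else assume the current year."""
    m = re.search(r"/logstash/syslog/(\d{4})/", path)
    return int(m.group(1)) if m else datetime.now().year
```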
@@ -7,7 +7,7 @@
# This file contains transforms and enrichments to be applied in postprocessing

filter {
if [type] == "plaso" {
csv {
separator => ","
quote_char => "ª" # workaround: don't use " as the quote character, since it causes issues when the field itself contains a "
@@ -35,44 +35,46 @@ filter {
if ("C" in [macb]) { mutate { add_tag => [ "changed" ] } }
if ("B" in [macb]) { mutate { add_tag => [ "birth" ] } }

# extract data from the "desc" field based on the respective datasource value
if [datasource] == "FILE" or [datasource] == "META" {
grok {
break_on_match => false
match => [
"desc", "(:(?<path>/.*?))?$",
"path", "(?<filename>[^/]+?)?$"
]
}

# Extract urls
} else if [datasource] == "WEBHIST" {
grok {
match => [
"desc", "Location: (?<url>.*?)[ $]"
]
}

# extract event log data fields
} else if [datasource] == "EVT" and [datasourcetype] == "WinEVTX" {
grok {
patterns_dir => [ "/usr/local/sof-elk/grok-patterns" ]
match => [ "desc", "\[%{POSINT:event_id}.*\] Source Name: %{DATA:provider} Strings: \[%{DATA:payload}\] Computer Name: %{HOSTNAME:computer} Record Number: %{POSINT:record_number} Event Level: %{POSINT:level}" ]
tag_on_failure => [ "_gpfail_l2t01" ]
}

# extract prefetch data fields
} else if [datasource] == "LOG" and [datasourcetype] == "WinPrefetch" {
grok {
patterns_dir => [ "/usr/local/sof-elk/grok-patterns" ]
match => [ "desc", "Prefetch \[%{DATA:filename}\] was executed - run count %{POSINT:run_count} path: %{DATA:path} hash: %{WORD:prefetch_hash} volume: %{POSINT:volume_number} \[serial number: %{DATA:volume_serial} device path: %{DATA:device_path}\]" ]
tag_on_failure => [ "_gpfail_l2t02" ]
}
}

mutate {
convert => [
"inode", "integer",
"version", "integer"
]
remove_field => [
"message",
@@ -83,4 +85,4 @@ filter {
]
}
}
}
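The FILE/META grok patterns above chain two captures: a trailing `:`-prefixed path out of `desc`, then the last path component as the filename; the MACB mutate blocks tag each event per letter present. A rough Python equivalent, purely illustrative (these helper names are mine, not part of the parser):

```python
import re

def extract_path_and_filename(desc: str) -> dict:
    """Rough equivalent of the FILE/META grok chain above: pull a
    trailing ':'-prefixed path out of the desc field, then take the
    last path component as the filename."""
    fields = {}
    m = re.search(r"(?::(/.*?))?$", desc)
    if m and m.group(1):
        fields["path"] = m.group(1)
        fm = re.search(r"([^/]+?)?$", fields["path"])
        if fm and fm.group(1):
            fields["filename"] = fm.group(1)
    return fields

def macb_tags(macb: str) -> list:
    """Mirror the MACB tagging above: one tag per letter present in the
    l2tcsv MACB column (e.g. 'MACB', 'M.C.', '...B')."""
    names = {"M": "modified", "A": "accessed", "C": "changed", "B": "birth"}
    return [names[c] for c in "MACB" if c in macb]
```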

This file was deleted.

@@ -0,0 +1,10 @@
output {
if [type] == "plaso" {
elasticsearch {
index => "plaso-%{+YYYY.MM.dd}"
template => "/usr/local/sof-elk/lib/elasticsearch-plaso-template.json"
template_name => "plaso"
template_overwrite => true
}
}
}
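The `index => "plaso-%{+YYYY.MM.dd}"` sprintf reference above routes each event into a per-day Elasticsearch index based on the event's `@timestamp`, so day ranges can be searched or dropped independently. The naming is equivalent to this sketch (the function name is mine):

```python
from datetime import datetime, timezone

def plaso_index_name(ts: datetime) -> str:
    """Mirror the Logstash index sprintf "plaso-%{+YYYY.MM.dd}":
    one index name per (UTC) day of the event timestamp."""
    return ts.astimezone(timezone.utc).strftime("plaso-%Y.%m.%d")
```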
@@ -0,0 +1,17 @@
SOF-ELK® log2timeline/Plaso Support
=======

[log2timeline](https://github.com/log2timeline) is a framework for extensive and flexible timeline creation. The [Plaso](https://github.com/log2timeline/plaso) tool, part of the framework, creates what are known as "supertimelines", containing aggregated and normalized forensic artifacts, based primarily on observed time stamps. This gives a forensicator the ability to review a wide range of artifacts in a standardized fashion.

SOF-ELK will parse the CSV format of the Plaso tool's output. The commands below serve as general guidelines for creating a compatible output file that SOF-ELK can handle. These commands are not a substitute for the log2timeline and/or Plaso documentation.

**Generating a compatible Plaso Output File**

- Generate the Plaso dumpfile
- `log2timeline.py -z UTC --parsers "win7,-filestat" /cases/capstone/base-rd01-triage-plaso.dump /mnt/windows_mount/base-rd01/`
- Use `psort.py` to generate CSV
- `psort.py -z "UTC" -o l2tcsv base-rd01-triage-plaso.dump "date > '2018-08-23 00:00:00' AND date < '2018-09-07 00:00:00'" -w base-rd01-triage-plaso.csv`
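The l2tcsv output format used above produces a fixed 17-column CSV header. A quick sanity check that a generated file is in the expected shape might look like this (the column list reflects the l2tcsv format and should be verified against your Plaso version's documentation; the helper name is hypothetical):

```python
import csv
import io

# Column set of the l2tcsv output format (assumption: verify against
# your Plaso version's documentation before relying on it).
L2TCSV_FIELDS = [
    "date", "time", "timezone", "MACB", "source", "sourcetype", "type",
    "user", "host", "short", "desc", "version", "filename", "inode",
    "notes", "format", "extra",
]

def looks_like_l2tcsv(header_line: str) -> bool:
    """Check that a file's first line is the expected l2tcsv header row."""
    header = next(csv.reader(io.StringIO(header_line)))
    return header == L2TCSV_FIELDS
```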

**Credits:**

Mark Hallman and Mike Pilkington did a lot of the groundwork on a standalone ELK VM used in FOR508. Without their work and help integrating the configuration into SOF-ELK, this would have been a much more difficult task.
@@ -0,0 +1,11 @@
[
{
"key": "@timestamp",
"value": {
"id": "date",
"params": {
"pattern": "YYYY-MM-DD HH:mm:ss.SSS\\Z"
}
}
}
]
@@ -0,0 +1,68 @@
{"name":"@timestamp","type":"date","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"@version","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"_id","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":false}
{"name":"_index","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":false}
{"name":"_score","type":"number","count":0,"scripted":false,"searchable":false,"aggregatable":false,"readFromDocValues":false}
{"name":"_source","type":"_source","count":0,"scripted":false,"searchable":false,"aggregatable":false,"readFromDocValues":false}
{"name":"_type","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":false}
{"name":"beat.hostname","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"beat.hostname.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"beat.name","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"beat.name.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"beat.version","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"beat.version.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"computer","type":"string","count":1,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"computer.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"datasource","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"datasource.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"datasourcetype","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"datasourcetype.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"desc","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"desc.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"device_path","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"device_path.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"event_id","type":"number","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"eventtype","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"eventtype.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"extra","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"extra.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"filename","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"filename.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"format","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"format.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"host","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"host.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"inode","type":"number","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"input.type","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"input.type.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"level","type":"number","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"log.file.path","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"log.file.path.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"macb","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"macb.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"notes","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"notes.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"offset","type":"number","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"path","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"path.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"payload","type":"string","count":1,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"payload.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"prefetch_hash","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"prefetch_hash.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"prospector.type","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"provider","type":"string","count":2,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"provider.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"record_number","type":"number","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"run_count","type":"number","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"source","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"source.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"tags","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"tags.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"type","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"type.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"user","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"user.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"version","type":"number","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"volume_number","type":"number","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
{"name":"volume_serial","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":false,"readFromDocValues":false}
{"name":"volume_serial.keyword","type":"string","count":0,"scripted":false,"searchable":true,"aggregatable":true,"readFromDocValues":true}
@@ -0,0 +1,6 @@
{
"attributes": {
"title": "plaso-*",
"timeFieldName": "@timestamp"
}
}
@@ -1,6 +1,6 @@
{
"index_patterns": [
"plaso-*"
],
"settings": {
"number_of_shards" : 1,
@@ -5,13 +5,13 @@

- type: log
paths:
- /logstash/plaso/*/*/*/*/*
- /logstash/plaso/*/*/*/*
- /logstash/plaso/*/*/*
- /logstash/plaso/*/*
- /logstash/plaso/*
exclude_files: [ '\.gz$', '\.bz2$', '\.zip$' ]
close_inactive: 5m
fields_under_root: true
fields:
type: plaso
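The five explicit glob patterns above enumerate nested directories one level at a time rather than using a single recursive wildcard, so files up to five levels deep under `/logstash/plaso/` are picked up. The same expansion can be sketched in Python (illustrative; the function name is mine):

```python
import glob
import os

def plaso_input_files(base="/logstash/plaso", max_depth=5):
    """Expand the same five per-depth glob patterns the prospector above
    lists: one pattern per directory depth under the base directory."""
    patterns = [os.path.join(base, *["*"] * d) for d in range(1, max_depth + 1)]
    found = []
    for pattern in patterns:
        found.extend(p for p in glob.glob(pattern) if os.path.isfile(p))
    return sorted(set(found))
```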
@@ -26,7 +26,7 @@ for lspid in $( ps -u logstash | grep java | awk '{print $1}' ); do
done

# create necessary ingest directories
ingest_dirs="syslog nfarch httpd passivedns zeek kape plaso"
for ingest_dir in ${ingest_dirs}; do
if [ ! -d /logstash/${ingest_dir} ]; then
mkdir -m 1777 -p /logstash/${ingest_dir}
@@ -74,4 +74,4 @@ for deadlink in $( ls -1 /etc/cron.d/* ); do
done

# reload all dashboards
/usr/local/sbin/load_all_dashboards.sh
