CERT-W/StormCell

StormCell is a high-speed Python tool that automates the parsing of Windows artifacts and ships the results to ELK or Splunk.

StormCell:

  • Expects a mounted disk or ZIP triage(s) produced by KAPE's Targets or CollectRaptor as input. Any triage made with Velociraptor and its KapeTarget module should also work.

  • Relies on KAPE and around 30 third-party binaries or scripts (along with runtimes such as .NET) to parse the artifacts.

  • Uses Vector and around 50 Vector transform files from Vector4IR to ship the parsed outputs to ELK or Splunk.

Installation / Setup

Since StormCell relies on KAPE and Vector, a setup process has been implemented. This setup is only partially automated, as some tools and utilities used by StormCell require manual acceptance of end-user license agreements.

  1. KAPE, which requires a license for commercial usage, should be downloaded from: https://www.kroll.com/en/services/cyber-risk/incident-response-litigation-support/kroll-artifact-parser-extractor-kape

  2. The Visual Studio Build Tools are required to build the zipfile_deflate64 package (which is needed to unzip Deflate64-compressed ZIP archives, as this algorithm is not supported by the Python standard library). They can be installed using the Visual Studio Installer:

    vs_BuildTools.exe --norestart --passive --downloadThenInstall --includeRecommended --add Microsoft.VisualStudio.Workload.VCTools
    

    Since this installation can be tedious, pre-generated wheel files are available in the repository for Python 3.10 to 3.14.
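    For context, Deflate64 is compression method 9 in the ZIP specification, while the standard library only supports method 8 (Deflate). The following stdlib-only sketch shows how an archive can be checked for Deflate64 members before extraction; the helper name is ours for illustration, not StormCell's actual code:

    ```python
    import io
    import zipfile

    # Compression method IDs from the ZIP specification (APPNOTE):
    # 8 = Deflate (supported by the stdlib), 9 = Deflate64 (not supported).
    DEFLATE64 = 9

    def has_deflate64_members(path_or_file) -> bool:
        """Return True if any member of the archive uses Deflate64."""
        with zipfile.ZipFile(path_or_file) as zf:
            return any(info.compress_type == DEFLATE64 for info in zf.infolist())

    # Build a small Deflate-compressed archive in memory for demonstration.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("triage.txt", "example artifact" * 100)

    print(has_deflate64_members(buf))  # False: this archive uses plain Deflate
    ```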

  3. StormCell setup can then be used to download Vector and the required third party tools:

    # Cloning with --recursive also fetches the Vector4IR submodule
    git clone --recursive https://github.com/CERT-W/StormCell.git
    python -m pip install -r requirements.txt
    python StormCell.py setup --kape "<KAPE_FOLDER>" --vector "<VECTOR_FOLDER>"
  4. Optional - KAPE and Vector can be added to the system PATH to avoid setting their paths in StormCell's configuration. Note that adding user-modifiable directories or files to the machine PATH can introduce a local privilege escalation.

    function Add-Path($Path) {
      $Path = [Environment]::GetEnvironmentVariable("PATH", "Machine") + [IO.Path]::PathSeparator + $Path
      [Environment]::SetEnvironmentVariable( "Path", $Path, "Machine" )
    }
    
    Add-Path "<KAPE_FOLDER>"
    Add-Path "<VECTOR_FOLDER>\bin"

ELK

StormCell can ship events, through Vector, to an Elastic stack. An Elastic stack can be spawned as Docker containers using deviantony's docker-elk project.

StormCell must be able to reach the Elasticsearch HTTP API endpoint (exposed on TCP port 9200 by default). StormCell authenticates to ELK using username and password credentials.

git clone https://github.com/deviantony/docker-elk.git
cd docker-elk
docker-compose up setup
docker-compose up

In ELK forwarding mode, StormCell will automatically create the required index with a specific type mapping:

"mappings": {
    "dynamic_templates": [
        {"dates": {"match_mapping_type": "date", "mapping": {"type": "date"}}},
        {"strings": {"match_mapping_type": "*", "mapping": {"type": "text"}}}
    ]
}

This mapping handles dates as such but every other field as text, which makes the pipeline resilient to tools whose output field types may vary and would otherwise cause parsing errors in ELK.
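As an illustration, creating an index with this mapping amounts to a single authenticated PUT against the Elasticsearch API. This is a minimal sketch, not StormCell's actual code; the index name, host, and credentials are placeholder assumptions (e.g. docker-elk defaults):

```python
import base64
import json
import urllib.request

# Placeholder values: StormCell derives its own index names, and the
# credentials come from your Elastic stack configuration.
host, index = "http://localhost:9200", "stormcell-example"
username, password = "elastic", "changeme"

# The dynamic_templates mapping shown above: dates stay dates,
# every other field is indexed as text.
mapping = {
    "mappings": {
        "dynamic_templates": [
            {"dates": {"match_mapping_type": "date", "mapping": {"type": "date"}}},
            {"strings": {"match_mapping_type": "*", "mapping": {"type": "text"}}},
        ]
    }
}

credentials = base64.b64encode(f"{username}:{password}".encode()).decode()
request = urllib.request.Request(
    url=f"{host}/{index}",
    data=json.dumps(mapping).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Basic {credentials}",
    },
    method="PUT",
)
# urllib.request.urlopen(request)  # requires a reachable Elasticsearch
print(request.get_method(), request.full_url)
```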

If an already existing index is detected, it will be used instead, but this may generate Vector forwarding errors.

Splunk

StormCell can ship events, through Vector, to Splunk. For non-commercial uses, the docker-splunk Docker container may be used.

StormCell must be able to reach Splunk's HTTP Event Collector service (exposed on TCP port 8088 by default). Additionally, an HTTP Event Collector data input must be created beforehand and its associated token configured in StormCell.

docker pull splunk/splunk:latest

# Port 8000: Splunk web interface.
# Port 8088: Splunk HTTP event collectors service.
docker run -p [<IP>:]8000:8000 -p [<IP>:]8088:8088 -e "SPLUNK_PASSWORD=<PASSWORD>" -e "SPLUNK_START_ARGS=--accept-license" splunk/splunk:latest
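For reference, shipping to Splunk ultimately means POSTing JSON events to the HEC endpoint with the token carried in an Authorization header. A minimal sketch of such a request (the URL and token below are placeholders, to be replaced by your Splunk host and the token of the data input created above):

```python
import json
import urllib.request

# Placeholder values: use your Splunk host and your HEC data input's token.
hec_url = "https://localhost:8088/services/collector/event"
hec_token = "00000000-0000-0000-0000-000000000000"

# One event wrapped in the HEC envelope.
payload = {"event": {"message": "example parsed artifact"}, "sourcetype": "_json"}
request = urllib.request.Request(
    url=hec_url,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Splunk {hec_token}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request)  # requires a reachable Splunk HEC
print(request.get_method(), request.full_url)
```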

Execution

StormCell can be executed, with admin rights, in triage_once, triage_loop, or mountpoint execution mode:

  • In triage_once mode, the specified file(s) are parsed using the configured KAPE modules and StormCell, parsed events are optionally shipped with Vector, and StormCell then exits.

  • In triage_loop mode, StormCell listens for new file(s) in its short and long input folders, then parses and ships the detected files using the configured KAPE modules.

    Different KAPE modules can be associated with the short and long folders, the idea being to first parse and ship the most critical artifacts, then allow a deeper analysis of chosen collects after a first round of investigation. The parsing history is stored in a local SQLite database, so only KAPE modules that have not previously been executed are run on subsequent executions against a given collect.

  • In mountpoint mode, StormCell extracts artifacts from the specified mountpoint (using KAPE and the given KAPE targets) into an intermediate ZIP archive. The ZIP archive is then processed as a triage archive would be in triage_once mode. When several executions are performed for the same hostname, only the initially created archive is used.
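The archive-reuse rule in mountpoint mode can be illustrated as follows. This is a hypothetical helper, not StormCell's code: an archive named after the hostname is created on the first run and reused afterwards.

```python
import tempfile
from pathlib import Path

def archive_for_host(output_dir: Path, hostname: str) -> tuple[Path, bool]:
    """Return the triage archive path and whether a collection is needed."""
    archive = output_dir / f"{hostname}.zip"
    # An archive already created for this hostname is reused as-is.
    return archive, not archive.exists()

with tempfile.TemporaryDirectory() as tmp:
    out = Path(tmp)
    archive, must_collect = archive_for_host(out, "WORKSTATION01")
    print(must_collect)   # True: the first execution collects into the archive
    archive.touch()       # simulate the KAPE collection having run
    print(archive_for_host(out, "WORKSTATION01")[1])  # False: archive reused
```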

By default, StormCell expects to ship events, requiring an ELK or Splunk sink to be configured. Shipped events are also tracked in a local SQLite database; which events are tracked as shipped depends on the KAPE modules marked as executed in the database.
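The incremental behaviour of this SQLite-backed tracking can be sketched as follows. The schema and names here are hypothetical, for illustration only; StormCell's actual database layout may differ:

```python
import sqlite3

# Hypothetical schema: one row per (collect, module) already executed.
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE executed_modules (collect TEXT, module TEXT,"
    " PRIMARY KEY (collect, module))"
)

def modules_to_run(db, collect, configured_modules):
    """Return only the KAPE modules not already executed for this collect."""
    done = {row[0] for row in db.execute(
        "SELECT module FROM executed_modules WHERE collect = ?", (collect,))}
    return [m for m in configured_modules if m not in done]

# First pass on a collect: two modules run and are recorded.
for module in ("EvtxECmd", "MFTECmd"):
    db.execute("INSERT INTO executed_modules VALUES (?, ?)", ("host01.zip", module))

# Second pass with a wider module list: only the new module is returned.
print(modules_to_run(db, "host01.zip", ["EvtxECmd", "MFTECmd", "SrumECmd"]))
```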

To only parse artifacts, set the parse_only option to true in the configuration file or pass the --parse_only command-line flag. To only send already parsed artifacts, set the send_only option to true in the configuration or pass the --send_only command-line flag.

Complete reparsing and event forwarding can be forced using the reparse and resend command-line flags.

To reset the database that tracks executed modules and the data sent to ELK/Splunk, use the --cleardb option.

Quick usage

While parameters can be specified as command-line options, it is recommended to use a configuration file.

The StormCell.conf configuration template is self-documented, with required parameters identified through comments (# Required).

python StormCell.py <setup | triage_once | triage_loop | mountpoint> -C <CONFIG_FILE>

Vector & Vector4IR

StormCell's reliance on Vector and Vector4IR has advantages:

  • A fast upload speed to Splunk or ELK
  • An automated and configurable formatting of the logs sent:
    • Deduplication of some logs
    • Transformation of some fields
  • An easily adaptable log format, to match tool updates or add new tools

It however also has some innate limitations:

  • Monitoring the execution of Vector and the logs actually forwarded can be tedious:
    • Because of log formatting and alteration
    • Because of the use of sub-processes and threads, and the lack of exhaustive logging options
  • Frequent updates of some third-party tools may require Vector4IR users to update their parsers
  • Vector4IR doesn't provide a configuration file for every kind of artifact. In particular, voluminous artifacts are mostly sent through alternative lightweight parsers: for example, EvtxECmd output will not be sent, since alternatives such as Hayabusa and Chainsaw exist. It is however possible to create new configuration files for such artifacts.
