StormCell is a high-speed Python tool to automate the parsing and shipping of Windows artifacts to ELK or Splunk.
StormCell:

- Expects a mounted disk or ZIP triage(s) from KAPE's Targets or CollectRaptor as inputs. Any triage made with Velociraptor and its `KapeTarget` module should work.
- Relies on KAPE and around 30 third-party binaries or scripts to parse the artifacts, and on libraries such as .NET.
- Uses Vector and around 50 Vector transform files from Vector4IR to ship the parsed outputs to ELK or Splunk.
Since StormCell relies on KAPE and Vector, a setup process has been implemented. This setup is only partially automated, as some tools and utilities used by StormCell require manually accepting end-user license agreements.
- KAPE, which requires a license for commercial usage, should be downloaded from: https://www.kroll.com/en/services/cyber-risk/incident-response-litigation-support/kroll-artifact-parser-extractor-kape
- The Visual Studio Build Tools are required for the `zipfile_deflate64` package (needed to unzip Deflate64-compressed ZIP archives, as this algorithm is not supported by the Python standard library). They can be installed using the Visual Studio Installer:

  ```
  vs_BuildTools.exe --norestart --passive --downloadThenInstall --includeRecommended --add Microsoft.VisualStudio.Workload.VCTools
  ```

  Since this installation can be tedious, pre-generated wheel files are available in the repository for Python 3.10 to 3.14.
- StormCell's setup can then be used to download Vector and the required third-party tools:

  ```
  # Cloning with --recursive fetches the Vector4IR submodule
  git clone --recursive https://github.com/CERT-W/StormCell.git
  python -m pip install -r requirements.txt
  python StormCell.py setup --kape "<KAPE_FOLDER>" --vector "<VECTOR_FOLDER>"
  ```
- Optional: KAPE and Vector can be added to the system PATH to skip setting their paths in StormCell's config. Note that adding a user-modifiable directory or file to the machine PATH can introduce a local elevation of privileges.

  ```powershell
  function Add-Path($Path) {
      $Path = [Environment]::GetEnvironmentVariable("PATH", "Machine") + [IO.Path]::PathSeparator + $Path
      [Environment]::SetEnvironmentVariable("Path", $Path, "Machine")
  }

  Add-Path "<KAPE_FOLDER>"
  Add-Path "<VECTOR_FOLDER>\bin"
  ```
StormCell can ship events, through Vector, to an Elastic stack. An Elastic stack can be spawned as Docker containers using deviantony's docker-elk project. StormCell must be able to reach the Elasticsearch HTTP API endpoint (exposed on TCP port 9200 by default), and authenticates to ELK using username and password credentials.
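Before a forwarding run, it can be worth verifying that the Elasticsearch endpoint is reachable and the credentials are accepted. A minimal stdlib-only sketch (host, port, and credentials are placeholders to adapt to your deployment):

```python
import base64
import json
import urllib.request


def basic_auth_header(user, password):
    """Build the value of an HTTP Basic Authorization header."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Basic {token}"


def check_elasticsearch(host="localhost", port=9200, user="elastic", password="<PASSWORD>"):
    """Fetch the Elasticsearch cluster banner, authenticating with basic auth."""
    req = urllib.request.Request(
        f"http://{host}:{port}/",
        headers={"Authorization": basic_auth_header(user, password)},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)  # contains the cluster name and version on success
```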
```
git clone https://github.com/deviantony/docker-elk.git
cd docker-elk
docker-compose up setup
docker-compose up
```

In ELK forwarding mode, StormCell will automatically create the required index with a specific type mapping:
```
"mappings": {
  "dynamic_templates": [
    {"dates": {"match_mapping_type": "date", "mapping": {"type": "date"}}},
    {"strings": {"match_mapping_type": "*", "mapping": {"type": "text"}}}
  ]
}
```

This mapping handles dates as dates but maps every other field as text, which makes StormCell resilient to tool outputs whose field types may vary and would otherwise cause parsing errors in ELK.
If an already existing index is detected, it will be used instead, but this may generate Vector forwarding errors.
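For reference, an index with the mapping described above can also be created manually ahead of time. The sketch below uses only the standard library; the index name and endpoint are assumptions, and the authentication header is optional for brevity:

```python
import json
import urllib.request

# The dynamic-template mapping described above: dates stay dates,
# every other field is mapped as text.
INDEX_BODY = {
    "mappings": {
        "dynamic_templates": [
            {"dates": {"match_mapping_type": "date", "mapping": {"type": "date"}}},
            {"strings": {"match_mapping_type": "*", "mapping": {"type": "text"}}},
        ]
    }
}


def create_index(name, host="localhost", port=9200, auth_header=None):
    """PUT an index with the resilient mapping; fails if the index already exists."""
    headers = {"Content-Type": "application/json"}
    if auth_header:
        headers["Authorization"] = auth_header
    req = urllib.request.Request(
        f"http://{host}:{port}/{name}",
        data=json.dumps(INDEX_BODY).encode(),
        headers=headers,
        method="PUT",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)
```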
StormCell can ship events, through Vector, to Splunk. For non-commercial use, the docker-splunk Docker container may be used. StormCell must be able to reach Splunk's HTTP Event Collector service (exposed on TCP port 8088 by default). Additionally, an HTTP Event Collector data input must be created beforehand and its associated token configured for StormCell.
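A configured HEC token can be smoke-tested by posting a single event to the collector endpoint. A stdlib-only sketch (host, port, and scheme are assumptions and depend on the Splunk deployment, which may require HTTPS):

```python
import json
import urllib.request


def hec_payload(event, index=None):
    """Wrap an event in the JSON envelope expected by /services/collector/event."""
    payload = {"event": event}
    if index:
        payload["index"] = index
    return payload


def send_hec_event(event, token, host="localhost", port=8088, scheme="http", index=None):
    """POST a single event to Splunk's HTTP Event Collector."""
    req = urllib.request.Request(
        f"{scheme}://{host}:{port}/services/collector/event",
        data=json.dumps(hec_payload(event, index)).encode(),
        headers={
            "Authorization": f"Splunk {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)  # Splunk answers with a JSON acknowledgement
```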
```
docker pull splunk/splunk:latest

# Port 8000: Splunk web interface.
# Port 8088: Splunk HTTP Event Collector service.
docker run -p [<IP>:]8000:8000 -p [<IP>:]8088:8088 -e "SPLUNK_PASSWORD=<PASSWORD>" -e "SPLUNK_START_ARGS=--accept-license" splunk/splunk:latest
```

StormCell can be executed with admin rights using the `triage_once`, `triage_loop`, or `mountpoint` execution modes:
- In `triage_once` mode, the specified file(s) are parsed using the configured KAPE modules and StormCell, parsed events are optionally shipped with Vector, and StormCell then exits.
- In `triage_loop` mode, StormCell listens for new file(s) in its short and long input folders, then parses and ships the detected files with the configured KAPE modules. Different KAPE modules can be associated with the short and long folders, with the idea of first parsing and shipping the most critical artifacts, then allowing a deeper analysis of chosen collects following a first round of investigation. The parsing history is stored in a local SQLite database, so only KAPE modules that were not previously executed are used on subsequent executions against a given collect.
- In `mountpoint` mode, StormCell extracts artifacts from the specified mountpoint (using KAPE and the given KAPE targets) to an intermediate ZIP archive. The ZIP archive is then processed as a triage archive would be in `triage_once` mode. When several executions are performed using the same hostname, only the initially created archive is used.
By default, StormCell expects to ship events, requiring the configuration of
an ELK or Splunk sink. The tracking of shipped events is also stored in a
local SQLite database. Events tracked as shipped will depend on the KAPE
modules set as executed in the database.
To only parse artifacts, the `parse_only` option should be set to true in the configuration file, or the `--parse_only` flag should be passed on the command line.
To only send already-parsed artifacts, the `send_only` option should be set to true in the configuration, or the `--send_only` flag should be passed on the command line.
Complete parsing and event forwarding can be forced by using the `reparse` and `resend` option flags on the command line.
To reset the database which tracks modules executed and the data sent to ELK/Splunk, the option --cleardb can be used.
While parameters can be specified as command-line options, it is recommended to configure and use a configuration file.
The StormCell.conf configuration template is
self-documented, with required parameters identified through comments
(# Required).
```
python StormCell.py <setup | triage_once | triage_loop | mountpoint> -C <CONFIG_FILE>
```

StormCell's reliance on Vector and Vector4IR has advantages:
- A fast upload speed towards Splunk or ELK
- An automated and configurable formatting of the logs sent:
- Deduplication of some logs
- Transform of some fields
- An easily adaptable log format to match tools' updates or add new ones
It however also has some innate limitations:
- Monitoring the execution of Vector and the logs actually forwarded can be tedious:
  - Because of log formatting and alteration
  - Because of the usage of sub-processes and threads, and the lack of exhaustive logging options
- The frequent updates of some third-party tools may require Vector4IR users to update parsers
- Vector4IR doesn't provide a configuration file for every kind of artifact. In particular, voluminous artifacts will mostly be sent through alternative lightweight parsers: for example, EvtxECmd output will not be sent since there are alternatives such as Hayabusa and Chainsaw. It is however possible to create new configuration files for such artifacts.