DeRF (Detection Replay Framework) is an "Attacks As A Service" framework, allowing the emulation of offensive techniques and the generation of repeatable detection samples from a UI - without end users needing to install software, use a CLI, or possess credentials in the target environment.
Read the release announcement
Read the Full Documentation
DeRF is a framework for executing attacks and generating detection samples against resources in an AWS account and a GCP project. The framework is deployed across a target AWS account and GCP project with Terraform. For more detailed instructions on deployment, see here.
- Complete Prerequisites - see Prerequisites.
- Complete System Requirements - see System Requirements.
- Clone the Github repo to your local system.
git clone https://github.com/vectra-ai-research/derf.git
- Deploy the DeRF via Terraform from the `./env-prod` directory.
export AWS_PROFILE=PROFILE
terraform init -backend-config=derf.conf
terraform plan -var-file=derf.tfvars
terraform apply -var-file=derf.tfvars
Attack execution targeting both AWS and GCP is performed by invoking a Google Cloud Workflow. Workflows can be invoked either from the Google Cloud Console or programmatically with the `gcloud` CLI.
- Log into the Google Cloud Console and navigate to the Workflows page.
- Click on the name of the workflow that matches the attack you want to execute.
- Click on the `EXECUTE` button.
- Refer to the Code panel on the right-hand side and select which user to run the attack as by copying one of the possible inputs.
- Paste the selected JSON into the Input panel on the left-hand side.
- Finally, select the `EXECUTE` button at the bottom of the screen. The results of the attack will be displayed on the right-hand side of the screen.
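For reference, the input pasted into the Input panel is a small JSON object naming the DeRF user to run the attack as - here `user01`, matching the example inputs shown in the Code panel:

```json
{"user": "user01"}
```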
- Ensure the Google Cloud CLI is installed locally. Reference the Google-maintained documentation for instructions on installing the `gcloud` CLI.
- Authenticate to the Google Cloud project in which DeRF is deployed.
gcloud auth login --project PROJECT_ID
- Invoke a particular attack technique's workflow with the `gcloud` CLI. See the Google documentation for more complete instructions on the Workflows service.
gcloud workflows run aws-ec2-get-user-data --data='{"user": "user01"}'
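Invocation can also be scripted. The sketch below is a hypothetical helper (not part of DeRF) that assembles the same `gcloud workflows run` command as the example above; the actual subprocess call is left commented out because it requires the `gcloud` CLI and an authenticated session:

```python
import json
import subprocess  # used only for the (commented-out) real invocation


def build_workflow_command(workflow: str, user: str) -> list[str]:
    """Assemble the argv list for `gcloud workflows run` with a user payload."""
    payload = json.dumps({"user": user})
    return ["gcloud", "workflows", "run", workflow, f"--data={payload}"]


cmd = build_workflow_command("aws-ec2-get-user-data", "user01")
print(" ".join(cmd))
# To actually execute the workflow (requires gcloud and credentials):
# subprocess.run(cmd, check=True)
```

Passing the payload as a single argv element avoids the shell-quoting pitfalls of embedding JSON on the command line.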
This project's documentation is built using MkDocs with the Material theme. From the root of the project you can run `mkdocs serve` to see the rendered documentation locally, or use the handy Makefile shortcut, `make docs-serve`.
- Install Python Requirements
pip install mkdocs-material mkdocs-awesome-pages-plugin
- Start the mkdocs server
mkdocs serve --livereload
- Navigate to the locally hosted documentation in your browser at `127.0.0.1:8000`.
Maintainer: @KatTraxler
- Stratus Red Team by DataDog
- CNAPPGoat by Ermetic
- Atomic Red Team by Red Canary
- Leonidas by F-Secure
- pacu by Rhino Security Labs
- Amazon GuardDuty Tester
- CloudGoat by Rhino Security Labs
If you found this tool useful, want to share an interesting use case, or want to bring issues to our attention - whatever the reason - share it with us. You can email us at: TheDerf@vectra.ai.