
Commit

Fix badly named filter macros. These have been detected using an updated version of contentctl that has not been released yet.

pyth0n1c committed Jun 6, 2024
1 parent 9a96744 commit 252c17d
Showing 6 changed files with 6 additions and 6 deletions.
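The pattern in this diff is that each detection's trailing filter macro should be the detection file's name plus a `_filter` suffix (for example, aws_exfiltration_via_batch_service.yml ends with `aws_exfiltration_via_batch_service_filter`). As a rough illustration of the kind of naming check the updated contentctl presumably performs, here is a minimal standalone sketch; the script name, regex, and CLI handling are assumptions for illustration and not contentctl's actual code:

```python
# naming_check.py -- hypothetical sketch of the filter-macro naming rule,
# not contentctl's actual implementation.
import re
import sys
from pathlib import Path

import yaml  # PyYAML


def expected_filter_macro(detection_path: Path) -> str:
    # e.g. detections/cloud/aws_exfiltration_via_batch_service.yml
    #   -> aws_exfiltration_via_batch_service_filter
    return f"{detection_path.stem}_filter"


def check_detection(detection_path: Path) -> list[str]:
    """Return a list of problems found in one detection YAML file."""
    doc = yaml.safe_load(detection_path.read_text())
    search = doc.get("search", "")
    expected = expected_filter_macro(detection_path)
    # Filter macros appear in the SPL wrapped in backticks, e.g. `some_name_filter`.
    used_filters = re.findall(r"`(\w+_filter)`", search)
    if expected not in used_filters:
        return [f"{detection_path}: expected `{expected}`, found {used_filters or 'none'}"]
    return []


if __name__ == "__main__":
    root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path("detections")
    problems = [p for f in sorted(root.rglob("*.yml")) for p in check_detection(f)]
    print("\n".join(problems) or "All filter macros match their detection file names.")
    sys.exit(1 if problems else 0)
```

Applied to the two files named in this commit, the old macros (`aws_exfiltration_via_datasync_task_filter`, `living_off_the_land_filter`) would fail such a check, while the renamed versions pass.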
@@ -24,7 +24,7 @@ search: ' `amazon_security_lake` api.operation=DescribeEventAggregates "http_req
| stats values(src_endpoint.ip) as src_ip dc(src_endpoint.ip) as distinct_ip_count values(cloud.region) as cloud.region by time api.operation actor.user.account_uid actor.user.uid
| where distinct_ip_count > 1
| rename cloud.region as region, http_request.user_agent as user_agent, actor.user.account_uid as aws_account_id, actor.user.uid as user
-| `aws_concurrent_sessions_from_different_ips_filter`'
+| `asl_aws_concurrent_sessions_from_different_ips_filter`'
how_to_implement: The detection is based on events from Amazon Security Lake, a centralized data lake that provides
security-related data from AWS services. To use this detection, you must ingest CloudTrail logs from Amazon Security Lake into Splunk. To run this search,
ensure that you ingest events using the latest version of Splunk Add-on for Amazon Web Services (https://splunkbase.splunk.com/app/1876) or
detections/cloud/aws_exfiltration_via_batch_service.yml (2 changes: 1 addition & 1 deletion)
@@ -8,7 +8,7 @@ type: TTP
data_source:
- AWS CloudTrail JobCreated
description: This search looks for events where AWS Batch Service is used for creating a job that could potentially abuse the AWS Bucket Replication feature on S3 buckets. This AWS service can be used to transfer data between different AWS S3 buckets, and an attacker can leverage this to exfiltrate data by creating a malicious batch job.
-search: '`cloudtrail` eventName = JobCreated | stats count min(_time) as firstTime max(_time) as lastTime values(serviceEventDetails.jobArn) as job_arn values(serviceEventDetails.status) as status by src_ip aws_account_id eventName errorCode userAgent| `security_content_ctime(firstTime)` | `security_content_ctime(lastTime)` | `aws_exfiltration_via_datasync_task_filter`'
+search: '`cloudtrail` eventName = JobCreated | stats count min(_time) as firstTime max(_time) as lastTime values(serviceEventDetails.jobArn) as job_arn values(serviceEventDetails.status) as status by src_ip aws_account_id eventName errorCode userAgent| `security_content_ctime(firstTime)` | `security_content_ctime(lastTime)` | `aws_exfiltration_via_batch_service_filter`'
how_to_implement: You must install the Splunk Add-on for AWS and the Splunk App for AWS. This
search works with AWS CloudTrail logs.
known_false_positives: It is possible that an AWS Administrator or a user has legitimately created this job for some tasks.
@@ -12,7 +12,7 @@ description: The following analytic detects API calls made to an S3 bucket when
S3 bucket replication can also be used for cross-account replication, where data is replicated from a source bucket owned by one AWS account to a destination bucket owned by a different AWS account.
search: '`cloudtrail` eventName = PutBucketReplication eventSource = s3.amazonaws.com
| rename requestParameters.* as *
-| stats count values(bucketName) as source_bucket values(ReplicationConfiguration.Rule.ID) as rule_id values(ReplicationConfiguration.Rule.Destination.Bucket) as destination_bucket by _time user_arn userName user_type src_ip aws_account_id userIdentity.principalId user_agent | `aws_exfiltration_via_ec2_snapshot_filter`'
+| stats count values(bucketName) as source_bucket values(ReplicationConfiguration.Rule.ID) as rule_id values(ReplicationConfiguration.Rule.Destination.Bucket) as destination_bucket by _time user_arn userName user_type src_ip aws_account_id userIdentity.principalId user_agent | `aws_exfiltration_via_bucket_replication_filter`'
how_to_implement: You must install the Splunk Add-on for AWS and the Splunk App for AWS. This
search works with AWS CloudTrail logs.
known_false_positives: It is possible that an AWS admin has legitimately implemented data replication to ensure data availability and improve data protection/backup strategies.
@@ -13,7 +13,7 @@ data_source:
- AWS CloudTrail ConsoleLogin
search: '`cloudtrail` eventName=ConsoleLogin action=failure | bucket span=10m _time
| stats dc(user_name) AS unique_accounts values(user_name) as tried_accounts by _time,
-src_ip |`aws_unusual_number_of_failed_authentications_from_ip_filter`'
+src_ip |`aws_multiple_users_failing_to_authenticate_from_ip_filter`'
how_to_implement: You must install the Splunk Add-on for AWS in order to ingest CloudTrail.
We recommend that users try different combinations of the bucket span time and
the tried-account threshold to tune this search to their environment.
@@ -31,7 +31,7 @@ search: '| mstats avg(k8s.pod.network.io) as io where `kubernetes_metrics` by k8
| rename service as k8s.service
| where count > 5
| rename k8s.node.name as host
-| `kubernetes_anomalous_inbound_outbound_network_traffic_io_filter` '
+| `kubernetes_anomalous_inbound_outbound_network_io_filter`'
how_to_implement: 'To implement this detection, follow these steps:
* Deploy the OpenTelemetry Collector (OTEL) to your Kubernetes cluster.
detections/endpoint/living_off_the_land_detection.yml (2 changes: 1 addition & 1 deletion)
@@ -20,7 +20,7 @@ search: '| tstats `security_content_summariesonly` min(_time) as firstTime max(_
Off The Land" All_Risk.risk_object_type="system" by All_Risk.risk_object All_Risk.risk_object_type
All_Risk.annotations.mitre_attack.mitre_tactic | `drop_dm_object_name(All_Risk)`
| `security_content_ctime(firstTime)` | `security_content_ctime(lastTime)` | where
-source_count >= 5 | `living_off_the_land_filter`'
+source_count >= 5 | `living_off_the_land_detection_filter`'
how_to_implement: To implement this correlation search, a user needs to enable all
detections in the Living Off The Land Analytic Story and confirm it is generating
risk events. A simple search `index=risk analyticstories="Living Off The Land"`
