:warning: THIS IS AN EXPERIMENTAL DETECTION

This detection has been marked experimental by the Splunk Threat Research team. This means we have not been able to test, simulate, or build datasets for this detection. Use at your own risk. This analytic is NOT supported.


Description

This search detects users who generate spikes in API activity related to the deletion of S3 buckets in your AWS environment. It also updates the cache file (the baseline lookup) so that the latest data is factored into subsequent runs.

  • Type: Anomaly
  • Product: Splunk Enterprise, Splunk Enterprise Security, Splunk Cloud
  • Last Updated: 2018-11-27
  • Author: Bhavin Patel, Splunk
  • ID: e733a326-59d2-446d-b8db-14a17151aa68

Annotations

ATT&CK


ID    | Technique               | Tactic
T1530 | Data from Cloud Storage | Collection
Kill Chain Phase
  • Exploitation
NIST
  • DE.AE
CIS20
  • CIS 13
CVE

Search

`cloudtrail` eventName=DeleteBucket [search `cloudtrail` eventName=DeleteBucket 
| spath output=arn path=userIdentity.arn 
| stats count as apiCalls by arn 
| inputlookup s3_deletion_baseline append=t 
| fields - latestCount 
| stats values(*) as * by arn 
| rename apiCalls as latestCount 
| eval newAvgApiCalls=avgApiCalls + (latestCount-avgApiCalls)/720 
| eval newStdevApiCalls=sqrt(((pow(stdevApiCalls, 2)*719 + (latestCount-newAvgApiCalls)*(latestCount-avgApiCalls))/720)) 
| eval avgApiCalls=coalesce(newAvgApiCalls, avgApiCalls), stdevApiCalls=coalesce(newStdevApiCalls, stdevApiCalls), numDataPoints=if(isnull(latestCount), numDataPoints, numDataPoints+1) 
| table arn, latestCount, numDataPoints, avgApiCalls, stdevApiCalls 
| outputlookup s3_deletion_baseline 
| eval dataPointThreshold = 15, deviationThreshold = 3 
| eval isSpike=if((latestCount > avgApiCalls+deviationThreshold*stdevApiCalls) AND numDataPoints > dataPointThreshold, 1, 0) 
| where isSpike=1 
| rename arn as userIdentity.arn 
| table userIdentity.arn] 
| spath output=user userIdentity.arn 
| spath output=bucketName path=requestParameters.bucketName 
| stats values(bucketName) as bucketName, count as numberOfApiCalls, dc(eventName) as uniqueApisCalled by user 
| `detect_spike_in_s3_bucket_deletion_filter`
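
For readers tuning the baseline, the two eval statements above appear to implement an incremental mean and standard-deviation update (similar in form to Welford's online algorithm, but with a fixed effective sample size of 720 hard-coded in the search):

    newAvgApiCalls   = avgApiCalls + (latestCount - avgApiCalls) / 720
    newStdevApiCalls = sqrt((719 * stdevApiCalls^2 + (latestCount - newAvgApiCalls) * (latestCount - avgApiCalls)) / 720)

An ARN is then flagged only when latestCount exceeds avgApiCalls + deviationThreshold * stdevApiCalls and more than dataPointThreshold baseline observations have accumulated for that ARN.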

Macros

The SPL above uses the following Macros:

  • cloudtrail
  • detect_spike_in_s3_bucket_deletion_filter

:information_source: detect_spike_in_s3_bucket_deletion_filter is an empty macro by default. It allows the user to filter out any results (false positives) without editing the SPL.
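
If, for example, legitimate automation routinely deletes buckets and keeps surfacing in the results, the filter macro could be defined with a clause along the lines of the sketch below. The ARN is a purely hypothetical placeholder (note the EXAMPLE markers); substitute an identity from your own environment:

    search user!="arn:aws:sts::111111111111:assumed-role/EXAMPLE-s3-lifecycle-role/EXAMPLE-session"

Because the macro is applied after the final stats command, it can also filter on bucketName, numberOfApiCalls, or uniqueApisCalled.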

Lookups

The SPL above uses the following Lookups:

  • s3_deletion_baseline

Required fields

List of fields required to use this analytic.

  • _time
  • eventName
  • userIdentity.arn

How To Implement

You must install the AWS App for Splunk (version 5.1.0 or later) and the Splunk Add-on for AWS (version 4.4.0 or later), then configure your AWS CloudTrail inputs. You can modify dataPointThreshold and deviationThreshold to better fit your environment. The dataPointThreshold variable is the minimum number of data points required before the baseline is considered statistically significant enough to evaluate a spike. The deviationThreshold variable is the number of standard deviations above the mean that the latest count must be to be considered a spike. This search works best when you run the "Baseline of S3 Bucket deletion activity by ARN" support search once to create a baseline of previously seen S3 bucket-deletion activity.
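
As a rough illustration of how the thresholds interact (the numbers below are illustrative, not recommendations): with deviationThreshold = 3, an ARN whose baseline is avgApiCalls = 2 and stdevApiCalls = 1 is flagged only when a run shows more than 2 + 3*1 = 5 DeleteBucket calls, and only after more than dataPointThreshold baseline observations exist for that ARN. To make the detection stricter, the corresponding eval line in the search could be changed, for example, to:

    | eval dataPointThreshold = 30, deviationThreshold = 4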

Known False Positives

Based on the values of dataPointThreshold and deviationThreshold, the false-positive rate may vary. Please adjust these values according to your environment.

Associated Analytic Story

RBA

Risk Score | Impact | Confidence | Message
25.0       | 50     | 50         | tbd

:information_source: The Risk Score is calculated by the following formula: Risk Score = (Impact * Confidence/100). Initial Confidence and Impact are set by the analytic author.

Reference

Test Dataset

Replay any dataset to Splunk Enterprise by using our replay.py tool or the UI. Alternatively, you can replay a dataset into a Splunk Attack Range.

source | version: 1