Requested updates by CB Lab #19

Open · wants to merge 2 commits into `main`
110 changes: 77 additions & 33 deletions README.md
@@ -1,48 +1,92 @@
# pySigma CrowdStrike Processing Pipeline

© 2024 The MITRE Corporation.
Approved for Public Release; Distribution Unlimited. Case Number 24-1824.

NOTICE
MITRE hereby grants express written permission to use, reproduce, distribute, modify, and otherwise leverage this software to the extent permitted by the licensed terms provided in the LICENSE.md file included with this project.

@author Jason Slaughter [email protected]
@author John Dombrowski [email protected]
@author Kaitlyn Laohoo [email protected]

NOTICE
This software was produced for the U. S. Government under Contract Number 70RSAT20D00000001 and Task Order 70RCSJ24FR0000016, and is subject to Federal Acquisition Regulation Clause 52.227-14, Rights in Data—General – Alternate II (Dec 2007) and Alternate III (Dec 2007) (DEVIATION).

No other use other than that granted to the U. S. Government, or to those acting on behalf of the U. S. Government under that Clause is authorized without the express written permission of The MITRE Corporation.
For further information, please contact The MITRE Corporation, Contracts Management Office, 7515 Colshire Drive, McLean, VA 22102-7539, (703) 983-6000.


![Tests](https://github.com/SigmaHQ/pySigma-pipeline-crowdstrike/actions/workflows/test.yml/badge.svg)
![Coverage Badge](https://img.shields.io/endpoint?url=https://gist.githubusercontent.com/thomaspatzke/46f41e1fcf5eaab808ff5742401ac42d/raw)
![Status](https://img.shields.io/badge/Status-pre--release-orange)

# pySigma CrowdStrike Backend

This is the CrowdStrike backend for pySigma. It provides the package `sigma.backends.crowdstrike` with the `LogScaleBackend` class, and the package `sigma.pipelines.crowdstrike` with processing pipelines that return `ProcessingPipeline` objects.

It contains the following processing pipelines under `sigma.pipelines.crowdstrike` (a usage sketch follows the list):
- `crowdstrike_fdr_pipeline`, which was written mainly for Falcon Data Replicator data, although the resulting Splunk queries should also work in the legacy CrowdStrike Splunk. The pipeline can also be used with other backends if you ingest Falcon data into a different SIEM.
- `crowdstrike_falcon_pipeline`, which was written for data collected by the CrowdStrike Falcon agent and stored natively in CrowdStrike LogScale. It effectively translates rules to the CrowdStrike Query Language used by LogScale and is designed to be used with the `LogScaleBackend`.
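
For reference, a minimal sketch of applying one of these pipelines programmatically rather than through sigma-cli (which is shown under Getting Started). It assumes the Splunk backend is installed as described below; the rule content is purely illustrative:

```python
from sigma.collection import SigmaCollection
from sigma.backends.splunk import SplunkBackend
from sigma.pipelines.crowdstrike import crowdstrike_fdr_pipeline

# Attach the CrowdStrike FDR pipeline to the Splunk backend, then convert a rule.
backend = SplunkBackend(processing_pipeline=crowdstrike_fdr_pipeline())

rules = SigmaCollection.from_yaml("""
    title: Suspicious rundll32 execution
    status: test
    logsource:
        category: process_creation
        product: windows
    detection:
        sel:
            Image|endswith: '\\rundll32.exe'
        condition: sel
""")

for query in backend.convert(rules):
    print(query)
```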

## Supported Rules

### Falcon Pipeline
The following Sigma logsource categories and products are supported, with their CrowdStrike event_simpleName mappings:

| category | product | CrowdStrike event_simpleName |
|-|-|-|
| `process_creation` | `windows`, `linux` | ProcessRollup2, SyntheticProcessRollup2 |
| `network_connection` | `windows` | NetworkConnectIP4, NetworkReceiveAcceptIP4 |
| `dns_query` | `windows` | DnsRequest |
| `image_load` | `windows` | ClassifiedModuleLoad |
| `driver_load` | `windows` | DriverLoad |
| `ps_script` | `windows` | CommandHistory, ScriptControlScanTelemetry |

### FDR Pipeline
- process_creation: ProcessRollup2
  - Only rules that reference the file name of the parent image are supported, because CrowdStrike ProcessRollup2 events only contain the file name.
- network_connection: NetworkConnectIP4 or NetworkReceiveAcceptIP4 (depending on the Initiated field value)
  - Rules that refer to process image names are not supported because this information is not available in CrowdStrike network connection events; they contain only a process id reference.

Not supported because the FDR events lack information required by Sigma rules:
- create_remote_thread: events lack the information required by most rules (no process details, only a process reference).

There are likely more Windows categories that can be supported by the pipelines; we will add support gradually as availability allows (see the conversion sketch below).
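
Similarly, a sketch of converting one of the categories from the table above with the Falcon pipeline and the `LogScaleBackend`. It assumes `LogScaleBackend` accepts a `processing_pipeline` argument like other pySigma backends; the rule and domain are illustrative:

```python
from sigma.collection import SigmaCollection
from sigma.backends.crowdstrike import LogScaleBackend
from sigma.pipelines.crowdstrike import crowdstrike_falcon_pipeline

# Convert a dns_query rule to the CrowdStrike Query Language used by LogScale.
backend = LogScaleBackend(processing_pipeline=crowdstrike_falcon_pipeline())

rules = SigmaCollection.from_yaml("""
    title: DNS query for a watched domain
    status: test
    logsource:
        category: dns_query
        product: windows
    detection:
        sel:
            QueryName|endswith: '.example.test'
        condition: sel
""")

print("\n".join(backend.convert(rules)))
```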

## Limitations and caveats
- **Full Paths**:
Falcon agents do not capture drive letters when logging paths; where a drive letter is expected, the device path is used instead. For example, `C:\Windows` appears as `\Device\HarddiskVolume3\Windows` in the logs. To account for this, the pipeline replaces any drive letter in fields containing a full path with `\Device\HarddiskVolume?\` (where `?` can be any single character).

- **Parent Name**:
Falcon `process_creation` events do not capture the full path of the parent process, so rules that reference it are configured to fail during transformation.

- **DNS Query Results**:
Falcon `dns_query` events return the IP records of a successful query as a [semicolon-separated](https://github.com/CrowdStrike/logscale-community-content/blob/main/CrowdStrike-Query-Language-Map/CrowdStrike-Query-Language/concatArray.md) string. The pipeline handles this by enforcing a "contains" expression on the `QueryResults` field.

- **Unsupported fields**:
Falcon does not always capture the same fields as Sysmon for the supported categories. Where a rule requires unsupported fields, the transformation fails (see the sketch after this list).

- **PS Script Logging**:
There is no clean equivalent between the events Falcon generates and PowerShell Script Logging. Our transformation is a best-effort approach that checks multiple Falcon fields which might contain the value.
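
As the **Parent Name** and **Unsupported fields** items above note, the Falcon pipeline is configured to fail rather than silently drop conditions it cannot express. A minimal sketch of handling that case, assuming the failure transformations surface as pySigma's `SigmaTransformationError` and that `LogScaleBackend` accepts a `processing_pipeline` argument like other pySigma backends (the rule is illustrative):

```python
from sigma.collection import SigmaCollection
from sigma.backends.crowdstrike import LogScaleBackend
from sigma.exceptions import SigmaTransformationError
from sigma.pipelines.crowdstrike import crowdstrike_falcon_pipeline

backend = LogScaleBackend(processing_pipeline=crowdstrike_falcon_pipeline())

# A full parent path is not available in Falcon process_creation events,
# so the pipeline is expected to reject this rule instead of converting it.
rule = SigmaCollection.from_yaml("""
    title: Parent image referenced by full path
    status: test
    logsource:
        category: process_creation
        product: windows
    detection:
        sel:
            ParentImage: 'C:\\Windows\\System32\\cmd.exe'
        condition: sel
""")

try:
    print(backend.convert(rule))
except SigmaTransformationError as err:
    print(f"Not convertible with the Falcon pipeline: {err}")
```
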
## References
- [LogScale Community Content](https://github.com/CrowdStrike/logscale-community-content)

This backend is currently maintained by:

* [Thomas Patzke](https://github.com/thomaspatzke/)
* [Panos Moullotos](https://github.com/moullos)

## Getting Started

Make sure you have installed:
- Python 3
- pip
- pytest
- [sigma-cli](https://github.com/SigmaHQ/sigma-cli) (No need to clone the project)

Install the Sigma Splunk backend plugin by running:
```
sigma plugin install splunk
```

Install the pySigma CrowdStrike Processing Pipeline by running:
```
pip install crowdstrike_fdr_pipeline
```

The `crowdstrike_fdr_pipeline` package must be installed before executing `pytest` so that the full test suite runs with coverage. Follow the sigma-cli [Getting Started](https://github.com/SigmaHQ/sigma-cli?tab=readme-ov-file#getting-started) instructions to learn how to use the Sigma CLI in your local environment.


To run the pySigma CrowdStrike Processing Pipeline locally using Sigma-CLI, execute the following command:

```
sigma convert -p crowdstrike_fdr -t splunk -f default <rule.yml>
```

If you need to test your changes to the pipeline, modify the file located at `~\AppData\Local\Programs\Python\Python<version>\Lib\site-packages\sigma\pipelines\crowdstrike\crowdstrike.py` and run the `pytest` command from the command line.
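
If you add or change field mappings, a test along these lines can guard the behavior. This is a hypothetical sketch in the style commonly used for pySigma pipeline tests, not a test taken from this repository; the fixture name, rule, and assertion are illustrative:

```python
import pytest
from sigma.collection import SigmaCollection
from sigma.backends.splunk import SplunkBackend
from sigma.pipelines.crowdstrike import crowdstrike_fdr_pipeline


@pytest.fixture
def splunk_backend():
    # Splunk backend with the CrowdStrike FDR pipeline applied.
    return SplunkBackend(processing_pipeline=crowdstrike_fdr_pipeline())


def test_md5_field_is_mapped(splunk_backend):
    rules = SigmaCollection.from_yaml("""
        title: md5 mapping test
        status: test
        logsource:
            category: process_creation
            product: windows
        detection:
            sel:
                md5: 0123456789abcdef0123456789abcdef
            condition: sel
    """)
    queries = splunk_backend.convert(rules)
    # The pipeline should rename the Sigma field to the CrowdStrike field name.
    assert "MD5HashData" in queries[0]
```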


To ensure that any local changes made to the pipeline have not broken the codebase, run the following command:

```
sigma convert -p crowdstrike_fdr -t splunk -f default <rule.yml>
```

## Change-Log

The CB Lab team has made the following updates:

- Updated mapping between Sigma process_creation and CS ProcessRollup2 / SyntheticProcessRollup2
- Created tests for new and updated mappings
- Updated the README.md with directions for testing during local development.

## TODO

- Create a Docker image/environment on Docker Hub for teams to pull down and use.
- Further document how this pipeline code works with the SigmaHQ repo. Possibly by pointing to a diagram or other README.md in another repo.
- Implement additional missing field mappings.
6 changes: 6 additions & 0 deletions sigma/pipelines/crowdstrike/crowdstrike.py
@@ -116,6 +116,12 @@ def common_processing_items():
"sha256": "SHA256HashData",
"Computer": "ComputerName",
"OriginalFileName": "OriginalFilename",
"CommandLine": "CommandLine", #CB Lab added CommandLine @author Jason Slaughter [email protected]
"ProcessID": "TargetProcessID", #CB Lab added ProcessID @author Kaitlyn Laohoo [email protected]
"md5": "MD5HashData", #CB Lab added md5 @author Kaitlyn Laohoo [email protected]
"sha1": "SHA1HashData", #CB Lab added sha1 @author Kaitlyn Laohoo [email protected]
"sha256": "SHA256HashData", #CB Lab added sha256 ProcessID @author Kaitlyn Laohoo [email protected]
"UtcTime": "ProcessStartTime", #CB Lab added UtcTime @author John Dombrowski [email protected]
}
),
rule_conditions=[