Merge pull request #1724 from SEKOIA-IO/feat/aws-s3-sqs
feat(aws): add template and update doc
goudyj authored Apr 4, 2024
2 parents 942f910 + fb9baff commit ba206ee
Showing 11 changed files with 103 additions and 91 deletions.

@@ -0,0 +1,70 @@
### Deploying the Data Collection Architecture

This section will guide you through creating all the AWS resources needed to collect AWS logs. If you already have existing resources that you want to use, you may do so, but any potential issues or incompatibilities with this tutorial will be your responsibility.

#### Prerequisites

In order to set up the AWS architecture, you need administrator access to the [Amazon console](https://console.aws.amazon.com) with permissions to create and manage S3 buckets, SQS queues, S3 notifications and users.

=== "Automatic"

To get started, click on the button below and fill in the form on AWS to set up the required environment for Sekoia: [![Deploy to AWS](https://s3.amazonaws.com/cloudformation-examples/cloudformation-launch-stack.png)](https://console.aws.amazon.com/cloudformation/home#/stacks/new?stackName=sekoia_stack&templateURL=https://sekoia-doc-bucket.s3.eu-west-3.amazonaws.com/resources.yml)

You need to fill in 4 inputs:

- Stack name - Name of the stack in CloudFormation (Name of the template)
- BucketName - Name of the S3 Bucket
- IAMUserName - Name of the dedicated user used to access the S3 bucket and the SQS queue
- SQSName - Name of the SQS queue

Read through the pages and click on `Next`, then click on `Submit`.

You can follow the creation in the `Events` tab (it can take a few minutes).

Once finished, `CREATE_COMPLETE` should be displayed on the left. Click on the `Outputs` tab to retrieve the information needed for the Sekoia playbook.

<div style="text-align: center;">
<img width="100%" alt="image" src="/assets/operation_center/integration_catalog/cloud_and_saas/aws/aws_cloudformation.png">
</div>
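If you prefer to script this last step, the same output values can be read from a `describe-stacks` response. A minimal sketch, assuming hypothetical output keys (`QueueUrl`, `BucketName`) that depend on the deployed template:

```python
# Sample payload shaped like a CloudFormation `describe-stacks` response.
# The output keys below are hypothetical and depend on the template.
sample_response = {
    "Stacks": [{
        "StackName": "sekoia_stack",
        "StackStatus": "CREATE_COMPLETE",
        "Outputs": [
            {"OutputKey": "QueueUrl",
             "OutputValue": "https://sqs.eu-west-3.amazonaws.com/123456789012/sekoia-queue"},
            {"OutputKey": "BucketName", "OutputValue": "sekoia-logs-bucket"},
        ],
    }]
}

def stack_outputs(response: dict) -> dict:
    """Flatten the Outputs list of the first stack into a plain dict."""
    stack = response["Stacks"][0]
    return {o["OutputKey"]: o["OutputValue"] for o in stack.get("Outputs", [])}

# With boto3 and credentials, `response` would come from:
#   boto3.client("cloudformation").describe_stacks(StackName="sekoia_stack")
outputs = stack_outputs(sample_response)
```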

=== "Manual"

**Create an S3 bucket**

Please refer to [this guide](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-bucket.html) to create an S3 bucket.

**Create an SQS queue**

The collection relies on S3 Event Notifications delivered to an SQS queue to detect new S3 objects.

1. Create a queue in the SQS service by following [this guide](https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-configure-create-queue.html)
2. In the Access Policy step, choose the advanced configuration and adapt this configuration sample with your own SQS Amazon Resource Name (ARN) (the main change is the `Service` directive allowing access from the S3 service):
```json
{
  "Version": "2008-10-17",
  "Id": "__default_policy_ID",
  "Statement": [
    {
      "Sid": "__owner_statement",
      "Effect": "Allow",
      "Principal": {
        "Service": "s3.amazonaws.com"
      },
      "Action": "SQS:SendMessage",
      "Resource": "arn:aws:sqs:XXX:XXX"
    }
  ]
}
```
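As an alternative to hand-editing, the sample policy can be rendered from a short script; the queue ARN below is a hypothetical placeholder for your own:

```python
import json

def sqs_s3_policy(queue_arn: str) -> str:
    """Build the SQS access policy that lets the S3 service send messages."""
    policy = {
        "Version": "2008-10-17",
        "Id": "__default_policy_ID",
        "Statement": [
            {
                "Sid": "__owner_statement",
                "Effect": "Allow",
                "Principal": {"Service": "s3.amazonaws.com"},
                "Action": "SQS:SendMessage",
                "Resource": queue_arn,
            }
        ],
    }
    return json.dumps(policy, indent=2)

# Hypothetical queue ARN for illustration only.
rendered = sqs_s3_policy("arn:aws:sqs:eu-west-3:123456789012:sekoia-queue")
```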

!!! Important
Keep in mind that you have to create the SQS queue in the same region as the S3 bucket you want to watch.
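A quick way to verify this: the region is the fourth colon-separated field of an ARN, so it can be compared against the bucket's region (values below are hypothetical):

```python
def arn_region(arn: str) -> str:
    # ARN format: arn:partition:service:region:account-id:resource
    return arn.split(":")[3]

# Hypothetical values for illustration.
queue_arn = "arn:aws:sqs:eu-west-3:123456789012:sekoia-queue"
bucket_region = "eu-west-3"  # e.g. obtained from the bucket's properties

same_region = arn_region(queue_arn) == bucket_region
```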

**Create an S3 Event Notification**

Use the [following guide](https://docs.aws.amazon.com/AmazonS3/latest/userguide/enable-event-notifications.html) to create the S3 Event Notification.
When creating it:

1. In the Event types section, select the object creation events
2. As the destination, choose the SQS service
3. Select the queue you created in the previous section
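Once the notification is in place, each new object yields an SQS message whose body is an S3 event document. A minimal sketch of extracting the bucket and key from such a message (the payload is abridged to the fields used here):

```python
import json

# Abridged S3 "ObjectCreated" event, as delivered in an SQS message body.
message_body = json.dumps({
    "Records": [{
        "eventName": "ObjectCreated:Put",
        "s3": {
            "bucket": {"name": "sekoia-logs-bucket"},
            "object": {"key": "AWSLogs/2024/04/04/log.gz"},
        },
    }]
})

def new_objects(body: str) -> list[tuple[str, str]]:
    """Return (bucket, key) pairs for every record in the event."""
    event = json.loads(body)
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]

objects = new_objects(message_body)
```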

6 changes: 2 additions & 4 deletions docs/xdr/features/collect/ingestion_methods/cloud_saas/aws.md
@@ -1,7 +1,5 @@
# Prerequisites to retrieve logs from AWS to Sekoia.io

When utilizing an AWS integration with Sekoia.io, the initial step involves centralizing your logs using AWS S3. This allows Sekoia.io to retrieve events seamlessly. The following page will guide you through the process of setting up these prerequisites.

{!_shared_content/operations_center/integrations/aws_create_sqs_queue.md!}

{!_shared_content/operations_center/integrations/aws_create_s3_notification.md!}
{!_shared_content/operations_center/integrations/aws_create_s3_sqs_notification.md!}
@@ -14,9 +14,18 @@ Amazon CloudFront is a web service that speeds up distribution of your static an

## Configure

### CloudFront Logs
!!! Important
    CloudFront has strict requirements regarding regions. Make sure to check [the available regions](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/AccessLogs.html#access-logs-choosing-s3-bucket) before creating the data collection architecture.

You can configure CloudFront to create log files that contain detailed information about every user request that CloudFront receives. These are called standard logs, also known as access logs. But you have first of all to pay attention to [regions and s3 buckets](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/AccessLogs.html#access-logs-choosing-s3-bucket) and also your AWS account must have the following permissions for the bucket that you specify for log files (see [link](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/AccessLogs.html#AccessLogsBucketAndFileOwnership)) :
{!_shared_content/operations_center/integrations/aws_create_s3_sqs_notification.md!}

#### CloudFront Logs

You can configure CloudFront to create log files that contain detailed information about every user request that CloudFront receives. These are called standard logs, also known as access logs.

First of all, follow the [regions and S3 buckets requirements](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/AccessLogs.html#access-logs-choosing-s3-bucket).

Your AWS account must also have the following permissions for the bucket that you specify for log files (see [this link](https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/AccessLogs.html#AccessLogsBucketAndFileOwnership) for more information):

- The S3 access control list (ACL) for the bucket must grant you FULL_CONTROL. If you're the bucket owner, your account has this permission by default. If you're not, the bucket owner must update the ACL for the bucket.
- s3:GetBucketAcl
@@ -31,10 +40,6 @@ To turn on standard logging for a CloudFront distribution, follow these steps:
5. Choose the S3 bucket where you want CloudFront to deliver the log files. You can specify an optional prefix for the file names.
6. Choose Save changes.
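The steps above can also be sketched with boto3; the distribution ID and bucket are hypothetical, and the API calls are left commented since they require credentials:

```python
# Logging block expected by CloudFront's DistributionConfig; the bucket
# must be given as a full S3 domain name.
def logging_config(bucket: str, prefix: str = "") -> dict:
    return {
        "Enabled": True,
        "IncludeCookies": False,
        "Bucket": f"{bucket}.s3.amazonaws.com",
        "Prefix": prefix,
    }

cfg = logging_config("sekoia-logs-bucket", "cloudfront/")

# With boto3 (not executed here), the flow would be roughly:
#   cf = boto3.client("cloudfront")
#   resp = cf.get_distribution_config(Id="EDFDVBD6EXAMPLE")
#   dist = resp["DistributionConfig"]
#   dist["Logging"] = cfg
#   cf.update_distribution(Id="EDFDVBD6EXAMPLE",
#                          IfMatch=resp["ETag"],
#                          DistributionConfig=dist)
```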

{!_shared_content/operations_center/integrations/aws_create_sqs_queue.md!}

{!_shared_content/operations_center/integrations/aws_create_s3_notification.md!}

### Create the intake

Go to the [intake page](https://app.sekoia.io/operations/intakes) and create a new intake from the format `AWS CloudFront`.
@@ -11,9 +11,13 @@ AWS CloudTrail is a service that enables governance, compliance, operational aud

## Configure

{!_shared_content/operations_center/integrations/aws_create_s3_sqs_notification.md!}

### CloudTrail trail

To allow CloudTrail to store logs in S3, you have to create an **AWS S3 Policy**. Follow [this guide](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/create-s3-bucket-policy-for-cloudtrail.html) to edit your S3 bucket policy.
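For illustration, the bucket policy described in that guide has the following shape; the bucket name and account ID are hypothetical placeholders:

```python
def cloudtrail_bucket_policy(bucket: str, account_id: str) -> dict:
    """S3 bucket policy allowing CloudTrail to check the ACL and write logs."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AWSCloudTrailAclCheck",
                "Effect": "Allow",
                "Principal": {"Service": "cloudtrail.amazonaws.com"},
                "Action": "s3:GetBucketAcl",
                "Resource": f"arn:aws:s3:::{bucket}",
            },
            {
                "Sid": "AWSCloudTrailWrite",
                "Effect": "Allow",
                "Principal": {"Service": "cloudtrail.amazonaws.com"},
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket}/AWSLogs/{account_id}/*",
                "Condition": {
                    "StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}
                },
            },
        ],
    }

# Hypothetical bucket and account ID.
policy = cloudtrail_bucket_policy("sekoia-logs-bucket", "123456789012")
```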

You need an existing [CloudTrail trail](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/creating-an-organizational-trail-in-the-console.html) configured to record activities from the services that you want to monitor.

In the AWS console, navigate to: `Services > CloudTrail > Trails`. From there, enable the events that you want to record:

@@ -23,10 +27,6 @@ In the AWS console, navigate to: `Services > CloudTrail > Trails`. From there, e

Activate the logging on the trail through the switch button (On/Off) located on the top right hand corner of the trail page.

{!_shared_content/operations_center/integrations/aws_create_sqs_queue.md!}

{!_shared_content/operations_center/integrations/aws_create_s3_notification.md!}

### Create the intake

Go to the [intake page](https://app.sekoia.io/operations/intakes) and create a new intake from the format `AWS CloudTrail`.
@@ -15,6 +15,8 @@ Amazon VPC Flow Logs is a feature that provides the ability to capture informati

## Configure

{!_shared_content/operations_center/integrations/aws_create_s3_sqs_notification.md!}

### VPC Flow Logs

As a prerequisite, you need an existing VPC, subnet or network interface (Elastic Load Balancing, Amazon RDS, Amazon ElastiCache, Amazon Redshift, Amazon WorkSpaces, NAT gateways, Transit gateways) to create a flow log. If you create a flow log for a subnet or VPC, each network interface in that subnet or VPC is monitored.
@@ -28,12 +30,7 @@ For VPC and subnet:
- Click on `Create flow log`
- Set up the flow log: we recommend capturing all traffic (accepted and rejected).

!!! note
    The AWS account must have direct access to the resources: the integration does not work with a managing account that assumes the admin role.

{!_shared_content/operations_center/integrations/aws_create_sqs_queue.md!}

{!_shared_content/operations_center/integrations/aws_create_s3_notification.md!}
Please follow [this guide](https://docs.aws.amazon.com/vpc/latest/userguide/flow-logs-s3.html) to configure and set up all the permissions needed.
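For reference, the same setup maps to the EC2 `create_flow_logs` API; this sketch only builds the parameter dict (the VPC ID and bucket ARN are hypothetical):

```python
# Parameters matching boto3's ec2.create_flow_logs; capture all traffic
# (accepted and rejected), delivered to S3, as recommended above.
def flow_log_params(vpc_id: str, bucket_arn: str) -> dict:
    return {
        "ResourceIds": [vpc_id],
        "ResourceType": "VPC",
        "TrafficType": "ALL",
        "LogDestinationType": "s3",
        "LogDestination": bucket_arn,
    }

# Hypothetical VPC ID and bucket ARN.
params = flow_log_params("vpc-0abc1234", "arn:aws:s3:::sekoia-logs-bucket")
# With credentials: boto3.client("ec2").create_flow_logs(**params)
```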

### Create the intake

@@ -11,17 +11,7 @@ AWS GuardDuty is a service that detects potential security issues within your ne

## Configure

### Prerequisites

#### Create a S3 bucket

Your GuardDuty findings will be collected in an Amazon S3 bucket.

To set up the bucket, please refer to [this guide](https://docs.aws.amazon.com/AmazonS3/latest/gsg/CreatingABucket.html).

{!_shared_content/operations_center/integrations/aws_create_sqs_queue.md!}

{!_shared_content/operations_center/integrations/aws_create_s3_notification.md!}
{!_shared_content/operations_center/integrations/aws_create_s3_sqs_notification.md!}

#### Forward findings to S3

@@ -11,9 +11,7 @@ Several AWS services offer to store their logs on an S3 bucket. This integration

## Configure

{!_shared_content/operations_center/integrations/aws_create_sqs_queue.md!}

{!_shared_content/operations_center/integrations/aws_create_s3_notification.md!}
{!_shared_content/operations_center/integrations/aws_create_s3_sqs_notification.md!}

### Create the intake

@@ -11,28 +11,22 @@ AWS WAF is a web application firewall that lets you monitor the HTTP(S) requests

## Configure

### Prerequisites
!!! important
In this guide, your S3 bucket for AWS WAF logging must start with `aws-waf-logs-` and can end with any suffix you want. For example, `aws-waf-logs-DOC-EXAMPLE-BUCKET-SUFFIX`. More information is available in [this guide](https://docs.aws.amazon.com/waf/latest/developerguide/logging-s3.html).
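This naming constraint is easy to check up front, for example:

```python
def is_valid_waf_log_bucket(name: str) -> bool:
    """AWS WAF only delivers logs to buckets named with the aws-waf-logs- prefix."""
    return name.startswith("aws-waf-logs-")

# Hypothetical bucket names for illustration.
ok = is_valid_waf_log_bucket("aws-waf-logs-example-suffix")
bad = is_valid_waf_log_bucket("my-logs-bucket")
```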

#### Create a S3 bucket

Your web ACL traffic logs will be collected in an Amazon S3 bucket.

To set up the bucket, please refer to [this guide](https://docs.aws.amazon.com/AmazonS3/latest/gsg/CreatingABucket.html).

{!_shared_content/operations_center/integrations/aws_create_sqs_queue.md!}

{!_shared_content/operations_center/integrations/aws_create_s3_notification.md!}
{!_shared_content/operations_center/integrations/aws_create_s3_sqs_notification.md!}

#### Forward traffic logs to S3

To forward events produced by AWS WAF to S3, you have to:

1. Configure the [Permissions required to publish logs to Amazon S3](https://docs.aws.amazon.com/waf/latest/developerguide/logging-s3.html#logging-s3-permissions) to authorize your bucket to receive AWS WAF logs
2. In your AWS console, navigate to: `Services > WAF & Shield > Web ACLs`
3. Select the web ACL whose logs you want to forward to your bucket
4. Select the `Logging and metrics` tab
5. In the first section, next to the title `Logging`, click the `Enable` button
6. Check `S3 bucket` as `Logging destination` and select your bucket in the dropdown
7. Click the `Save` button
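The console steps above map to the WAFv2 `put_logging_configuration` API; a sketch building only the configuration payload (both ARNs are hypothetical):

```python
def waf_logging_configuration(web_acl_arn: str, bucket_arn: str) -> dict:
    """LoggingConfiguration payload for wafv2.put_logging_configuration."""
    return {
        "ResourceArn": web_acl_arn,
        "LogDestinationConfigs": [bucket_arn],
    }

# Hypothetical web ACL and bucket ARNs.
config = waf_logging_configuration(
    "arn:aws:wafv2:us-east-1:123456789012:global/webacl/my-acl/abcd1234",
    "arn:aws:s3:::aws-waf-logs-example-suffix",
)
# With credentials:
#   boto3.client("wafv2").put_logging_configuration(LoggingConfiguration=config)
```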

### Create the intake

