docs: input from Aaron on the blog #11947

Merged 2 commits on Jun 3, 2024

Changes from all commits

@@ -29,7 +29,7 @@ To follow along, clone the project, `git clone https://github.com/catmeme/arti.g

AWS provides [a few ways to integrate with AWS S3 Events](https://docs.aws.amazon.com/AmazonS3/latest/userguide/notification-how-to-event-types-and-destinations.html); for this project, we're going to focus on the latest and most straightforward, [S3 Event Notifications](https://docs.aws.amazon.com/AmazonS3/latest/userguide/EventNotifications.html). Depending on your use case, integrating with [EventBridge](https://docs.aws.amazon.com/AmazonS3/latest/userguide/EventBridge.html) may make more sense.

-While we can integrate S3 events directly with Lambda, what happens when we add 100's of files at the same time? Potentially, we have 100's of Lambdas invoked near simultaneously, depending on other factors, this could become an issue. We'll solve for this by [Using Lambda with Amazon SQS](https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html), and [reserved concurrency](https://docs.aws.amazon.com/lambda/latest/operatorguide/reserved-concurrency.html).
+While we can integrate S3 events directly with Lambda, what happens when we add hundreds of files at the same time? Potentially, hundreds of Lambdas could be invoked simultaneously, which could become an issue depending on other factors. We'll solve for this by [Using Lambda with Amazon SQS](https://docs.aws.amazon.com/lambda/latest/dg/with-sqs.html) and [reserved concurrency](https://docs.aws.amazon.com/lambda/latest/operatorguide/reserved-concurrency.html).

Our Pulumi program is updated to add a Queue and integrate it with our new Lambda SQS handler (policies are left out for brevity; you can view them in `deploy/pulumi/__main__.py` in the project).

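Here's a minimal sketch of what that wiring can look like with `pulumi_aws`, assuming a `bucket` (`aws.s3.Bucket`) and a `lambda_role` (`aws.iam.Role`) already defined in the program; the resource names, handler path, and concurrency value are illustrative, and the real definitions (including the queue policy) live in `deploy/pulumi/__main__.py`:

```python
import pulumi
import pulumi_aws as aws

# Assumes `bucket` and `lambda_role` exist earlier in the program; the SQS
# queue policy allowing S3 to publish is omitted here, as in the post.

# Queue that buffers S3 "object created" notifications.
upload_queue = aws.sqs.Queue(
    "upload-queue",
    visibility_timeout_seconds=300,  # keep this above the Lambda timeout
)

# S3 Event Notifications: send object-created events to the queue.
aws.s3.BucketNotification(
    "uploads-to-queue",
    bucket=bucket.id,
    queues=[
        aws.s3.BucketNotificationQueueArgs(
            queue_arn=upload_queue.arn,
            events=["s3:ObjectCreated:*"],
        )
    ],
)

# The SQS handler; reserved concurrency caps how many instances run at once,
# so a burst of hundreds of uploads drains gradually instead of fanning out.
sqs_handler = aws.lambda_.Function(
    "sqs-handler",
    role=lambda_role.arn,
    runtime="python3.11",
    handler="app.sqs_handler",       # illustrative handler path
    code=pulumi.FileArchive("./build"),
    timeout=120,
    reserved_concurrent_executions=2,
)

# Event source mapping: Lambda polls the queue and receives messages in batches.
aws.lambda_.EventSourceMapping(
    "queue-to-handler",
    event_source_arn=upload_queue.arn,
    function_name=sqs_handler.arn,
    batch_size=10,
)
```

With reserved concurrency capped like this, a spike of object-created events simply accumulates in the queue and is worked off at a steady rate rather than invoking hundreds of Lambdas at once.
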
@@ -124,7 +124,7 @@ def handle_file_uploads(event, say, client, logger):
logger.error("No valid message found for reaction.")
```

-Keep in mind, we're not doing any validation here yet, we're blindly copying files from Slack to S3 and having our data loader process them. It's probably not a good idea to deploy this bot to a public Slack channel without more moderation/validation in place.
+Keep in mind, we're not doing any validation here yet. We're blindly copying files from Slack to S3 and having our data loader process them. It's probably not a good idea to deploy this bot to a public Slack channel without more moderation/validation in place.

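As an illustration of the kind of guard that could be added, here is a hypothetical allow-list check (not part of the project) that the upload handler could run against the Slack file object before copying anything to S3; the allowed types and size limit are placeholders:

```python
# Hypothetical validation helper, not part of the project.
ALLOWED_TYPES = {"application/pdf", "text/plain", "text/markdown"}
MAX_BYTES = 10 * 1024 * 1024  # 10 MB cap, chosen arbitrarily for this sketch

def is_acceptable_upload(file_info: dict) -> bool:
    """Return True if a Slack file object looks safe enough to ingest."""
    mimetype = file_info.get("mimetype", "")
    size = file_info.get("size", 0)
    return mimetype in ALLOWED_TYPES and size <= MAX_BYTES
```

If the check fails, the handler could skip the copy and reply with a short message instead of reacting with the checkmark.
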
Here's what it looks like when we drag/drop a file into Slack now. On the right, a file was shared with the bot in a private message. The bot acknowledged the upload with a checkmark reaction, and the job was placed on the queue. Shortly after, on the left, the SQS Lambda handler has finished processing the uploaded file. We're now able to use the bot to talk about the uploaded files.
