2.10.1 support #395

Merged 2 commits on Oct 2, 2024

4 changes: 4 additions & 0 deletions README.md
@@ -1,3 +1,7 @@
## Note
Starting with Airflow version 2.9, MWAA has open-sourced the Docker images used in our production deployments. You can refer to our open-source image repository at https://github.com/aws/amazon-mwaa-docker-images to create a local environment identical to that of MWAA.
You can also continue to use the MWAA Local Runner to test and package requirements for all Airflow versions supported on MWAA.

# About aws-mwaa-local-runner

This repository provides a command line interface (CLI) utility that replicates an Amazon Managed Workflows for Apache Airflow (MWAA) environment locally.
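
The README note above points to the open-source MWAA Docker images for an environment identical to MWAA's production image, while this repository remains the quickest way to test DAGs, plugins, and requirements against a specific Airflow version such as 2.10.1. A rough sketch of the workflow follows; the `mwaa-local-env` helper script and its `build-image`/`start` subcommands are assumed from the project README, so check your checkout for the exact commands:

```sh
# Clone the local runner, build the Airflow 2.10.1 image, and start the local stack.
git clone https://github.com/aws/aws-mwaa-local-runner.git
cd aws-mwaa-local-runner
./mwaa-local-env build-image   # assumed helper subcommand: builds docker/Dockerfile
./mwaa-local-env start         # assumed helper subcommand: starts webserver, scheduler, and database
```

Once the stack is up, the Airflow UI is typically reachable at http://localhost:8080 (confirm the port in the project README).
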
6 changes: 3 additions & 3 deletions docker/Dockerfile
@@ -8,9 +8,9 @@ LABEL maintainer="amazon"

# Airflow
## Version specific ARGs
-ARG AIRFLOW_VERSION=2.9.2
-ARG WATCHTOWER_VERSION=3.2.0
-ARG PROVIDER_AMAZON_VERSION=8.24.0
+ARG AIRFLOW_VERSION=2.10.1
+ARG WATCHTOWER_VERSION=3.3.1
+ARG PROVIDER_AMAZON_VERSION=8.28.0

## General ARGs
ARG AIRFLOW_USER_HOME=/usr/local/airflow
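
The bump above is purely ARG-driven: `AIRFLOW_VERSION`, `WATCHTOWER_VERSION`, and `PROVIDER_AMAZON_VERSION` are declared as Docker build arguments and consumed further down the Dockerfile. Standard `docker build --build-arg` overrides therefore apply; a minimal sketch, assuming `./docker` as the build context and an arbitrary image tag (the repository's helper script normally drives this build for you):

```sh
# Build the image directly with docker, pinning the same versions the Dockerfile now defaults to.
# --build-arg only overrides ARGs already declared in the Dockerfile; it does not introduce new ones.
docker build \
  --build-arg AIRFLOW_VERSION=2.10.1 \
  --build-arg WATCHTOWER_VERSION=3.3.1 \
  --build-arg PROVIDER_AMAZON_VERSION=8.28.0 \
  -t mwaa-local-runner:2.10.1 \
  ./docker
```

Arbitrary combinations may not build cleanly: the Airflow core version, the watchtower pin, and the Amazon provider pin generally have to move together, which is exactly what this commit does.
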
18 changes: 18 additions & 0 deletions docker/config/airflow.cfg
@@ -1031,3 +1031,21 @@ shards = 5

# comma separated sensor classes support in smart_sensor.
sensors_enabled = NamedHivePartitionSensor

[usage_data_collection]
# Airflow integrates `Scarf <https://about.scarf.sh/>`__ to collect basic platform and usage data
# during operation. This data assists Airflow maintainers in better understanding how Airflow is used.
# Insights gained from this telemetry are critical for prioritizing patches, minor releases, and
# security fixes. Additionally, this information supports key decisions related to the development road map.
# Check the FAQ doc for more information on what data is collected.
#
# Deployments can opt-out of analytics by setting the ``enabled`` option
# to ``False``, or the ``SCARF_ANALYTICS=false`` environment variable.
# Individual users can easily opt-out of analytics in various ways documented in the
# `Scarf Do Not Track docs <https://docs.scarf.sh/gateway/#do-not-track>`__.

# Enable or disable usage data collection and sending.
#
# Variable: AIRFLOW__USAGE_DATA_COLLECTION__ENABLED
#
enabled = False
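
The new `[usage_data_collection]` section opts the image out of Airflow's Scarf-based telemetry in the baked-in `airflow.cfg`. As the comments above note, the same opt-out can also be expressed at runtime; a small sketch using the environment variables named in those comments (how you inject them, for example via a compose file or startup script, depends on your setup):

```sh
# Either variable disables usage data collection at runtime; Airflow's
# AIRFLOW__SECTION__KEY environment variables take precedence over airflow.cfg.
export AIRFLOW__USAGE_DATA_COLLECTION__ENABLED=False
# Scarf's own do-not-track switch, also honoured per the config comments above:
export SCARF_ANALYTICS=false
```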