CI Workaround: Pin dbt-core, Disable SQLite Tests, and Correctly Ignore Clone Test to Pass CI #1337

Merged · 21 commits · Nov 26, 2024
137 changes: 69 additions & 68 deletions .github/workflows/test.yml
@@ -267,74 +267,75 @@ jobs:
          AIRFLOW_CONN_DATABRICKS_DEFAULT: ${{ secrets.AIRFLOW_CONN_DATABRICKS_DEFAULT }}
          DATABRICKS_CLUSTER_ID: ${{ secrets.DATABRICKS_CLUSTER_ID }}

-  Run-Integration-Tests-Sqlite:
-    needs: Authorize
-    runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        python-version: ["3.11"]
-        airflow-version: ["2.8"]
-
-    steps:
-      - uses: actions/checkout@v3
-        with:
-          ref: ${{ github.event.pull_request.head.sha || github.ref }}
-      - uses: actions/cache@v3
-        with:
-          path: |
-            ~/.cache/pip
-            .local/share/hatch/
-          key: integration-sqlite-${{ runner.os }}-${{ matrix.python-version }}-${{ matrix.airflow-version }}-${{ hashFiles('pyproject.toml') }}-${{ hashFiles('cosmos/__init__.py') }}
-
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
-        with:
-          python-version: ${{ matrix.python-version }}
-
-      - name: Install packages and dependencies
-        run: |
-          python -m pip install uv
-          uv pip install --system hatch
-          hatch -e tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }} run pip freeze
-
-      - name: Test Cosmos against Airflow ${{ matrix.airflow-version }} and Python ${{ matrix.python-version }}
-        run: |
-          hatch run tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }}:test-integration-sqlite-setup
-          hatch run tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }}:test-integration-sqlite
-        env:
-          AIRFLOW_HOME: /home/runner/work/astronomer-cosmos/astronomer-cosmos/
-          AIRFLOW_CONN_EXAMPLE_CONN: postgres://postgres:[email protected]:5432/postgres
-          AIRFLOW_CONN_AWS_S3_CONN: ${{ secrets.AIRFLOW_CONN_AWS_S3_CONN }}
-          AIRFLOW_CONN_GCP_GS_CONN: ${{ secrets.AIRFLOW_CONN_GCP_GS_CONN }}
-          AIRFLOW_CONN_AZURE_ABFS_CONN: ${{ secrets.AIRFLOW_CONN_AZURE_ABFS_CONN }}
-          AIRFLOW__CORE__DAGBAG_IMPORT_TIMEOUT: 90.0
-          PYTHONPATH: /home/runner/work/astronomer-cosmos/astronomer-cosmos/:$PYTHONPATH
-          AIRFLOW__COSMOS__ENABLE_CACHE: 0
-          COSMOS_CONN_POSTGRES_PASSWORD: ${{ secrets.COSMOS_CONN_POSTGRES_PASSWORD }}
-          DATABRICKS_CLUSTER_ID: mock
-          DATABRICKS_HOST: mock
-          DATABRICKS_WAREHOUSE_ID: mock
-          DATABRICKS_TOKEN: mock
-          POSTGRES_HOST: localhost
-          POSTGRES_USER: postgres
-          POSTGRES_PASSWORD: postgres
-          POSTGRES_DB: postgres
-          POSTGRES_SCHEMA: public
-          POSTGRES_PORT: 5432
-          AIRFLOW__COSMOS__REMOTE_TARGET_PATH: "s3://cosmos-remote-cache/target_compiled/"
-          AIRFLOW__COSMOS__REMOTE_TARGET_PATH_CONN_ID: aws_s3_conn
-
-      - name: Upload coverage to Github
-        uses: actions/upload-artifact@v4
-        with:
-          name: coverage-integration-sqlite-test-${{ matrix.python-version }}-${{ matrix.airflow-version }}
-          path: .coverage
-          include-hidden-files: true
-
-    env:
-      AIRFLOW_HOME: /home/runner/work/astronomer-cosmos/astronomer-cosmos/
-      AIRFLOW_CONN_EXAMPLE_CONN: postgres://postgres:[email protected]:5432/postgres
-      PYTHONPATH: /home/runner/work/astronomer-cosmos/astronomer-cosmos/:$PYTHONPATH
+# TODO: https://github.com/astronomer/astronomer-cosmos/issues/1341
+#  Run-Integration-Tests-Sqlite:
+#    needs: Authorize
+#    runs-on: ubuntu-latest
+#    strategy:
+#      matrix:
+#        python-version: ["3.11"]
+#        airflow-version: ["2.8"]
+#
+#    steps:
+#      - uses: actions/checkout@v3
+#        with:
+#          ref: ${{ github.event.pull_request.head.sha || github.ref }}
+#      - uses: actions/cache@v3
+#        with:
+#          path: |
+#            ~/.cache/pip
+#            .local/share/hatch/
+#          key: integration-sqlite-${{ runner.os }}-${{ matrix.python-version }}-${{ matrix.airflow-version }}-${{ hashFiles('pyproject.toml') }}-${{ hashFiles('cosmos/__init__.py') }}
+#
+#      - name: Set up Python ${{ matrix.python-version }}
+#        uses: actions/setup-python@v4
+#        with:
+#          python-version: ${{ matrix.python-version }}
+#
+#      - name: Install packages and dependencies
+#        run: |
+#          python -m pip install uv
+#          uv pip install --system hatch
+#          hatch -e tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }} run pip freeze
+#
+#      - name: Test Cosmos against Airflow ${{ matrix.airflow-version }} and Python ${{ matrix.python-version }}
+#        run: |
+#          hatch run tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }}:test-integration-sqlite-setup
+#          hatch run tests.py${{ matrix.python-version }}-${{ matrix.airflow-version }}:test-integration-sqlite
+#        env:
+#          AIRFLOW_HOME: /home/runner/work/astronomer-cosmos/astronomer-cosmos/
+#          AIRFLOW_CONN_EXAMPLE_CONN: postgres://postgres:[email protected]:5432/postgres
+#          AIRFLOW_CONN_AWS_S3_CONN: ${{ secrets.AIRFLOW_CONN_AWS_S3_CONN }}
+#          AIRFLOW_CONN_GCP_GS_CONN: ${{ secrets.AIRFLOW_CONN_GCP_GS_CONN }}
+#          AIRFLOW_CONN_AZURE_ABFS_CONN: ${{ secrets.AIRFLOW_CONN_AZURE_ABFS_CONN }}
+#          AIRFLOW__CORE__DAGBAG_IMPORT_TIMEOUT: 90.0
+#          PYTHONPATH: /home/runner/work/astronomer-cosmos/astronomer-cosmos/:$PYTHONPATH
+#          AIRFLOW__COSMOS__ENABLE_CACHE: 0
+#          COSMOS_CONN_POSTGRES_PASSWORD: ${{ secrets.COSMOS_CONN_POSTGRES_PASSWORD }}
+#          DATABRICKS_CLUSTER_ID: mock
+#          DATABRICKS_HOST: mock
+#          DATABRICKS_WAREHOUSE_ID: mock
+#          DATABRICKS_TOKEN: mock
+#          POSTGRES_HOST: localhost
+#          POSTGRES_USER: postgres
+#          POSTGRES_PASSWORD: postgres
+#          POSTGRES_DB: postgres
+#          POSTGRES_SCHEMA: public
+#          POSTGRES_PORT: 5432
+#          AIRFLOW__COSMOS__REMOTE_TARGET_PATH: "s3://cosmos-remote-cache/target_compiled/"
+#          AIRFLOW__COSMOS__REMOTE_TARGET_PATH_CONN_ID: aws_s3_conn
+#
+#      - name: Upload coverage to Github
+#        uses: actions/upload-artifact@v4
+#        with:
+#          name: coverage-integration-sqlite-test-${{ matrix.python-version }}-${{ matrix.airflow-version }}
+#          path: .coverage
+#          include-hidden-files: true
+#
+#    env:
+#      AIRFLOW_HOME: /home/runner/work/astronomer-cosmos/astronomer-cosmos/
+#      AIRFLOW_CONN_EXAMPLE_CONN: postgres://postgres:[email protected]:5432/postgres
+#      PYTHONPATH: /home/runner/work/astronomer-cosmos/astronomer-cosmos/:$PYTHONPATH

  Run-Integration-Tests-DBT-1-5-4:
    needs: Authorize
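Commenting the job out removes the SQLite suite from CI without leaving a trace in the test run itself. A pytest-level alternative, sketched below under the assumption that the SQLite integration tests live in their own module (the module-level guard is hypothetical and not part of this PR), would surface the skip and the tracking issue in the test report instead:

```python
# Hypothetical module-level guard for the SQLite integration test module:
# pytest skips every test collected from the module and reports the reason,
# keeping the tracking issue visible in CI output.
import pytest

pytestmark = pytest.mark.skip(
    reason="SQLite integration tests temporarily disabled; "
    "see https://github.com/astronomer/astronomer-cosmos/issues/1341"
)
```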
2 changes: 2 additions & 0 deletions pyproject.toml
@@ -144,6 +144,8 @@ dependencies = [
"types-requests",
"types-python-dateutil",
"Werkzeug<3.0.0",
"eval_type_backport", # TODO: https://github.com/astronomer/astronomer-cosmos/issues/1342
"dbt-core<1.8.9" # TODO: https://github.com/astronomer/astronomer-cosmos/issues/1343
]
pre-install-commands = ["sh scripts/test/pre-install-airflow.sh {matrix:airflow} {matrix:python}"]

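A lightweight way to confirm the resolver honored the new pin (a sketch only; this check is not part of the change):

```python
# Verify the installed dbt-core respects the "<1.8.9" pin above, using
# importlib.metadata so dbt itself never has to be imported.
from importlib.metadata import version

from packaging.version import Version

assert Version(version("dbt-core")) < Version("1.8.9"), (
    "dbt-core must stay below 1.8.9 until "
    "https://github.com/astronomer/astronomer-cosmos/issues/1343 is resolved"
)
```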
2 changes: 1 addition & 1 deletion scripts/test/integration-sqlite-setup.sh
@@ -1,4 +1,4 @@
pip uninstall -y dbt-core dbt-sqlite openlineage-airflow openlineage-integration-common; \
rm -rf airflow.*; \
airflow db init; \
pip install 'dbt-core==1.4' 'dbt-sqlite<=1.4' 'dbt-databricks<=1.4' 'dbt-postgres<=1.4'
pip install 'dbt-core==1.4' 'dbt-sqlite==1.4' 'dbt-databricks==1.4' 'dbt-postgres==1.4' #'databricks-sdk==0.16.0'
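Replacing the `<=1.4` ranges with exact `==1.4` pins stops the resolver from backtracking to older adapter releases, so every run installs the same set. A post-install check along these lines could confirm it (illustrative; it assumes each adapter publishes a plain 1.4/1.4.0 release and is not part of the script):

```python
# Illustrative check that all dbt packages pinned by the setup script
# resolved to exactly version 1.4.0.
from importlib.metadata import version

from packaging.version import Version

for pkg in ("dbt-core", "dbt-sqlite", "dbt-databricks", "dbt-postgres"):
    resolved = Version(version(pkg))
    assert resolved == Version("1.4.0"), f"{pkg} resolved to {resolved}, expected 1.4.0"
```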
4 changes: 2 additions & 2 deletions scripts/test/kubernetes-setup.sh
@@ -5,8 +5,8 @@
set -x
set -e


-pip install dbt-postgres==1.8.2 psycopg2==2.9.3 pytz
+# TODO: https://github.com/astronomer/astronomer-cosmos/issues/1344
+pip install 'dbt-postgres<1.8' 'psycopg2==2.9.3' 'pytz'

# Create a Kubernetes secret named 'postgres-secrets' with the specified literals for host and password
kubectl create secret generic postgres-secrets \
2 changes: 1 addition & 1 deletion tests/test_example_dags.py
@@ -79,7 +79,7 @@ def get_dag_bag() -> DagBag:
file.writelines(["example_cosmos_sources.py\n"])
if DBT_VERSION < Version("1.6.0"):
file.writelines(["example_model_version.py\n"])
file.writelines(["example_clone.py\n"])
file.writelines(["example_operators.py\n"])

if DBT_VERSION < Version("1.5.0"):
file.writelines(["example_source_rendering.py\n"])
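The underlying fix: the dbt clone example DAG lives in `example_operators.py`, so ignoring `example_clone.py` never excluded anything, and example-DAG parsing failed on dbt < 1.6 (where `dbt clone` does not exist). A condensed sketch of the ignore mechanism (the version constant and directory path are illustrative):

```python
# Condensed sketch of the guard above: DagBag skips any file listed in
# .airflowignore, so on dbt < 1.6 the DAG exercising dbt clone is kept out.
from pathlib import Path

from packaging.version import Version

DBT_VERSION = Version("1.5.4")       # illustrative; the real test resolves this at runtime
EXAMPLE_DAGS_DIR = Path("dev/dags")  # illustrative path
EXAMPLE_DAGS_DIR.mkdir(parents=True, exist_ok=True)

with open(EXAMPLE_DAGS_DIR / ".airflowignore", "w") as file:
    if DBT_VERSION < Version("1.6.0"):
        file.writelines(["example_model_version.py\n"])  # model versions need dbt >= 1.6
        file.writelines(["example_operators.py\n"])      # dbt clone needs dbt >= 1.6
```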
2 changes: 1 addition & 1 deletion tests/test_example_dags_no_connections.py
@@ -55,7 +55,7 @@ def get_dag_bag() -> DagBag:

if DBT_VERSION < Version("1.6.0"):
file.writelines(["example_model_version.py\n"])
file.writelines(["example_clone.py\n"])
file.writelines(["example_operators.py\n"])
# cosmos_profile_mapping uses the automatic profile rendering from an Airflow connection.
# so we can't parse that without live connections
for file_name in ["cosmos_profile_mapping.py"]: