Add support to run dbt Python models #375
Codecov Report

```diff
@@            Coverage Diff             @@
##             main     #375      +/-   ##
==========================================
+ Coverage   90.98%   91.03%   +0.04%
==========================================
  Files          45       45
  Lines        1542     1539       -3
==========================================
- Hits         1403     1401       -2
+ Misses        139      138       -1
```
The only failing test is a coverage check related to a part of the codebase that was already there and was not previously tested. Since this PR is already quite extensive, I suggest we disregard this check as an exception.
```shell
cp dev/dags/basic_cosmos_dag.py dev/dags/example_cosmos_python_models.py
cp -r dev/dags/dbt/jaffle_shop dev/dags/dbt/jaffle_shop_python
```
Add support to run dbt Python models, as described in the official documentation: https://docs.getdbt.com/docs/build/python-models
Add an example of Cosmos running Python models on Databricks (Postgres is not supported as of dbt 1.6).
The dbt example can be run by itself from the directory `dev/dags/dbt/jaffle_shop_python` by exporting the required environment variables and then running dbt.
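As a hedged sketch of the kind of configuration this step needs: the variable names below are assumptions (the real names depend on how the example project's `profiles.yml` reads them, e.g. via `env_var()`), and all values are placeholders.

```python
import os

# Hypothetical variable names and placeholder values; the actual names depend
# on how dev/dags/dbt/jaffle_shop_python consumes them. Export the shell
# equivalents of these before invoking `dbt run` from that directory.
os.environ["DATABRICKS_HOST"] = "dbc-REDACTED.cloud.databricks.com"
os.environ["DATABRICKS_HTTP_PATH"] = "/sql/1.0/warehouses/REDACTED"
os.environ["DATABRICKS_TOKEN"] = "dapi-REDACTED"
os.environ["DATABRICKS_SCHEMA"] = "jaffle_shop_python"
```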
To validate the feature from a Cosmos perspective, set up the `databricks_default` connection; one way of accomplishing this is with environment variables. From a previously set-up Airflow environment, run the `example_cosmos_python_models` DAG, for example via the Airflow CLI. This feature was validated with
`load_mode=LoadMode.DBT_LS` and `LoadMode.CUSTOM` (a screenshot of the rendered DAG accompanies the PR). Review can be simplified by checking the commits individually, especially those between 796d6e8 and 78a7d31.
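One way to provide the `databricks_default` connection mentioned above is Airflow's `AIRFLOW_CONN_<CONN_ID>` environment-variable mechanism. The URI below is a hypothetical shape with placeholder credentials; consult the Airflow Databricks provider documentation for the exact fields it expects.

```python
import os

# Airflow resolves connections from AIRFLOW_CONN_<CONN_ID> variables holding a
# connection URI. Hypothetical URI shape and placeholder credentials; the
# exact fields are defined by apache-airflow-providers-databricks.
os.environ["AIRFLOW_CONN_DATABRICKS_DEFAULT"] = (
    "databricks://@dbc-REDACTED.cloud.databricks.com/?token=dapi-REDACTED"
)

# With the connection in place, the DAG can be exercised from the CLI, e.g.:
#   airflow dags test example_cosmos_python_models 2023-01-01
```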
The downside of this change is that the integration tests are now slower to run and depend on the following environment variables being set in CI:

- `AIRFLOW_CONN_DATABRICKS_DEFAULT`
- `DATABRICKS_CLUSTER_ID`
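Because the integration tests depend on those variables, CI (or local runs) may want to fail fast, or skip, when they are missing. The helper below is a hypothetical illustration of that check, not code from this PR:

```python
import os

# Hypothetical helper (not part of this PR): names of the CI variables the
# Databricks integration tests depend on.
REQUIRED_CI_VARS = ("AIRFLOW_CONN_DATABRICKS_DEFAULT", "DATABRICKS_CLUSTER_ID")

def missing_databricks_env(environ=os.environ):
    """Return the required CI variable names that are unset or empty."""
    return [name for name in REQUIRED_CI_VARS if not environ.get(name)]

# A test session could skip the Databricks integration tests whenever
# missing_databricks_env() returns a non-empty list.
```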
Closes: #182