When we use Helm to install Airflow, our own example DAGs should ship with it directly, and everything should be handled automatically (no manual copying of DAGs into the container).
For now we can use the default examples. Unfortunately, their runtime is very short, and it would be better if we could monitor a job that runs for at least a minute. Therefore we need a solution that lets us add our own example easily.
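A minimal sketch of the kind of workload we could ship (all names here are hypothetical): a function that sleeps in small steps and reports progress, so the task runs for roughly a minute and is easy to observe in the Airflow UI. In an actual DAG file this function would be wrapped in a PythonOperator or a @task-decorated task; it is shown standalone here so the logic is clear.

```python
import time


def long_running_job(duration_s: int = 60, step_s: int = 5) -> int:
    """Sleep in small steps and report progress, so the task runs
    long enough (~1 minute) to be monitored in the Airflow UI.

    Returns the number of completed steps.
    """
    elapsed = 0
    steps = 0
    while elapsed < duration_s:
        time.sleep(step_s)
        elapsed += step_s
        steps += 1
        # Progress lines show up in the task log while the job runs.
        print(f"progress: {elapsed}/{duration_s}s")
    return steps
```

The step size is a trade-off: smaller steps give more frequent log output, larger steps produce less noise.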
I added the default default-airflow-values.yaml file to our repo because it already contains an example of what git-sync for DAGs could look like, in case we decide to use that in the future.
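For illustration, the relevant values for the Helm chart's git-sync option might look roughly like this (a sketch; the repo URL, branch, and subPath are placeholders, not our actual settings):

```yaml
dags:
  gitSync:
    enabled: true
    # Repository that holds our DAG files (placeholder URL).
    repo: https://github.com/example-org/example-dags.git
    branch: main
    # Path within the repo where the DAG files live (placeholder).
    subPath: dags
```

With this in place, the scheduler would pull DAGs from the repository automatically instead of us copying files into the pod.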
If we want Airflow to recognize our DAGs, we can 'kubectl cp' them into the "/opt/airflow/dags/" directory of the airflow-scheduler pod. Alternatively, we can create a persistent volume for this purpose and mount it as Airflow's default DAG directory.
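The manual copy would look roughly like this (a sketch against a live cluster; the pod name, namespace, and file name are assumptions, so look up the actual pod with 'kubectl get pods' first):

```shell
# Copy a local DAG file into the scheduler pod's DAG folder.
kubectl cp ./dags/demo_dag.py \
  airflow-scheduler-0:/opt/airflow/dags/demo_dag.py \
  -n airflow
```

Note that this copy does not survive a pod restart, which is exactly why the git-sync or persistent-volume approaches are preferable.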
e.g.
https://airflow.apache.org/docs/helm-chart/stable/quick-start.html
https://cloud.google.com/composer/docs/run-apache-airflow-dag