Run a demonstration airflow environment:

$ make build

- builds the airflow docker image with spell installed

$ make init

- initialises databases and creates the default user `airflow` with password `airflow` (NB this only needs to be run once)

$ make up

- launches the airflow environment at http://0.0.0.0:8080

DAGs in the `dags` directory will be visible to the testing airflow instance.
Put the token from above in a file called `settings.env` in the root of the directory:

SPELL_TOKEN=<... your spell token ...>

Then issue

$ make add-spell-connection

and (as long as the docker-compose cluster is running) the spell credentials will be added to the airflow connections list.
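If you prefer to register the connection yourself (for example, outside the docker-compose setup), the sketch below uses Airflow's ORM to do the equivalent. The conn_id and field mapping follow the Connection section below; reading the token from the `SPELL_TOKEN` environment variable is an assumption about your local setup, and the `conn_type` is a placeholder.

```python
# Sketch: register the spell connection programmatically.
# Assumes SPELL_TOKEN is set in the environment (e.g. sourced from settings.env).
import os

from airflow import settings
from airflow.models import Connection

session = settings.Session()
session.add(
    Connection(
        conn_id="spell_conn_id",              # default id expected by SpellRunOperator
        conn_type="http",                     # placeholder; spell auth only uses password/host
        password=os.environ["SPELL_TOKEN"],   # spell.run token
    )
)
session.commit()
```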
One of:

- Run `$ pip install -e .` in this directory
- Add `airflow_spell` to the `PYTHONPATH` environment variable
- Add `airflow_spell` to the airflow plugins directory
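Whichever route you choose, a quick way to confirm the package is on the path is a bare import (the import location is assumed from the package name; adjust if your layout differs):

```python
# Sanity check: the operator should be importable once installation succeeds.
# The import path is an assumption based on the package name `airflow_spell`.
from airflow_spell import SpellRunOperator

print(SpellRunOperator.__module__)
```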
The default connection id is called `spell_conn_id` (but this can be overridden in `SpellRunOperator`).

The following fields in the connection map to the spell.run authentication system:

- `password: str` - put your spell.run token here (from `~/.spell/config` when authenticated from the CLI)
- `host: Optional[str]` - your spell.run "owner", the entity that "owns" some object in spell; useful if you wish to launch runs in a team account, where `host` could be your team name
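For reference, this is roughly how those fields come back out of Airflow at run time. The sketch uses the stock `BaseHook` and mirrors the mapping above; it is not necessarily the operator's exact internals.

```python
# Sketch: how the connection fields map to spell.run credentials.
from airflow.hooks.base_hook import BaseHook

conn = BaseHook.get_connection("spell_conn_id")
token = conn.password  # the spell.run token
owner = conn.host      # the spell.run "owner" (e.g. a team name); may be None
```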
Create spell runs using the `SpellRunOperator`:

```python
from airflow_spell import SpellRunOperator  # import path assumed from the package name

hello_task = SpellRunOperator(
    task_id="spell-task",
    command='python -c "import sys; sys.stderr.write(sys.version)"',
    spell_conn_id="spell_conn_id",
    spell_owner="organisation",
    machine_type="GPU-V100",
)
```
- `task_id: (str)` must be set for all DAG operators
- `command: (str)` the command to be executed in the spell run (see the spell API docs)
- `spell_conn_id: (str)` the name of the Airflow connection set up in the Connection section
- `spell_owner: (str)` by default your spell user name; overriding this is helpful if you have an organizational plan
- `machine_type: (str)` the type of machine to run the spell run on (default "CPU")
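Putting these parameters together, here is a minimal sketch of a complete DAG file to drop into the `dags` directory. The DAG id, start date, and schedule are illustrative choices, not prescribed by the package.

```python
# Minimal example DAG using SpellRunOperator.
from datetime import datetime

from airflow import DAG
from airflow_spell import SpellRunOperator  # import path assumed as above

with DAG(
    dag_id="spell_example",             # illustrative name
    start_date=datetime(2020, 1, 1),
    schedule_interval=None,             # trigger manually
) as dag:
    hello_task = SpellRunOperator(
        task_id="spell-task",
        command='python -c "import sys; sys.stderr.write(sys.version)"',
        spell_conn_id="spell_conn_id",
        machine_type="CPU",
    )
```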
To build the source and binary distributions run
$ make release
This requires `twine` to be installed - it is listed in `dev-requirements.txt`.
(Remember to bump the version number in `setup.py`!)
To upload the release, run
$ make upload-release
NB You will be prompted for a PyPI username and password.
- apache-airflow 1.10.6

This version has an unmet / unlisted dependency on `blinker`. The `blinker` module must be installed when installing this version of apache-airflow.
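For example, installing `blinker` alongside (or before) airflow avoids the missing-dependency error:

$ pip install blinker apache-airflow==1.10.6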