
Commit

fix: adding secrets to sdk
vijayvammi committed Apr 17, 2024
1 parent 4d7e29d commit fe5feb4
Showing 25 changed files with 995 additions and 366 deletions.
40 changes: 40 additions & 0 deletions examples/03-parameters/passing_parameters_notebook.py
@@ -0,0 +1,40 @@
from examples.common.functions import read_parameter
from runnable import NotebookTask, Pipeline, PythonTask, metric, pickled


def main():
    write_parameters_from_notebook = NotebookTask(
        notebook="examples/common/write_parameters.ipynb",
        returns=[
            pickled("df"),
            "integer",
            "floater",
            "stringer",
            "pydantic_param",
            metric("score"),
        ],
        name="set_parameter",
    )

    read_parameters = PythonTask(
        function=read_parameter,
        name="get_parameters",
    )

    read_parameters_in_notebook = NotebookTask(
        notebook="examples/common/read_parameters.ipynb",
        terminate_with_success=True,
        name="read_parameters_in_notebook",
    )

    pipeline = Pipeline(
        steps=[write_parameters_from_notebook, read_parameters, read_parameters_in_notebook],
    )

    _ = pipeline.execute()

    return pipeline


if __name__ == "__main__":
    main()
52 changes: 52 additions & 0 deletions examples/03-parameters/passing_parameters_python.py
@@ -0,0 +1,52 @@
"""
The below example shows how to set/get parameters in python
tasks of the pipeline.

The function, write_parameter, returns:

- simple python data types (int, float, str)
- pydantic models
- pandas dataframes, or any other "object" type

pydantic models are implicitly handled by runnable,
but "object" types should be marked as "pickled".

Using "pickled" even for simple python data types is advised for
reasonably large collections.
"""

from examples.common.functions import read_parameter, write_parameter
from runnable import Pipeline, PythonTask, metric, pickled


def main():
    write_parameters = PythonTask(
        function=write_parameter,
        returns=[
            pickled("df"),
            "integer",
            "floater",
            "stringer",
            "pydantic_param",
            metric("score"),
        ],
        name="set_parameter",
    )

    read_parameters = PythonTask(
        function=read_parameter,
        terminate_with_success=True,
        name="get_parameters",
    )

    pipeline = Pipeline(
        steps=[write_parameters, read_parameters],
    )

    _ = pipeline.execute()

    return pipeline


if __name__ == "__main__":
    main()
41 changes: 41 additions & 0 deletions examples/03-parameters/passing_parameters_shell.py
@@ -0,0 +1,41 @@
from examples.common.functions import read_unpickled_parameter
from runnable import Pipeline, PythonTask, ShellTask, metric


def main():
    export_env_command = """
    export integer=1
    export floater=3.14
    export stringer="hello"
    export pydantic_param='{"x": 10, "foo": "bar"}'
    export score=0.9
    """
    write_parameters_in_shell = ShellTask(
        command=export_env_command,
        returns=[
            "integer",
            "floater",
            "stringer",
            "pydantic_param",
            metric("score"),
        ],
        name="write_parameter",
    )

    read_parameters = PythonTask(
        function=read_unpickled_parameter,
        name="read_parameters",
        terminate_with_success=True,
    )

    pipeline = Pipeline(
        steps=[write_parameters_in_shell, read_parameters],
    )

    _ = pipeline.execute()

    return pipeline


if __name__ == "__main__":
    main()
55 changes: 55 additions & 0 deletions examples/03-parameters/static_parameters_non_python.py
@@ -0,0 +1,55 @@
"""
The below example showcases setting up known initial parameters for a pipeline
of notebook and shell based commands.

The initial parameters as defined in the yaml file are:

    integer: 1
    floater: 3.14
    stringer: hello
    pydantic_param:
      x: 10
      foo: bar

runnable exposes the nested parameters as a dictionary for notebook based tasks
and as a json string for the shell based tasks.
"""

from runnable import NotebookTask, Pipeline, ShellTask


def main():
    read_params_in_notebook = NotebookTask(
        name="read_params_in_notebook",
        notebook="examples/common/read_parameters.ipynb",
    )

    shell_command = """
    if [ "$integer" = 1 ] \
        && [ "$floater" = 3.14 ] \
        && [ "$stringer" = "hello" ] \
        && [ "$pydantic_param" = '{"x": 10, "foo": "bar"}' ]; then
            echo "yaay"
            exit 0;
    else
        echo "naay"
        exit 1;
    fi
    """
    read_params_in_shell = ShellTask(
        name="read_params_in_shell",
        command=shell_command,
        terminate_with_success=True,
    )

    pipeline = Pipeline(
        steps=[read_params_in_notebook, read_params_in_shell],
    )

    _ = pipeline.execute(parameters_file="examples/common/initial_parameters.yaml")

    return pipeline


if __name__ == "__main__":
    main()
41 changes: 41 additions & 0 deletions examples/03-parameters/static_parameters_non_python.yaml
@@ -0,0 +1,41 @@
dag:
  description: |
    The below example showcases setting up known initial parameters for a pipeline
    of notebook and shell based commands.

    The initial parameters as defined in the yaml file are:

      integer: 1
      floater: 3.14
      stringer: hello
      pydantic_param:
        x: 10
        foo: bar

    runnable exposes the nested parameters as a dictionary for notebook based tasks
    and as a json string for the shell based tasks.
  start_at: read_params_in_notebook
  steps:
    read_params_in_notebook:
      type: task
      command_type: notebook
      command: examples/common/read_parameters.ipynb
      next: read_params_in_shell
    read_params_in_shell:
      type: task
      command_type: shell
      command: |
        if [ "$integer" = 1 ] \
          && [ "$floater" = 3.14 ] \
          && [ "$stringer" = "hello" ] \
          && [ "$pydantic_param" = '{"x": 10, "foo": "bar"}' ]; then
            echo "yaay"
            exit 0;
        else
          echo "naay"
          exit 1;
        fi
      next: success
    success:
      type: success
    fail:
      type: fail
49 changes: 49 additions & 0 deletions examples/03-parameters/static_parameters_python.py
@@ -0,0 +1,49 @@
"""
The below example showcases setting up known initial parameters for a pipeline
of only python tasks.

The initial parameters as defined in the yaml file are:

    simple: 1
    complex_param:
      x: 10
      y: "hello world!!"

runnable allows using pydantic models for deeply nested parameters and
casts them appropriately based on the annotation, eg: read_initial_params_as_pydantic.
If no annotation is provided, the parameter is assumed to be a dictionary,
eg: read_initial_params_as_json.
"""

from examples.common.functions import (
read_initial_params_as_json,
read_initial_params_as_pydantic,
)
from runnable import Pipeline, PythonTask


def main():
    read_params_as_pydantic = PythonTask(
        function=read_initial_params_as_pydantic,
        name="read_params_as_pydantic",
    )

    read_params_as_json = PythonTask(
        function=read_initial_params_as_json,
        terminate_with_success=True,
        name="read_params_json",
    )

    pipeline = Pipeline(
        steps=[read_params_as_pydantic, read_params_as_json],
        add_terminal_nodes=True,
    )

    _ = pipeline.execute(parameters_file="examples/common/initial_parameters.yaml")

    return pipeline


if __name__ == "__main__":
    main()
30 changes: 30 additions & 0 deletions examples/03-parameters/static_parameters_python.yaml
@@ -0,0 +1,30 @@
dag:
  description: |
    The below example showcases setting up known initial parameters for a pipeline
    of only python tasks.

    The initial parameters as defined in the yaml file are:

      simple: 1
      complex_param:
        x: 10
        y: "hello world!!"

    runnable allows using pydantic models for deeply nested parameters and
    casts them appropriately based on the annotation, eg: read_initial_params_as_pydantic.
    If no annotation is provided, the parameter is assumed to be a dictionary,
    eg: read_initial_params_as_json.
  start_at: read_params_as_pydantic
  steps:
    read_params_as_pydantic:
      type: task
      command: examples.common.functions.read_initial_params_as_pydantic
      next: read_params_json
    read_params_json:
      type: task
      command: examples.common.functions.read_initial_params_as_json
      next: success
    success:
      type: success
    fail:
      type: fail
28 changes: 22 additions & 6 deletions examples/README.md
@@ -23,14 +23,30 @@ The next section has examples on stitching these tasks together for complex operations.

- 02-sequential: Examples of stitching tasks together including behavior in case of failures.

- [traversal.py](./02-sequential/traversal.py), [traversal.yaml](./02-sequential/traversal.yaml): A pipeline which is a mixed bag of notebooks, python functions and
shell scripts.
- [default_fail.py](./02-sequential/default_fail.py), [default_fail.yaml](./02-sequential/default_fail.yaml): The default failure behavior.
- [on_failure_fail.py](./02-sequential/on_failure_fail.py), [on_failure_fail.yaml](./02-sequential/on_failure_fail.yaml): On failure of a step, do some action and fail.
- [on_failure_succeed.py](./02-sequential/on_failure_succeed.py), [on_failure_succeed.yaml](./02-sequential/on_failure_succeed.yaml): On failure of a step, take a different route.


The above examples show stitching complex operations of the pipeline.
The next section has examples on communicating between tasks during execution.

- 03: Examples of passing parameters between tasks of a pipeline.

Guidelines:

- python functions can get/set simple python data types, pydantic models, and objects marked as `pickled`. Some of the simple data types can also be marked as a metric.
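To make the guideline concrete, here is a minimal sketch of the kind of functions such tasks could wrap. The real `write_parameter`/`read_parameter` live in `examples/common/functions.py`, which is not part of this diff, so the bodies below are illustrative assumptions only (a plain dict stands in for the pandas dataframe to keep the sketch dependency-free):

```python
# Illustrative sketch only: the real functions live in examples/common/functions.py
# (not shown in this diff).


def write_parameter():
    df = {"col": [1, 2, 3]}  # an "object" type; would be declared as pickled("df")
    integer = 1
    floater = 3.14
    stringer = "hello"
    score = 0.9  # a simple type that could be declared as metric("score")
    # Returned values are matched to the names in the task's `returns` declaration.
    return df, integer, floater, stringer, score


def read_parameter(integer: int, floater: float, stringer: str) -> bool:
    # Downstream tasks receive parameters by name; annotations drive type casting.
    return integer == 1 and floater == 3.14 and stringer == "hello"
```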


- [static_parameters_python.py](./03-parameters/static_parameters_python.py), [static_parameters_python.yaml](./03-parameters/static_parameters_python.yaml): A pipeline to show the access of static or known parameters by python tasks.

- [static_parameters_non_python.py](./03-parameters/static_parameters_non_python.py), [static_parameters_non_python.yaml](./03-parameters/static_parameters_non_python.yaml): A pipeline to show the access of static or known parameters by notebook and shell tasks.

- [passing_parameters_python.py](./03-parameters/passing_parameters_python.py), [passing_parameters_python.yaml](./03-parameters/passing_parameters_python.yaml): shows the mechanism of passing parameters (simple python datatypes, "dillable" objects, pydantic models) and registering metrics between python tasks.

- [passing_parameters_notebook.py](./03-parameters/passing_parameters_notebook.py), [passing_parameters_notebook.yaml](./03-parameters/passing_parameters_notebook.yaml): shows the mechanism of passing parameters (simple python datatypes, "dillable" objects, pydantic models) and registering metrics between tasks. runnable can "get" object
parameters from notebooks but cannot inject them into notebooks.