Merge branch 'main' into extension-docs
pavithraes authored Sep 15, 2023
2 parents 228b4c1 + 71e515f commit 425e472
Showing 11 changed files with 768 additions and 331 deletions.
25 changes: 17 additions & 8 deletions docs/docs/how-tos/domain-registry.md
@@ -61,11 +61,16 @@ Finally, set the token value as an environment variable:
export CLOUDFLARE_TOKEN="cloudflaretokenvalue"
```

Also, add the flag `--dns-provider=cloudflare` to the [Nebari `deploy` command][nebari-deploy].
Also, add a `dns` section to the `nebari-config.yaml` file.

```yaml
dns:
  provider: cloudflare
```
## Using other DNS providers
Currently, Nebari only supports CloudFlare for [automatic DNS registration](link to automatic section below). If an alternate DNS provider is desired, change the `--dns-provider` flag from `cloudflare` to `none` on the Nebari `deploy` command.
Currently, Nebari only supports Cloudflare for [automatic DNS registration](#automatic-dns-provision). If an alternate DNS provider is desired, change the `dns.provider` field from `cloudflare` to `none` in the `nebari-config.yaml` file.
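
For example, a `nebari-config.yaml` that opts out of automatic registration might contain the following (a minimal sketch; the rest of the config file is omitted):

```yaml
dns:
  provider: none   # manage DNS records manually with your own provider
```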

Below are the links to detailed documentation on how to create and manage DNS records on a few providers:

@@ -81,18 +86,22 @@ The amount of time this takes varies for each DNS provider. Validate such inform

## Automatic DNS provision

Nebari has an extra flag for deployments that grants management and the creation of the DNS records for you automatically. For automatic DNS provision add `--dns-auto-provision` to your Nebari `deploy` command:
Nebari can also create and manage the DNS records for you automatically. For automatic DNS provisioning, add `dns.auto-provision` to your Nebari config file:

```bash
nebari deploy -c nebari-config \
--dns-provider cloudflare \
--dns-auto-provision
```yaml
dns:
  provider: cloudflare
  auto-provision: true
```

This sets the DNS provider to Cloudflare and automatically handles creating or updating the DNS records for your Nebari domain on Cloudflare.

:::warning
The usage of `--dns-auto-provision` is restricted to Cloudflare as it is the only fully integrated DNS provider that Nebari currently supports.
The usage of `dns.auto-provision` is restricted to Cloudflare as it is the only fully integrated DNS provider that Nebari currently supports.
:::

:::warning
Earlier versions of Nebari supported DNS settings through the `--dns-provider` and `--dns-auto-provision` flags on the `deploy` command. This feature has been removed in favor of using the `nebari-config.yaml` file.
:::

When you are done setting up the domain name, you can refer back to the [Nebari deployment documentation][nebari-deploy] and continue the remaining steps.
126 changes: 103 additions & 23 deletions docs/docs/how-tos/using-argo.md
@@ -1,48 +1,123 @@
---
id: using-argo
title: Automate workflows with Argo
title: Automate your first workflow with Argo
description: Argo workflow management
---

# Automate workflows with Argo

Argo Workflows is an open source container-native workflow engine for orchestrating parallel jobs on Kubernetes. Argo
workflows comes enabled by default with Nebari deployments.
[Argo Workflows](https://argoproj.github.io/workflows) is an open source container-native
workflow engine for orchestrating parallel jobs on Kubernetes. In other words,
Argo helps you run a sequence of tasks or functions without you having to be
present (it will manage the server resources for you). Argo Workflows
comes enabled by default with Nebari deployments.

## Access Argo Server
Access control for Argo on Nebari is done through Keycloak user groups. All
users in the `admin` or `developer` groups have access to Argo.

If Argo Workflows is enabled, users can access argo workflows server at: `your-nebari-domain.com/argo`. Log in via
Keycloak with your usual credentials.
:::note
Also see the [Set up Argo Workflows documentation](/docs/how-tos/setup-argo).
:::


## Access the Argo Server

If Argo Workflows is enabled, users can access the Argo Workflows UI at:
`your-nebari-domain.com/argo`. Log in via Keycloak with your usual credentials.

You can also download the
[Argo CLI](https://github.com/argoproj/argo-workflows/releases) if you prefer
a command line experience.
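
For reference, installing the CLI on a Linux amd64 machine typically looks something like the following (a sketch only; the version number is a placeholder, so check the releases page for the latest one and pick the asset that matches your platform):

```bash
# Download a release of the Argo CLI (replace the version with the latest one)
curl -sLO https://github.com/argoproj/argo-workflows/releases/download/v3.4.11/argo-linux-amd64.gz

# Unpack, make executable, and move it onto your PATH
gunzip argo-linux-amd64.gz
chmod +x argo-linux-amd64
sudo mv ./argo-linux-amd64 /usr/local/bin/argo

# Verify the installation
argo version
```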

## Introduction to the Argo UI

Navigate to the Argo UI at `your-nebari-domain.com/argo`.

![Argo Server Landing Page](/img/how-tos/argo_landing_page.png)

From this page, you can see all the Argo servers currently running for each
workflow.

For Kubernetes deployments, it is important to note that these are
active pods. The two workflows shown in the UI above indicate that the workflows
are complete (the green check), but that the server is still running.

:::warning
We highly recommend setting the default timeout; otherwise, the Argo pods will not
be culled on their own!
:::

You can click on each individual workflow to see the DAG and details for each
step in the workflow.

![Argo workflow detail](/img/how-tos/argo_workflow_details.png)

## Submit a workflow via Argo Server
## Submit a workflow

You can submit a workflow by clicking "SUBMIT NEW WORKFLOW" on the landing page assuming you have the appropriate
permissions.
You can submit a workflow through the UI by clicking "+ SUBMIT NEW WORKFLOW" on
the landing page. Argo offers a template for the workflow YAML format.

![Argo Server Landing Page](/img/tutorials/argo_server_landing_page.png)
![Argo UI submit new workflow](/img/how-tos/argo_submit_new_workflow.png)

Click `+ CREATE` when you're ready to submit. The YAML format is not the only
option for generating workflows. Argo also allows you to create workflows via
Python. More information on how to generate these specifications will follow.
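
If you want something concrete to paste into the editor, a minimal workflow along the lines of the upstream hello-world example could look like this (a sketch only; the image and command are placeholders you can swap for your own):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-    # Argo appends a random suffix to the name
spec:
  entrypoint: main
  templates:
    - name: main
      container:
        image: busybox          # placeholder image; any image reachable by the cluster works
        command: [echo, "hello from Argo on Nebari"]
```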

## Submit a workflow via Argo CLI

You can submit or manage workflows via the Argo CLI. The Argo CLI can be downloaded from the
[Argo Releases](https://github.com/argoproj/argo-workflows/releases) page. After downloading the CLI, you can get your
token from the Argo Server UI by clicking on the user tab in the bottom left corner and then clicking "Copy To
Clipboard". You'll need to make a few edits to access to what was copied for Argo CLI to work correctly. The base href
should be `ARGO_BASE_HREF=/argo` in the default nebari installation and you need to set the namespace where Argo was
deployed (dev by default) `ARGO_NAMESPACE=dev`. After setting those variables and the others copied from the Argo Server
UI, you can check that things are working by running `argo list`.
You can also submit or manage workflows via the Argo CLI. The Argo CLI can be
downloaded from the
[Argo Releases](https://github.com/argoproj/argo-workflows/releases) page.

You can submit a workflow through the CLI using `argo submit my-workflow.yaml`.

The `argo list` command will list all the running workflows.

If you've just submitted a workflow and you want to check on it, you can run
`argo get @latest` to get the latest submitted workflow.

You can also access the logs for a workflow using `argo logs <workflow-name>`,
or `argo logs @latest` for the most recently submitted one.
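
Putting those commands together, a typical CLI session might look like the following (a sketch; `my-workflow.yaml` is a placeholder for your own workflow file):

```bash
# Submit a workflow defined in a local YAML file
argo submit my-workflow.yaml

# List the workflows in the current namespace
argo list

# Inspect the most recently submitted workflow
argo get @latest

# Stream the logs of the most recent workflow
argo logs @latest
```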

For more information on Argo workflows via the UI or the CLI, you can visit the
[Argo docs](https://argoproj.github.io/argo-workflows/workflow-concepts/).

[Hera](https://hera-workflows.readthedocs.io/) is a framework for building and
submitting Argo workflows in Python. Learn more in the [Argo Workflows walkthrough tutorial](/docs/tutorials/argo-workflows-walkthrough).

## Access your Nebari environments and file system while on an Argo pod (BETA)

![Argo Workflows User Tab](/img/tutorials/argo_workflows_user_tab.png)
Once you move beyond the "Hello World" Argo examples, you may realize that the
conda environments and the persistent storage you have on Nebari would be
really useful in your temporary Argo pods. Luckily, we've solved that problem
for you!

## Jupyterflow-Override (Beta)
Nebari comes with [Nebari Workflow Controller (BETA)](https://github.com/nebari-dev/nebari-workflow-controller), abbreviated as NWC,
which transfers the user's environment variables, home and shared directories,
Docker image, and available conda environments to the server where the Workflow
is running. Users can then run a script that loads and saves from their home
directory with a particular conda environment.

All of these things are enabled when users add the `jupyterflow-override` label
to their workflow as in this example using Hera:

```python
from hera.workflows import Workflow

Workflow(
    # ... other Workflow arguments ...
    labels={"jupyterflow-override": "true"},
)
```

Behind the scenes, NWC will override a portion of the workflow spec, mount
directories, etc. The label can be added to the Workflow in a Kubernetes
manifest, via Hera, the Argo CLI, or via the Argo Server Web UI.

:::note
This feature requires that you have a Jupyter user pod running when the "jupyterflow-override" workflow is submitted. The workflow will not be created if you don't have a Jupyter user pod running.
:::

New users of Argo Workflows are often frustrated because the Argo Workflow pods do not have access to the same conda environments and shared files as the Jupyterlab user pod by default. To help with this use case, Nebari comes with [Nebari Workflow Controller](https://github.com/nebari-dev/nebari-workflow-controller) which overrides a portion of the Workflow spec when the
`jupyterflow-override` label is applied to a workflow. The Jupyterlab user pod's environment variables, home and shared directories, docker image, and more will be added to the Workflow. Users can then e.g. run a script that loads and saves from their home directory with a particular conda environment. This works whether the label is added to the Workflow in a kubernetes manifest, via Hera, the argo CLI, or via the Argo Server Web UI. However, this does require that a Jupyter user pod be running when the workflow is submitted. The Workflow pod will have the same resources (cpu, memory) that the user pod has.

### Example
### YAML Example

```yaml
apiVersion: argoproj.io/v1alpha1
@@ -73,3 +148,8 @@ The jupyterflow-override feature is in beta so please [leave some feedback](http
## Additional Argo Workflows Resources

Refer to the [Argo documentation](https://argoproj.github.io/argo-workflows/) for further details on Argo Workflows.

## Next Steps

Now that you have had an introduction, check out the [more detailed tutorial](/tutorials/argo-workflows-walkthrough.md) on
Argo for some practical examples!