Adding Infrastructure as code with updated documentation #75

Open · wants to merge 18 commits into base: development
50 changes: 50 additions & 0 deletions .devcontainer/devcontainer.json
@@ -0,0 +1,50 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/devcontainers/templates/tree/main/src/python
{
"name": "Python 3",
// Or use a Dockerfile or Docker Compose file. More info: https://containers.dev/guide/dockerfile
"image": "mcr.microsoft.com/devcontainers/python:0-3.9",

// Features to add to the dev container. More info: https://containers.dev/features.
"features": {
"ghcr.io/devcontainers/features/azure-cli:1": {
"version": "latest"
},
"ghcr.io/rchaganti/vsc-devcontainer-features/azurebicep:1": {
"version": "latest"
}
},

// Configure tool-specific properties.
"customizations": {
// Configure properties specific to VS Code.
"vscode": {
"settings": {},
"extensions": [
"ms-python.python",
"ms-vscode.azure-account",
"prompt-flow.prompt-flow"
]
}
},



// Use 'forwardPorts' to make a list of ports inside the container available locally.
// "forwardPorts": [9000],

// Use 'portsAttributes' to set default properties for specific forwarded ports.
// More info: https://containers.dev/implementors/json_reference/#port-attributes
"portsAttributes": {
"9000": {
"label": "Hello World",
"onAutoForward": "notify"
}
},

// Use 'postCreateCommand' to run commands after the container is created.
"postCreateCommand": "pip3 install -r \"./.devcontainer/requirements.txt\" && az extension add --name \"ml\""

// Uncomment to connect as root instead. More info: https://aka.ms/dev-containers-non-root.
// "remoteUser": "root"
}
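Note that `devcontainer.json` is JSON with comments (JSONC), so strict `json.loads` will reject it. A minimal sketch of how a tool might read such a file, using naive line-comment stripping (this assumes no `//` sequence appears inside a string value, and the helper name is illustrative, not part of this repo):

```python
import json
import re

# A trimmed-down JSONC snippet in the style of the devcontainer.json above.
JSONC = """
{
  // comments are allowed in devcontainer.json, but not in strict JSON
  "name": "Python 3",
  "image": "mcr.microsoft.com/devcontainers/python:0-3.9"
}
"""


def strip_line_comments(text: str) -> str:
    """Remove whole-line // comments so the remainder parses as JSON."""
    return re.sub(r"^\s*//.*$", "", text, flags=re.MULTILINE)


config = json.loads(strip_line_comments(JSONC))
print(config["name"])  # Python 3
```

In practice, editors and the Dev Containers tooling handle JSONC natively; the sketch only illustrates why the comments are legal here.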
7 changes: 7 additions & 0 deletions .devcontainer/requirements.txt
@@ -0,0 +1,7 @@
promptflow
promptflow-tools
promptflow-sdk[builtins]
jinja2
promptflow[azure]
openai
python-dotenv
3 changes: 3 additions & 0 deletions .gitignore
@@ -1,3 +1,6 @@
llmops_config.json
ssh/

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
17 changes: 12 additions & 5 deletions README.md
@@ -73,6 +73,7 @@ Additionally, there is a llmops_config.json file that refers to important infras

# Documentation

- Full documentation on deploying a base architecture with/without network isolation can be found [here](./docs/tutorial/02-Infra%20deployment.md)
- Full documentation on using this repo using Azure DevOps can be found [here](./docs/Azure_devops_how_to_setup.md)
- Full documentation on using this repo using Github Workflows can be found [here](./docs/github_workflows_how_to_setup.md)
- Documentation about adding a new flow is available [here](./docs/how_to_onboard_new_flows.md)
@@ -83,9 +84,12 @@ The repo helps in deploying to **Kubernetes, Kubernetes ARC and AzureML Managed

![Deployment](./docs/images/endpoints.png)


![A/B Deployments](./docs/images/abdeployments.png)

You will also find infrastructure as code to deploy the required resources, with the option to enable network isolation:

![Architecture](./docs/images/architecture.png)

# Pipeline

The pipeline execution consists of multiple stages and jobs in each stage:
@@ -110,25 +114,28 @@ To harness the capabilities of the **local execution**, follow these installatio
git clone https://github.com/microsoft/llmops-promptflow-template.git
```

2. **setup env file**: create a .env file at the top folder level and provide information for the items mentioned. Add as many connection names as needed. All the flow examples in this repo use an AzureOpenAI connection named `aoai`. Add a line `aoai={"api_key": "","api_base": "","api_type": "azure","api_version": "2023-03-15-preview"}` with updated values for api_key and api_base. If additional connections with different names are used in your flows, they should be added accordingly. Currently, only flows with AzureOpenAI as the provider are supported.
1. **Optional: Use the dev container:** The code includes a dev container configuration file that can be used to create a development container with all the dependencies installed. This is the recommended way to run the code. If you are using VS Code, you can open the folder in a container by clicking the "Reopen in Container" button in the bottom right corner of the window. The required packages and the Prompt flow VS Code extension are installed automatically when the container is created. If you are using another IDE, you can use the dev container configuration file to create a [development container](https://code.visualstudio.com/docs/devcontainers/containers). This requires [Docker Desktop](https://www.docker.com/products/docker-desktop/) to be installed on your machine.

1. **setup env file**: create a .env file at the top folder level and provide information for the items mentioned. Add as many connection names as needed. All the flow examples in this repo use an AzureOpenAI connection named `aoai`. Add a line `aoai={"api_key": "","api_base": "","api_type": "azure","api_version": "2023-03-15-preview"}` with updated values for api_key and api_base. If additional connections with different names are used in your flows, they should be added accordingly. Currently, only flows with AzureOpenAI as the provider are supported.

```bash

experiment_name=
connection_name_1={ "api_key": "","api_base": "","api_type": "azure","api_version": "2023-03-15-preview"}
connection_name_2={ "api_key": "","api_base": "","api_type": "azure","api_version": "2023-03-15-preview"}
```
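Each connection value in the .env file is itself a one-line JSON object, so code consuming it must parse the value after splitting the line on `=`. A minimal stdlib-only sketch (the `parse_env_line` helper is illustrative, not part of this repo):

```python
import json

# The README's example connection line, as it would appear in .env.
ENV_LINE = (
    'aoai={"api_key": "", "api_base": "", '
    '"api_type": "azure", "api_version": "2023-03-15-preview"}'
)


def parse_env_line(line: str) -> tuple[str, dict]:
    """Split one .env line into its connection name and parsed JSON value."""
    name, _, value = line.partition("=")
    return name, json.loads(value)


name, conn = parse_env_line(ENV_LINE)
print(name, conn["api_type"])  # aoai azure
```

In the template itself, `python-dotenv` (listed in requirements.txt) loads these lines into environment variables; the JSON parsing step is the same either way.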
3. Prepare the local conda or virtual environment to install the dependencies.
1. Prepare the local conda or virtual environment to install the dependencies.

If you decide not to use the dev container, you can create a virtual environment or conda environment and install the dependencies with the following command:
```bash

python -m pip install promptflow promptflow-tools promptflow-sdk jinja2 promptflow[azure] openai promptflow-sdk[builtins] python-dotenv

```

4. Bring or write your flows into the template based on documentation [here](./docs/how_to_onboard_new_flows.md).
1. Bring or write your flows into the template based on documentation [here](./docs/how_to_onboard_new_flows.md).

5. Write python scripts similar to the provided examples in local_execution folder.
1. Write python scripts similar to the provided examples in local_execution folder.

## Contributing

43 changes: 43 additions & 0 deletions deployment_config.json
@@ -0,0 +1,43 @@
{
"azure_managed_endpoint": {
"ENV_NAME": "dev",
"TEST_FILE_PATH": "sample-request.json",
"PUBLIC_ACCESS": "true",
"ENDPOINT_NAME": "",
"ENDPOINT_DESC": "An online endpoint serving a flow for [task]",
"DEPLOYMENT_DESC": "prompt flow deployment",
"PRIOR_DEPLOYMENT_NAME": "",
"PRIOR_DEPLOYMENT_TRAFFIC_ALLOCATION": "",
"CURRENT_DEPLOYMENT_NAME": "",
"CURRENT_DEPLOYMENT_TRAFFIC_ALLOCATION": "100",
"DEPLOYMENT_VM_SIZE": "Standard_F4s_v2",
"DEPLOYMENT_BASE_IMAGE_NAME": "mcr.microsoft.com/azureml/promptflow/promptflow-runtime:latest",
"DEPLOYMENT_CONDA_PATH": "environment/conda.yml",
"DEPLOYMENT_INSTANCE_COUNT": 1,
"ENVIRONMENT_VARIABLES": {
"example-name": "example-value"
}
},
"kubernetes_endpoint": {
"ENV_NAME": "dev",
"TEST_FILE_PATH": "sample-request.json",
"PUBLIC_ACCESS": "true",
"ENDPOINT_NAME": "",
"ENDPOINT_DESC": "An kubernetes endpoint serving a flow for [task]",
"DEPLOYMENT_DESC": "prompt flow deployment",
"PRIOR_DEPLOYMENT_NAME": "",
"PRIOR_DEPLOYMENT_TRAFFIC_ALLOCATION": "",
"CURRENT_DEPLOYMENT_NAME": "",
"CURRENT_DEPLOYMENT_TRAFFIC_ALLOCATION": 100,
"COMPUTE_NAME": "",
"DEPLOYMENT_VM_SIZE": "promptinstancetype",
"DEPLOYMENT_BASE_IMAGE_NAME": "mcr.microsoft.com/azureml/promptflow/promptflow-runtime:latest",
"DEPLOYMENT_CONDA_PATH": "environment/conda.yml",
"DEPLOYMENT_INSTANCE_COUNT": 1,
"CPU_ALLOCATION": "",
"MEMORY_ALLOCATION": "",
"ENVIRONMENT_VARIABLES": {
"example-name": "example-value"
}
}
}
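One detail worth noticing in this file: the traffic allocations appear both as strings (`"100"` under `azure_managed_endpoint`) and as numbers (`100` under `kubernetes_endpoint`), and prior/current allocations are expected to sum to 100 when both deployments exist. A hypothetical validation sketch (the helper name and the sanity rule are assumptions, not part of the repo's pipelines):

```python
import json

# A trimmed entry in the shape of the config above, with mixed value types.
SAMPLE = json.loads("""
{
  "PRIOR_DEPLOYMENT_NAME": "",
  "PRIOR_DEPLOYMENT_TRAFFIC_ALLOCATION": "",
  "CURRENT_DEPLOYMENT_NAME": "blue",
  "CURRENT_DEPLOYMENT_TRAFFIC_ALLOCATION": "100"
}
""")


def total_traffic(cfg: dict) -> int:
    """Sum the traffic allocations that are actually set.

    Allocations may be stored as strings ("100") or numbers (100),
    so both forms are normalized before summing.
    """
    total = 0
    for key in ("PRIOR_DEPLOYMENT_TRAFFIC_ALLOCATION",
                "CURRENT_DEPLOYMENT_TRAFFIC_ALLOCATION"):
        value = str(cfg.get(key, "")).strip()
        if value:
            total += int(value)
    return total


print(total_traffic(SAMPLE))  # 100
```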
47 changes: 47 additions & 0 deletions deployment_config.json.sample
@@ -0,0 +1,47 @@
{
"azure_managed_endpoint":[
{
"ENV_NAME": "dev",
"TEST_FILE_PATH": "sample-request.json",
"PUBLIC_ACCESS": "true",
"ENDPOINT_NAME": "",
"ENDPOINT_DESC": "An online endpoint serving a flow for [task]",
"DEPLOYMENT_DESC": "prompt flow deployment",
"PRIOR_DEPLOYMENT_NAME": "",
"PRIOR_DEPLOYMENT_TRAFFIC_ALLOCATION": "",
"CURRENT_DEPLOYMENT_NAME": "",
"CURRENT_DEPLOYMENT_TRAFFIC_ALLOCATION": "100",
"DEPLOYMENT_VM_SIZE": "Standard_F4s_v2",
"DEPLOYMENT_BASE_IMAGE_NAME": "mcr.microsoft.com/azureml/promptflow/promptflow-runtime:latest",
"DEPLOYMENT_CONDA_PATH": "environment/conda.yml",
"DEPLOYMENT_INSTANCE_COUNT": 1,
"ENVIRONMENT_VARIABLES": {
"example-name": "example-value"
}
}
],
"kubernetes_endpoint":[
{
"ENV_NAME": "dev",
"TEST_FILE_PATH": "sample-request.json",
"PUBLIC_ACCESS": "true",
"ENDPOINT_NAME": "",
"ENDPOINT_DESC": "An kubernetes endpoint serving a flow for [task]",
"DEPLOYMENT_DESC": "prompt flow deployment",
"PRIOR_DEPLOYMENT_NAME": "",
"PRIOR_DEPLOYMENT_TRAFFIC_ALLOCATION": "",
"CURRENT_DEPLOYMENT_NAME": "",
"CURRENT_DEPLOYMENT_TRAFFIC_ALLOCATION": 100,
"COMPUTE_NAME": "",
"DEPLOYMENT_VM_SIZE": "promptinstancetype",
"DEPLOYMENT_BASE_IMAGE_NAME": "mcr.microsoft.com/azureml/promptflow/promptflow-runtime:latest",
"DEPLOYMENT_CONDA_PATH": "environment/conda.yml",
"DEPLOYMENT_INSTANCE_COUNT": 1,
"CPU_ALLOCATION": "",
"MEMORY_ALLOCATION": "",
"ENVIRONMENT_VARIABLES": {
"example-name": "example-value"
}
}
]
}
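Unlike `deployment_config.json`, the `.sample` variant wraps each endpoint type in a list, presumably allowing one entry per environment. Code reading the config therefore has to tolerate both shapes. A small illustrative sketch (the helper and its name are assumptions, not part of the repo):

```python
import json

# A trimmed list-style config in the shape of the .sample file.
SAMPLE = json.loads("""
{
  "azure_managed_endpoint": [
    {"ENV_NAME": "dev", "ENDPOINT_NAME": "", "DEPLOYMENT_INSTANCE_COUNT": 1}
  ]
}
""")


def endpoint_configs(config: dict, endpoint_type: str) -> list[dict]:
    """Return the per-environment entries for one endpoint type.

    Normalizes both shapes: a list of entries (.sample style) and a
    single object (deployment_config.json style).
    """
    entries = config.get(endpoint_type, [])
    return entries if isinstance(entries, list) else [entries]


print(len(endpoint_configs(SAMPLE, "azure_managed_endpoint")))  # 1
```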
6 changes: 6 additions & 0 deletions docs/Azure_devops_how_to_setup.md
@@ -197,6 +197,8 @@ curl --request POST \
}"
```

**Note:** If you have provisioned a managed VNET for your Azure ML workspace, this operation is not currently supported; use a serverless runtime instead.

15. Get the runtime creation status using the REST API. Repeat this step until the output either shows createdOn with a valid date and time value or reports a failure. In case of failure, troubleshoot the issue before moving forward.
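The repeat-until-created polling described here can also be scripted. A hypothetical sketch of such a retry loop (the `get_status` callable stands in for the REST request, and the response field names are assumptions for illustration):

```python
import time


def wait_for_runtime(get_status, timeout_s=600, interval_s=20):
    """Poll a status callable until the runtime reports createdOn or fails.

    `get_status` should return the parsed JSON response of the status call;
    this helper and the field names it checks are illustrative only.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status.get("properties", {}).get("createdOn"):
            return status
        if status.get("provisioningState") == "Failed":
            raise RuntimeError(f"runtime creation failed: {status}")
        time.sleep(interval_s)
    raise TimeoutError("runtime was not created in time")


# Example with a stub that succeeds on the second poll:
responses = iter([{}, {"properties": {"createdOn": "2024-01-01T00:00:00Z"}}])
result = wait_for_runtime(lambda: next(responses), timeout_s=60, interval_s=0)
print(result["properties"]["createdOn"])
```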

```bash
# (request collapsed in this diff view)
```

@@ -330,6 +332,8 @@ Update configuration so that we can create a pull request for any one of the exa

### Update llmops_config.json

**Note:** If you decide to use [the infrastructure deployed with the deployment script of this code base](../docs/tutorial/02-Infra%20deployment.md), this file is created and populated automatically.

Modify the configuration values in `llmops_config.json` file available for each example based on description. Update the `KEYVAULT_NAME`, `RESOURCE_GROUP_NAME` and Azure Machine Learning `WORKSPACE_NAME`.

- `ENV_NAME`: This represents the environment type. (The template supports *pr* and *dev* environments.)
@@ -344,6 +348,8 @@ The template uses 'pr' and 'dev' to refer to environment types. The template can

### Update config/deployment_config.json

**Note:** If you decide to use [the infrastructure deployed with the deployment script of this code base](../docs/tutorial/02-Infra%20deployment.md), this file is created and populated automatically. You can modify some of the default values if required.

Modify the configuration values in `deployment_config.json` file for each environment. These are required for deploying Prompt flows in Azure ML. Ensure the values for `ENDPOINT_NAME` and `CURRENT_DEPLOYMENT_NAME` are changed before pushing the changes to remote repository.

- `ENV_NAME`: This indicates the environment name, referring to the "development" or "production" or any other environment where the prompt will be deployed and used in real-world scenarios.
6 changes: 6 additions & 0 deletions docs/github_workflows_how_to_setup.md
@@ -181,6 +181,8 @@ curl --request POST \
}"
```

**Note:** If you have provisioned a managed VNET for your Azure ML workspace, this operation is not currently supported; use a serverless runtime instead.

15. Get the runtime creation status using the REST API. Repeat this step until the output either shows createdOn with a valid date and time value or reports a failure. In case of failure, troubleshoot the issue before moving forward.

```bash
# (request collapsed in this diff view)
```

@@ -278,6 +280,8 @@ Update code so that we can create a pull request. Update the `llmops_config.json

### Update llmops_config.json

**Note:** If you decide to use [the infrastructure deployed with the deployment script of this code base](../docs/tutorial/02-Infra%20deployment.md), this file is created and populated automatically.

Modify the configuration values in the `llmops_config.json` file available for each example based on description.

- `ENV_NAME`: This represents the environment type. (The template supports *pr* and *dev* environments.)
@@ -292,6 +296,8 @@ For the optional post production evaluation workflow, the above configuration wi

### Update deployment_config.json in config folder

**Note:** If you decide to use [the infrastructure deployed with the deployment script of this code base](../docs/tutorial/02-Infra%20deployment.md), this file is created and populated automatically. You can modify some of the default values if required.

Modify the configuration values in the `deployment_config.json` file for each environment. These are required for deploying Prompt flows in Azure ML. Ensure the values for `ENDPOINT_NAME` and `CURRENT_DEPLOYMENT_NAME` are changed before pushing the changes to remote repository.

- `ENV_NAME`: This indicates the environment name, referring to the "development" or "production" or any other environment where the prompt will be deployed and used in real-world scenarios.
Binary file added docs/images/architecture.png
Binary file added docs/images/firefox-proxy.png
Binary file added docs/images/inbound-rule.png