diff --git a/README.md b/README.md
index aaea0480..ffb4934b 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,5 @@
 ---
-name: Contoso Chat Retail with Azure AI Studio and Promptflow
+name: Contoso Chat Retail with Azure AI Studio and Prompty
 description: A retail copilot that answers customer queries with responses grounded in retailer's product and customer data.
 languages:
 - python
@@ -15,7 +15,7 @@ page_type: sample
 urlFragment: contoso-chat
 ---
 
-# Contoso Chat Retail with Azure AI Studio and Promptflow
+# Contoso Chat Retail with Azure AI Studio and Prompty
 
 This sample creates a customer support chat agent for an online retailer called Contoso Outdoors. The solution uses a _retrieval-augmented generation pattern_ to ground responses in the company's product and customer data. Customers can ask questions about the retailer's product catalog, and also get recommendations based on their prior purchases.
 
@@ -32,8 +32,8 @@ The sample uses [Azure AI Search](https://learn.microsoft.com/azure/search/) to
 By exploring and deploying this sample, you will learn to:
 - Build a retail copilot application using the [_RAG pattern_](https://learn.microsoft.com/azure/ai-studio/concepts/retrieval-augmented-generation).
-- Define and engineer prompts using the [_Prompty_ asset](https://microsoft.github.io/promptflow/tutorials/prompty-quickstart.html?highlight=prompty#).
-- Design, run & evaluate a copilot using the [_Promptflow_ framework](https://microsoft.github.io/promptflow/tutorials/flex-flow-quickstart.html).
+- Define and engineer prompts using _Prompty_ assets.
+- Design, run & evaluate a copilot.
 - Provision and deploy the solution to Azure using the [_Azure Developer CLI_](https://learn.microsoft.com/azure/developer/azure-developer-cli/).
 - Understand and apply Responsible AI practices like [_evaluation and content safety_](https://learn.microsoft.com/en-us/azure/ai-services/responsible-use-of-ai-overview?context=%2Fazure%2Fai-studio%2Fcontext%2Fcontext).
@@ -103,7 +103,6 @@ This has been the signature sample used to showcase end-to-end development of a
 * [Visual Studio Code](https://code.visualstudio.com) - recommended IDE for local development.
 * [Azure Developer CLI (azd)](https://aka.ms/install-azd) - to manage Azure deployment.
 * [Python 3.10+](https://www.python.org/downloads/) - to run, test & evaluate application.
-* [Promptflow 1.10+](https://microsoft.github.io/promptflow/) - to build, evaluate, and deploy application flows.
 
 You will also need:
 * [Azure Subscription](https://azure.microsoft.com/free/) - sign up for a free account.
@@ -222,26 +221,7 @@ The [contoso_chat](./contoso_chat) sample contains an example [chat.prompty](./c
 - `question` section to embed user query
 - `Instructions` section to reference related product recommendations
 
-This specific prompty takes 3 inputs: a `customer` object, a `documentation` object (that could be chat history) and a `question` string that represents the user query. You can now _load_, _execute_, and _trace_ individual prompty assets for a more granular prompt engineering solution.
-
- * See the [prompty specification](https://microsoft.github.io/promptflow/how-to-guides/develop-a-prompty/index.html#prompty-specification) for more details on the format.
- * Read the [prompty examples](https://github.com/microsoft/promptflow/tree/main/examples/prompty) for usage guidance from SDK or CLI.
-
-### Testing the Application Flow
-
-This sample uses a [flex-flow](https://microsoft.github.io/promptflow/how-to-guides/develop-a-flex-flow/index.html) feature that lets you "create LLM apps using a Python class or function as the entry point" - making it easier to test and run them using a code-first experience.
- - This sample implements a _Function based flow_
- - The entry point is the _get_response_ function in `chat_request.py`
-
-You can now [test the flow](https://microsoft.github.io/promptflow/how-to-guides/develop-a-flex-flow/function-based-flow.html#flow-test) in different ways:
- - Run it directly, like any Python script
- - Convert it to a flow, then use `pf flow test --flow ...`
- - Start a UI to chat with the flow using `pf flow test --flow ... --ui`
-
-🌟 | Watch this space for more testing guidance.
-
-
-## Guidance
+This specific prompty takes 3 inputs: a `customer` object, a `documentation` object (that could be chat history) and a `question` string that represents the user query. You can now _load_, _execute_, and _trace_ individual prompty assets for a more granular prompt engineering solution.
 
 ### Region Availability
 
@@ -270,14 +250,6 @@ This template uses [Managed Identity](https://learn.microsoft.com/entra/identity
 
 Additionally, we have added a [GitHub Action tool](https://github.com/microsoft/security-devops-action) that scans the infrastructure-as-code files and generates a report containing any detected issues. To ensure best practices we recommend anyone creating solutions based on our templates ensure that the [GitHub secret scanning](https://docs.github.com/code-security/secret-scanning/about-secret-scanning) setting is enabled in your repo.
-
-## Resources
-
-* [Azure AI Studio Documentation](https://learn.microsoft.com/azure/ai-studio/)
-* [Promptflow/Prompty Documentation](https://microsoft.github.io/promptflow/reference/python-library-reference/promptflow-core/promptflow.core.html?highlight=prompty#promptflow.core.Prompty)
-* [Develop Python apps that use Azure AI services](https://learn.microsoft.com/azure/developer/python/azure-ai-for-python-developers)
-* Related Sample: [Process Automation: Speech to Text and Summarization with ACA](https://github.com/Azure-Samples/summarization-openai-python-promptflow/blob/main/README.md)
-
 ## Troubleshooting
 
 Have questions or issues to report? Please [open a new issue](https://github.com/Azure-Samples/contoso-chat/issues) after first verifying that the same question or issue has not already been reported. In the latter case, please add any additional comments you may have, to the existing issue.
diff --git a/deployment/environment.yaml b/deployment/environment.yaml
deleted file mode 100644
index 4d011878..00000000
--- a/deployment/environment.yaml
+++ /dev/null
@@ -1,15 +0,0 @@
-$schema: https://azuremlschemas.azureedge.net/latest/environment.schema.json
-build:
-  path: image_build_with_requirements
-  dockerfile_path: Dockerfile
-# inference config is used to build a serving container for online deployments
-inference_config:
-  liveness_route:
-    path: /health
-    port: 8080
-  readiness_route:
-    path: /health
-    port: 8080
-  scoring_route:
-    path: /score
-    port: 8080
\ No newline at end of file
diff --git a/deployment/image_build_with_requirements/Dockerfile b/deployment/image_build_with_requirements/Dockerfile
deleted file mode 100644
index 942c8c97..00000000
--- a/deployment/image_build_with_requirements/Dockerfile
+++ /dev/null
@@ -1,3 +0,0 @@
-FROM mcr.microsoft.com/azureml/promptflow/promptflow-runtime:latest
-COPY ./requirements.txt .
-RUN pip install -r requirements.txt
\ No newline at end of file
diff --git a/deployment/image_build_with_requirements/requirements.txt b/deployment/image_build_with_requirements/requirements.txt
deleted file mode 100644
index 69a21e8c..00000000
--- a/deployment/image_build_with_requirements/requirements.txt
+++ /dev/null
@@ -1,12 +0,0 @@
-azure-cosmos
-azure-ai-ml
-azure-ai-resources
-azure-search-documents==11.4.0
-promptflow==1.11.0
-promptflow[azure]==1.11.0
-promptflow-tools==1.4.0
-azure-identity==1.16.0
-python-dotenv==1.0.1
-jsonlines
-promptflow.evals
-nbconvert
\ No newline at end of file
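The three-input prompty contract retained in the README above (a `customer` object, a `documentation` object, and a `question` string) can be sketched as a function-style entry point. This is a hypothetical stand-in: the model call is stubbed so only the input/output shape is shown, whereas the real sample renders the `.prompty` template and calls Azure OpenAI.

```python
# Hypothetical sketch of the three-input contract: customer, documentation, question.
# The LLM call is a stub; the real sample executes a .prompty asset against Azure OpenAI.

def call_model(prompt: str) -> str:
    """Stand-in for the chat-model call."""
    return f"[stubbed answer for: {prompt}]"

def get_response(customer: dict, documentation: list, question: str) -> dict:
    """Assemble grounding context and return the answer plus the context used."""
    # The prompty template would interpolate these three inputs into its sections.
    prompt = (
        f"Customer: {customer.get('name', 'unknown')}\n"
        f"Docs retrieved: {len(documentation)}\n"
        f"Question: {question}"
    )
    return {"question": question, "answer": call_model(prompt), "context": documentation}

result = get_response(
    customer={"id": "7", "name": "Jane"},
    documentation=[{"id": "17", "title": "RainGuard Hiking Jacket"}],
    question="What jackets do you carry?",
)
print(result["answer"])
```

Returning a plain dict keeps the entry point easy to call directly from tests or a script, which is the appeal of the code-first, function-based approach the sample takes.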