Merge pull request #235 from MicrosoftDocs/main
9/11/2024 PM Publish
Taojunshen authored Sep 11, 2024
2 parents 58bae4e + fa205db commit 022de22
Showing 2 changed files with 46 additions and 45 deletions.
7 changes: 4 additions & 3 deletions articles/ai-studio/tutorials/copilot-sdk-build-rag.md
@@ -9,6 +9,7 @@ ms.date: 08/29/2024
ms.reviewer: lebaro
ms.author: sgilley
author: sdgilley
+ms.custom: [copilot-learning-hub]
#customer intent: As a developer, I want to learn how to use the prompt flow SDK so that I can build a RAG-based chat app.
---

@@ -29,7 +30,7 @@ This tutorial is part two of a three-part tutorial.

* Complete [Tutorial: Part 1 - Create resources for building a custom chat application with the prompt flow SDK](copilot-sdk-create-resources.md).

-* You need a local copy of product data. The [Azure-Samples/rag-data-openai-python-promptflow repository on GitHub](https://github.com/Azure-Samples/rag-data-openai-python-promptflow/) contains sample retail product information that's relevant for this tutorial scenario. [Download the example Contoso Trek retail product data in a ZIP file](https://github.com/Azure-Samples/rag-data-openai-python-promptflow/tree/main/tutorial/data) to your local machine.
+* You need a local copy of product data. The [Azure-Samples/rag-data-openai-python-promptflow repository on GitHub](https://github.com/Azure-Samples/rag-data-openai-python-promptflow/) contains sample retail product information that's relevant for this tutorial scenario. [Download the example Contoso Trek retail product data in a ZIP file](https://github.com/Azure-Samples/rag-data-openai-python-promptflow/blob/main/tutorial/data/product-info.zip) to your local machine.

## Application code structure

@@ -114,7 +115,7 @@ These steps deploy a model to a real-time endpoint from the AI Studio [model cat
When you deploy the `gpt-3.5-turbo` model, find the following values in the **View Code** section, and add them to your **.env** file:
```env
-AZURE_OPENAI_ENDPOINT=<chat_model_endpoint_value>
+AZURE_OPENAI_ENDPOINT=<endpoint_value>
AZURE_OPENAI_CHAT_DEPLOYMENT=<chat_model_deployment_name>
AZURE_OPENAI_API_VERSION=<api_version>
```
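For context, here is a minimal sketch of how a script might read these **.env** values and call the chat deployment. It assumes the `python-dotenv`, `azure-identity`, and `openai` packages and keyless Microsoft Entra ID authentication via `DefaultAzureCredential`; the sample question is illustrative and not taken from the tutorial code.

```python
import os

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from dotenv import load_dotenv
from openai import AzureOpenAI

# Load the AZURE_OPENAI_* values from the .env file described above.
load_dotenv()

# Keyless auth: exchange the local Azure credential (for example, `az login`)
# for an Entra ID token scoped to Azure OpenAI.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version=os.environ["AZURE_OPENAI_API_VERSION"],
    azure_ad_token_provider=token_provider,
)

# Illustrative smoke test against the chat deployment named in the .env file.
response = client.chat.completions.create(
    model=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
    messages=[{"role": "user", "content": "Which tent is the most waterproof?"}],
)
print(response.choices[0].message.content)
```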
@@ -155,7 +156,7 @@ The goal with this RAG-based application is to ground the model responses in you

If you don't have an Azure AI Search index already created, we walk through how to create one. If you already have an index to use, you can skip to the [set the search environment variable](#set-search-index) section. The search index is created on the Azure AI Search service that was either created or referenced in the previous step.

-1. Use your own data or [download the example Contoso Trek retail product data in a ZIP file](https://github.com/Azure-Samples/rag-data-openai-python-promptflow/tree/main/tutorial/data) to your local machine. Unzip the file into your **rag-tutorial** folder. This data is a collection of markdown files that represent product information. The data is structured in a way that is easy to ingest into a search index. You build a search index from this data.
+1. Use your own data or [download the example Contoso Trek retail product data in a ZIP file](https://github.com/Azure-Samples/rag-data-openai-python-promptflow/blob/main/tutorial/data/product-info.zip) to your local machine. Unzip the file into your **rag-tutorial/data** folder. This data is a collection of markdown files that represent product information. The data is structured in a way that is easy to ingest into a search index. You build a search index from this data.

1. The prompt flow RAG package allows you to ingest the markdown files, locally create a search index, and register it in the cloud project. Install the prompt flow RAG package:

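For context, here is a minimal sketch of the download-and-unzip step described in the updated instruction above. It assumes the raw-download form of the linked GitHub URL and the **rag-tutorial/data** destination folder; the extracted markdown files are what the prompt flow RAG package later ingests into the search index.

```python
import io
import zipfile
from pathlib import Path
from urllib.request import urlopen

# Raw-download form of the product-info.zip blob URL linked in the tutorial (assumed).
ZIP_URL = (
    "https://github.com/Azure-Samples/rag-data-openai-python-promptflow"
    "/raw/main/tutorial/data/product-info.zip"
)

# Destination matches the tutorial's updated instruction: rag-tutorial/data.
dest = Path("rag-tutorial") / "data"
dest.mkdir(parents=True, exist_ok=True)

# Download the archive into memory and extract the markdown product files.
with urlopen(ZIP_URL) as response:
    archive = zipfile.ZipFile(io.BytesIO(response.read()))
    archive.extractall(dest)

print(f"Extracted {len(archive.namelist())} files to {dest.resolve()}")
```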