4 changes: 2 additions & 2 deletions 066-OpenAIFundamentals/Coach/README.md
@@ -62,12 +62,12 @@ Always refer students to the [What The Hack website](https://aka.ms/wth) for the
This hack requires students to have access to an Azure subscription where they can create and consume Azure resources. These Azure requirements should be shared with a stakeholder in the organization that will be providing the Azure subscription(s) that will be used by the students.

- [Azure subscription](https://azure.microsoft.com/en-us/free/)
<!-- Estimated spend may be around $10 based on running Cognitive Search for four days (total length of time depends on implementation time) -->
<!-- Estimated spend may be around $10 based on running Azure AI Search for four days (total length of time depends on implementation time) -->
- [Access to Azure OpenAI](https://customervoice.microsoft.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR7en2Ais5pxKtso_Pz4b1_xUOFA5Qk1UWDRBMjg0WFhPMkIzTzhKQ1dWNyQlQCN0PWcu)
- Jupyter Notebook editor (we recommend [Visual Studio Code](https://code.visualstudio.com/Download) or [Azure Machine Learning Studio](https://ml.azure.com/))
- If using Visual Studio Code, we also recommend installing [Anaconda](https://docs.anaconda.com/anaconda/install) OR [Miniconda](https://docs.anaconda.com/anaconda/install) for project environment management
- [Python](https://www.python.org/downloads/) (version 3.7.1 or later), plus the package installer [pip](https://pypi.org/project/pip/)
- [Azure Cognitive Search](https://learn.microsoft.com/azure/search) (Basic Tier) - This will be created during the Hack and is not necessary to get started.
- [Azure AI Search](https://learn.microsoft.com/azure/search) (Basic Tier) - This will be created during the Hack and is not necessary to get started.
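A quick way to confirm the Python prerequisite above is met (a minimal sketch; run it with the interpreter you plan to use for the notebooks):

```python
import sys

# The hack requires Python 3.7.1 or later
assert sys.version_info >= (3, 7, 1), f"Python 3.7.1+ required, found {sys.version}"
print("Python version OK:", sys.version.split()[0])
```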

## Suggested Hack Agenda

@@ -23,7 +23,7 @@
"source": [
"## Introduction\n",
"\n",
"In this notebook, we will explore the practical application of RAG with a more manageable type of data i.e structured data such as relational data or text data stored in csv files. The main objective is to introduce a specific use case that demonstrates the utilization of Azure Cognitive Search to extract relevant documents and the power of ChatGPT to address relevant portions of the document, providing concise summaries based on user prompts. It aims to showcase how Azure OpenAI's ChatGPT capabilities can be adapted to suit your summarization needs, while also guiding you through the setup and evaluation of summarization results. This method can be customized to suit various summarization use cases and applied to diverse datasets.\n",
"In this notebook, we will explore the practical application of RAG with a more manageable type of data, i.e., structured data such as relational data or text data stored in CSV files. The main objective is to introduce a specific use case that demonstrates the utilization of Azure AI Search to extract relevant documents and the power of ChatGPT to address relevant portions of the document, providing concise summaries based on user prompts. It aims to showcase how Azure OpenAI's ChatGPT capabilities can be adapted to suit your summarization needs, while also guiding you through the setup and evaluation of summarization results. This method can be customized to suit various summarization use cases and applied to diverse datasets.\n",
"\n",
"This notebook leverages **Semantic Kernel** as an orchestration framework to coordinate multiple AI services and manage the RAG workflow. Semantic Kernel provides:\n",
"\n",
@@ -32,7 +32,7 @@
"- **Configuration Management**: Unified handling of model parameters and execution settings\n",
"- **Async Operations**: Efficient handling of concurrent AI service calls\n",
"\n",
"The kernel acts as the central hub that orchestrates the interaction between Azure Cognitive Search for document retrieval and Azure OpenAI for embeddings and completions.\n",
"The kernel acts as the central hub that orchestrates the interaction between Azure AI Search for document retrieval and Azure OpenAI for embeddings and completions.\n",
"\n",
"## Student Tasks\n",
"Your goals for this challenge are to read through this notebook and complete the code where there is a TODO comment. Use GitHub Copilot to write the code! Ensure you run each code block, observe the results, and then be able to answer the questions posed in the student guide."
@@ -96,7 +96,7 @@
},
{
"cell_type": "code",
"execution_count": 111,
"execution_count": null,
"id": "69bd738e",
"metadata": {},
"outputs": [
@@ -112,7 +112,7 @@
}
],
"source": [
"# Import Azure Cognitive Search, Semantic Kernel, and other python modules\n",
"# Import Azure AI Search, Semantic Kernel, and other Python modules\n",
"\n",
"import os, json, requests, sys, re\n",
"import asyncio\n",
@@ -121,7 +121,7 @@
"import numpy as np\n",
"from sklearn.metrics.pairwise import cosine_similarity\n",
"\n",
"# Azure Cognitive Search imports\n",
"# Azure AI Search imports\n",
"from azure.core.credentials import AzureKeyCredential\n",
"from azure.search.documents.indexes import SearchIndexClient \n",
"from azure.search.documents import SearchClient\n",
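The `cosine_similarity` import above is later used to compare embedding vectors. A minimal, self-contained sketch of what that comparison looks like (toy vectors here, not real embeddings; real ones come from Azure OpenAI):

```python
import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

# Toy "embedding" vectors, shaped (1, n) as cosine_similarity expects 2-D input
a = np.array([[1.0, 0.0, 1.0]])
c = np.array([[0.0, 1.0, 0.0]])

print(cosine_similarity(a, a))  # identical vectors -> 1.0
print(cosine_similarity(a, c))  # orthogonal vectors -> 0.0
```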
@@ -299,7 +299,7 @@
},
{
"cell_type": "code",
"execution_count": 114,
"execution_count": null,
"id": "11eff67e",
"metadata": {},
"outputs": [
@@ -315,7 +315,7 @@
}
],
"source": [
"# Create a Cognitive Search Index client\n",
"# Create an Azure AI Search index client\n",
"service_endpoint = os.getenv(\"AZURE_AI_SEARCH_ENDPOINT\") \n",
"key = os.getenv(\"AZURE_AI_SEARCH_KEY\")\n",
"credential = AzureKeyCredential(key)\n",
26 changes: 19 additions & 7 deletions 066-OpenAIFundamentals/Student/Challenge-00.md
@@ -68,12 +68,28 @@ You are ready to run the Jupyter Notebook files, hooray! Skip to section: [Setup

**NOTE:** You can skip this section if you are using GitHub Codespaces!

If you want to setup a Jupyter Notebooks environment on your local workstation, expand the section below and follow the requirements listed.
If you want to set up this environment on your local workstation, expand the section below and follow the requirements listed. While you could install all of the prerequisite software directly on your workstation, that is not recommended; you proceed at your own risk! If you do not want to use GitHub Codespaces, it is preferable to use dev containers instead.

<details markdown=1>
<summary markdown="span"><strong>Click to expand/collapse Local Workstation Requirements</strong></summary>

To work on your local workstation, please ensure you have the following tools and resources before hacking:
#### Set Up Local Dev Container

Next, you will set up your local workstation so that it can use dev containers. A dev container is a Docker-based environment designed to provide a consistent and reproducible development setup. The VS Code Dev Containers extension lets you easily open projects inside a containerized environment.

**NOTE:** On Windows, Dev Containers run in the Windows Subsystem for Linux (WSL).

On Windows and macOS (**NOTE:** only tested on Apple Silicon):
- (Windows only) Install the Windows Subsystem for Linux along with a Linux distribution such as Ubuntu. You will need to copy the `Resources.zip` to your Linux home directory and unzip it there.
- Download and install Docker Desktop
- Open the root folder of the Student resource package in Visual Studio Code
- You should get prompted to re-open the folder in a Dev Container. You can do that by clicking the Yes button, but if you miss it or hit no, you can also use the Command Palette in VS Code and select `Dev Containers: Reopen in Container`
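For reference, a minimal `.devcontainer/devcontainer.json` looks like the sketch below (the image, extension IDs, and post-create command are illustrative assumptions; the Student resource package ships its own configuration):

```json
{
  "name": "openai-fundamentals",
  "image": "mcr.microsoft.com/devcontainers/python:3.11",
  "customizations": {
    "vscode": {
      "extensions": ["ms-python.python", "ms-toolsai.jupyter"]
    }
  },
  "postCreateCommand": "pip install -r requirements.txt"
}
```

When VS Code reopens the folder in the container, it builds from this file, installs the listed extensions inside the container, and runs the post-create command once.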

##### Setup GitHub Copilot

For parts of this hack we will rely heavily on GitHub Copilot for coding. Please set up [VS Code with GitHub Copilot](https://code.visualstudio.com/docs/copilot/setup-simplified?wt.md_id=AZ-MVP-5004796).

To work on your local workstation directly (not with dev containers), please ensure you have the following tools and resources before hacking:

- [Student Resources](#student-resources)
- [Visual Studio Code](#visual-studio-code)
@@ -247,10 +263,8 @@ If using GitHub Codespaces:
- `.env` <= Copied from `.env.sample`
- `.gitignore`
- `requirements.txt`
- Verify that you have created the Project in Microsoft Foundry.
- Verify that you have the following resources: Azure OpenAI, deployed the necessary models, AI Search, Document Intelligence, Azure Blob.

If working on a local workstation:
If working directly on a local workstation:

- Verify that you have Python and Conda installed
- Verify that you can run Jupyter Notebooks in Visual Studio Code or Azure Machine Learning Studio
@@ -260,8 +274,6 @@
- `.env` <= Renamed from `.env.sample`
- `.gitignore`
- `requirements.txt`
- Verify that you have created the Project in your Microsoft Foundry.
- Verify that you have the following resources: Azure OpenAI, deployed the necessary models, AI Search, Document Intelligence, Azure Blob.

## Learning Resources

46 changes: 32 additions & 14 deletions 066-OpenAIFundamentals/Student/Challenge-02.md
@@ -76,24 +76,42 @@ Scenario: You are a product manager at a multinational tech company, and your te
### 2.4 Model Router
#### Student Task 2.4
- Navigate to Microsoft Foundry and deploy an instance of model router in the same project as your other models
- In Chat Playground use the model router deployment and prompt it with a variety of questions ranging simple to difficult. You can use the sample prompts below or come up with your own! Note how different models are used for each query (you can see this switch in the metadata on top of the prompt).
- After trying the below prompts navigate to a browser window and open Copilot. Ask Copilot the pricing for the three different models each query used. Note the price difference for each model. The smart routing is optimizing cost by using light weight models (which are cheaper) for the easier prompts!
- In the Chat Playground, use the model router deployment and prompt it with a variety of questions ranging from simple to difficult. You will need to try some prompts that demonstrate the model router in action.



##### ⚡ Simple Transformations (should route to small/cheap models)

These are intentionally lightweight.

**Examples**

```Summarize this sentence in five words: The cat slept on the warm windowsill.``` <br>
```Convert this list into comma‑separated values: apples, bananas, pears, grapes```<br>
```Classify the sentiment: I guess it’s fine, whatever.```<br>
```Rewrite this in past tense: I walk to the store.```

Simple Prompt:
##### 🧠 Moderate Reasoning Tasks (often routes to mid‑tier models)

```
What is the capital of the United States?
```
Medium Prompt:
These require some structure but not deep reasoning.

```
Given a hybrid cloud architecture with latency-sensitive workloads, how would you design a multi-region failover strategy using Azure services?
```
Difficult Prompt:
**Examples**

```Explain the difference between throughput and latency to a non‑technical audience.``` <br>
```Generate 10 creative marketing slogans for a local bakery.```

##### 🧩 Deep Reasoning / Multi‑Step Logic

These are the ones that really test the router.

**Examples**

```A factory produces widgets using three machines with different failure rates. If machine A fails 3% of the time, B fails 1%, and C fails 0.5%, and they operate in sequence, what is the probability a widget is defective? Show your reasoning.``` <br>
```Design a 4‑week onboarding curriculum for new data engineers, including prerequisites, labs, and assessments.```
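For reference, the defect-probability prompt above has a closed-form answer (assuming the three machines fail independently), which you can use to check the router's output:

```python
# Probability a widget is defective when any one machine failing ruins it
p_a, p_b, p_c = 0.03, 0.01, 0.005

# P(defective) = 1 - P(all three machines succeed)
p_defective = 1 - (1 - p_a) * (1 - p_b) * (1 - p_c)
print(round(p_defective, 4))  # 0.0445, i.e. about 4.45%
```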

- Notice how different models are used for each query (you can see this switch in the metadata at the top of the prompt).
- After trying the prompts above, navigate to a browser window and open Copilot. Ask Copilot for the pricing of the different models each query used. Note the price difference for each model. Smart routing optimizes cost by using lightweight models (which are cheaper) for the easier prompts!

```
Generate a Bicep script to deploy a secure, autoscaling AKS cluster with Azure Entra ID integration and private networking.
```
## Success Criteria

To complete this challenge successfully, you should be able to:
2 changes: 1 addition & 1 deletion 066-OpenAIFundamentals/Student/Challenge-05.md
@@ -213,7 +213,7 @@ As you move forward, remember the significance of grounding responses in accurat
## Learning Resources

- [Overview of Responsible AI practices for Azure OpenAI models](https://learn.microsoft.com/en-us/legal/cognitive-services/openai/overview)
- [Azure Cognitive Services - What is Content Filtering](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/concepts/content-filter)
- [Azure AI Services - What is Content Filtering](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/concepts/content-filter)
- [Azure AI Content Safety tool](https://learn.microsoft.com/en-us/azure/cognitive-services/content-safety/overview)
- [Azure Content Safety Annotations feature](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/concepts/content-filter#annotations-preview)
- [OpenAI PII Detection Plugin](https://github.com/openai/chatgpt-retrieval-plugin/tree/main#plugins)
@@ -257,6 +257,7 @@ module document 'modules/document.bicep' = {
name: 'document-${suffix}'
location: location
customSubDomainName: toLower('document-intelligence-${suffix}')
userObjectId: userObjectId
}
}

@@ -7,6 +7,9 @@ param location string
@description('Custom subdomain name for the Azure Document Intelligence.')
param customSubDomainName string

@description('Specifies the object id of a Microsoft Entra ID user. In general, this is the object id of the system administrator who deploys the Azure resources.')
param userObjectId string = ''

resource account 'Microsoft.CognitiveServices/accounts@2025-09-01' = {
name: name
location: location
@@ -20,4 +23,22 @@ resource account 'Microsoft.CognitiveServices/accounts@2025-09-01' = {
}
}

// Cognitive Services Data Reader role - required for reading Cognitive Services data plane resources
// https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#ai--machine-learning
resource cognitiveServicesDataReaderRoleDefinition 'Microsoft.Authorization/roleDefinitions@2022-04-01' existing = {
name: 'b59867f0-fa02-499b-be73-45a86b5b3e1c'
scope: subscription()
}

// This role assignment grants the user permissions to read Cognitive Services data plane resources
resource cognitiveServicesDataReaderRoleAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = if (!empty(userObjectId)) {
name: guid(account.id, cognitiveServicesDataReaderRoleDefinition.id, userObjectId)
scope: account
properties: {
roleDefinitionId: cognitiveServicesDataReaderRoleDefinition.id
principalType: 'User'
principalId: userObjectId
}
}

output endpoint string = account.properties.endpoint
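Since `userObjectId` defaults to an empty string (which skips the role assignment), the deployer would typically pass their own object id at deployment time. A sketch of how that might look with the Azure CLI (the resource group and template file names are placeholders, not from this repo):

```shell
# Look up the signed-in user's object id, then pass it to the deployment
USER_OBJECT_ID=$(az ad signed-in-user show --query id -o tsv)

az deployment group create \
  --resource-group <your-resource-group> \
  --template-file main.bicep \
  --parameters userObjectId=$USER_OBJECT_ID
```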