Using Azure OpenAI endpoints natively in Microsoft Fabric

A useful feature in Microsoft Fabric is the ability to use Azure AI services, like Azure OpenAI, directly in Fabric without having to deploy a separate endpoint or service. Per the Fabric documentation, "Fabric seamlessly integrates with Azure AI services, allowing you to enrich your data with prebuilt AI models without any prerequisite. We recommend using this option as you can utilize your Fabric authentication to access AI services, and all usage is billed against your Fabric capacity." This solution is currently in public preview.

https://learn.microsoft.com/en-us/fabric/data-science/ai-services/ai-services-overview

This tutorial will walk you through implementing a simple use case leveraging the Fabric-native Azure OpenAI endpoint. Note that you will need a workspace in a Power BI Premium capacity or a Fabric F-SKU (F64 and above) to use the built-in Azure AI services. If you don't see some of the options mentioned in the tutorial, you may need to ensure Fabric is enabled for your tenant. The data in the tutorial comes from an open dataset containing sample reviews for a McDonald's restaurant.

To start, open Microsoft Fabric (www.powerbi.com) and create a new Lakehouse by selecting 'Create' in the top-left corner and then selecting 'Lakehouse', as shown in the image.

For the name of the Lakehouse, type Reviews_Lakehouse as shown below and then hit Create:

Now let's upload some data to use in the lakehouse. Download the attached CSV file, McDonalds Reviews, to your desktop. Then go back to your new Fabric Lakehouse and upload the data by selecting 'Upload Files', as shown in the following picture:

Select the file you just downloaded to upload:

You should see the file show up in the Files section of the Lakehouse if the upload was successful:

Now right click on the file and select 'Load to Tables' to load the data into a Fabric lakehouse table.

Select 'Load'

Now right-click the Tables folder in the lakehouse and select 'Refresh'. A new mcdonalds_reviews table should appear, created from the CSV.

Note that the table will have blanks in the review_topic and review_sentiment columns. We will have Azure OpenAI populate the data for those columns in just a minute.

Now we need to get the location of the table to use it in our notebook. To do that, right click on the table and select 'Properties'.

Now copy the last property on the right called 'ABFS path' as shown below. Save the path in a text file to be used later.
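For reference, the ABFS path to a lakehouse table generally follows the shape below. The workspace placeholder is an assumption for illustration; in your tenant the workspace and lakehouse segments may appear as GUIDs instead of names:

```
abfss://<workspace>@onelake.dfs.fabric.microsoft.com/Reviews_Lakehouse.Lakehouse/Tables/mcdonalds_reviews
```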

Now download the attached file 'Fabric OpenAI Restaurant Reviews.ipynb'.

Next, select the icon on the bottom left side of the Fabric UI and select 'Data Engineering'. Once in the Data Engineering area of Fabric, select the 'Import Notebook' icon. Select the 'Upload' icon to select the Fabric OpenAI Restaurant Reviews.ipynb file you just downloaded.

Now select and open the notebook you just imported to open it in Fabric.

On the left-hand side, select 'Lakehouses' to associate our Reviews_Lakehouse with the notebook. Select 'Add' to add a lakehouse; at the prompt, select 'Existing Lakehouse', check 'Reviews_Lakehouse', and then select 'Add'. You should then see something like below:

In the first cell of the notebook, paste the ABFS path you saved to the text file earlier as the value of the 'lakehouse_table' variable, as shown below:

Now run the first four cells in the notebook one at a time. Note that it may take a minute for the Fabric Spark engine to initialize the first time you run it. The second cell imports the OpenAI library used in the notebook, while the third cell specifies the categories we want Azure OpenAI to use to categorize the reviews. Note that we didn't have to train Azure OpenAI on what the categories represent ahead of time, and you can add new ones to the list if you would like. The fourth cell loads the data from our lakehouse table into a Spark dataframe.
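As a rough sketch of what those cells contain (the variable names follow the tutorial, but the path value and category list here are illustrative assumptions, not the notebook's verbatim contents):

```python
# Cell 1: the ABFS path you saved earlier (placeholder value shown here).
lakehouse_table = (
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
    "Reviews_Lakehouse.Lakehouse/Tables/mcdonalds_reviews"
)

# Cell 2: the OpenAI library is preinstalled on the Fabric Spark runtime.
# import openai

# Cell 3: categories for Azure OpenAI to choose from -- no prior training
# needed, and you can add your own entries to this list.
categories = ["food quality", "service", "cleanliness", "price", "wait time"]

# Cell 4: load the Delta table into a Spark dataframe (Fabric provides the
# `spark` session in notebooks, so this line only runs inside Fabric).
# df = spark.read.format("delta").load(lakehouse_table)
```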

Now run the fifth cell. This cell loops through each row in the table and asks Azure OpenAI to provide any topics relevant to the review along with an overall sentiment. You can see the prompt, or instructions, provided to Azure OpenAI on lines 10-15 of the cell. Note that we didn't have to provide an Azure OpenAI URL to call or a key, since Fabric takes care of all of that for us. We just provide the Azure OpenAI model we want to use, which is gpt-35-turbo in this case. Lines 57-63 of the notebook run a Delta 'merge' command to merge the new topics and sentiment returned by Azure OpenAI back into the existing table, using the unique _unit_id column as the key. Note that larger tables may take a while to run; in some cases it may be better to filter down to just the specific rows you need to send to Azure OpenAI.
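The core of that cell can be sketched as follows. The helper name build_prompt, the prompt wording, and the commented-out API and merge calls are illustrative assumptions rather than the notebook's exact code:

```python
def build_prompt(review_text, categories):
    """Assemble the instruction sent to Azure OpenAI for one review.

    Hypothetical helper: the notebook inlines its prompt, and its exact
    wording differs from this sketch.
    """
    return (
        "Classify the restaurant review below. Pick relevant topics only "
        f"from this list: {', '.join(categories)}. Also give an overall "
        "sentiment of positive, negative, or neutral.\n\n"
        f"Review: {review_text}"
    )

# Inside Fabric no endpoint URL or key is needed -- just the model name.
# A call might look like:
# response = openai.ChatCompletion.create(
#     deployment_id="gpt-35-turbo",
#     messages=[{"role": "user",
#                "content": build_prompt(row["review"], categories)}],
# )

# The results are then merged back into the Delta table keyed on _unit_id:
# from delta.tables import DeltaTable
# (DeltaTable.forPath(spark, lakehouse_table).alias("t")
#     .merge(updates_df.alias("u"), "t._unit_id = u._unit_id")
#     .whenMatchedUpdate(set={"review_topic": "u.review_topic",
#                             "review_sentiment": "u.review_sentiment"})
#     .execute())
```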

Now run the sixth cell, which selects from the table again and shows the rows updated by Azure OpenAI.

Now select our Reviews_Lakehouse icon on the left hand side of the screen to return to the lakehouse view. Don't worry if you don't see our new data in the table just yet. It's still using a cached copy of the table in this view.

On the top-right side there is a drop-down that says 'Lakehouse'. From the drop-down, select 'SQL Analytics Endpoint'.

In the SQL Analytics Endpoint you can run queries on the data, but for now just select the 'New Report' option. This will open the Power BI report development environment where you can create a report using the data in the table.
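For example, a first query against the enriched table might count reviews per sentiment. The query below is an assumption for illustration, not part of the repo; you can paste the SQL into the endpoint directly, or run it from the notebook with spark.sql:

```python
# Hypothetical example query -- counts reviews per sentiment label.
sentiment_query = """
SELECT review_sentiment, COUNT(*) AS review_count
FROM mcdonalds_reviews
GROUP BY review_sentiment
ORDER BY review_count DESC
"""

# In the notebook: display(spark.sql(sentiment_query))
```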

This tutorial walked you through importing some data, using Azure OpenAI in Fabric to classify it, and then using those new classifications in a Power BI report.
