@@ -14,7 +14,7 @@ The Syncfusion AI AssistView supports integration with [Gemini](https://ai.googl

## Getting Started With the AI AssistView control

Before integrating Gemini AI, ensure that the Syncfusion AI AssistView control is correctly rendered in your Vue application:
Before integrating Gemini AI, ensure that the Syncfusion AI AssistView control is correctly rendered in your MVC application:

[ MVC Getting Started Guide](../getting-started)

@@ -14,7 +14,7 @@ The Syncfusion AI AssistView supports integration with [LLM via Ollama](https://

## Getting Started With the AI AssistView control

Before integrating LLM Model, ensure that the Syncfusion AI AssistView control is correctly rendered in your Vue application:
Before integrating LLM Model, ensure that the Syncfusion AI AssistView control is correctly rendered in your MVC application:

[ MVC Getting Started Guide](../getting-started)

@@ -1,26 +1,26 @@
---
layout: post
title: Open AI in ##Platform_Name## AI AssistView Control | Syncfusion
description: Checkout and learn about Integration of Open AI in Syncfusion ##Platform_Name## AI AssistView control of Syncfusion Essential JS 2 and more.
title: Azure Open AI in ##Platform_Name## AI AssistView Control | Syncfusion
description: Checkout and learn about Integration of Azure Open AI in Syncfusion ##Platform_Name## AI AssistView control of Syncfusion Essential JS 2 and more.
platform: ej2-asp-core-mvc
control: Open AI
control: Azure Open AI
publishingplatform: ##Platform_Name##
documentation: ug
---

# Integration of Open AI With AI AssistView control
# Integration of Azure Open AI With AI AssistView control

The Syncfusion AI AssistView supports integration with [OpenAI](https://platform.openai.com/docs/overview), enabling advanced conversational AI features in your MVC applications.
The Syncfusion AI AssistView supports integration with [Azure Open AI](https://microsoft.github.io/PartnerResources/skilling/ai-ml-academy/resources/openai), enabling advanced conversational AI features in your MVC applications.

## Getting Started With the AI AssistView control

Before integrating Open AI, ensure that the Syncfusion AI AssistView control is correctly rendered in your Vue application:
Before integrating Azure Open AI, ensure that the Syncfusion AI AssistView control is correctly rendered in your MVC application:

[ MVC Getting Started Guide](../getting-started)

## Prerequisites

* OpenAI account to generate an API key for accessing the `OpenAI` API
* An Azure account with access to `Azure Open AI` services and a generated API key.
* [System requirements for ASP.NET MVC controls](https://ej2.syncfusion.com/aspnetmvc/documentation/system-requirements) to create MVC application

## Install Packages
@@ -33,35 +33,39 @@ NuGet\Install-Package Syncfusion.EJ2.MVC5

```

Install the Open AI package in the application using Package Manager Console.
Install the Azure Open AI packages in the application using the Package Manager Console.

```bash

NuGet\Install-Package OpenAI

NuGet\Install-Package Azure.AI.OpenAI
NuGet\Install-Package Azure.Core

```

## Generate API Key
## Configure Azure Open AI

1. Go to [Open AI](https://platform.openai.com/docs/overview) and sign in with your Google account. If you don’t have one, create a new account.
1. Log in to the [Azure Portal](https://portal.azure.com/#home) and navigate to your Azure Open AI resource.

2. Once logged in, click on your profile icon in the top-right corner and select `API Keys` from the dropdown menu.
2. Under Resource Management, select Keys and Endpoint to retrieve your API key and endpoint URL.

3. Click the `+ Create new secret key` button. You’ll be prompted to name the key (optional). Confirm to generate the key.
3. Copy the API key, endpoint, and deployment name (e.g., gpt-4o-mini). Ensure the API version (e.g., 2024-07-01-preview) matches your resource configuration.

4. Your API key will be displayed once. Copy it and store it securely, as it won’t be shown again.
4. Store these values securely, as they will be used in your application.

> `Security Note`: Never commit the API key to version control. Use environment variables or a secret manager for production.
> `Security Note`: Never expose your API key in client-side code for production applications. Use a server-side proxy or environment variables to manage sensitive information securely.

## Integration Open AI with AI AssistView
## Integration of Azure Open AI with AI AssistView

Add the following files to your application:

* Add your generated `API Key` at the line in .cs file
* Update the following configuration values with your Azure Open AI details:

```csharp

string apiKey = 'Place your API key here';
string endpoint = "Your_Azure_OpenAI_Endpoint";
string apiKey = "Your_Azure_OpenAI_API_Key";
string deploymentName = "Your_Deployment_Name";

```

@@ -74,12 +78,12 @@ string apiKey = 'Place your API key here';
{% endhighlight %}
{% endtabs %}

![Open AI](../../images/open-ai.png)
![Azure Open AI](../../images/open-ai.png)

## Run and Test

Run the application in the browser using the following command.

Build and run the app (Ctrl + F5).

Open `https://localhost:44321` to interact with your Open AI for dynamic response.
Open `https://localhost:44321` to interact with your Azure Open AI for dynamic response.
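For reference, the request the client-side snippets in this PR send to the server can be sketched as a small helper. The `/Home/GetAIResponse` path and the `{ Prompt }` payload shape are assumptions mirroring the `PromptRequest` model in the controllers below, not a published API:

```javascript
// Build the POST request that the server action (e.g. GetAIResponse) expects.
// The URL and { Prompt } payload shape are assumptions based on the
// controller code in this PR; adjust them to match your routes.
function buildPromptRequest(prompt) {
  if (!prompt || !prompt.trim()) {
    // Mirrors the server-side "Prompt cannot be empty." guard.
    throw new Error("Prompt cannot be empty.");
  }
  return {
    url: "/Home/GetAIResponse",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ Prompt: prompt.trim() }),
    },
  };
}
```

A caller would pass `buildPromptRequest(promptText)` straight into `fetch(req.url, req.options)` and stream the JSON response into the AssistView.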
@@ -14,7 +14,7 @@ The Syncfusion AI AssistView supports integration with [Gemini](https://ai.googl

## Getting Started With the AI AssistView control

Before integrating Gemini AI, ensure that the Syncfusion AI AssistView control is correctly rendered in your Vue application:
Before integrating Gemini AI, ensure that the Syncfusion AI AssistView control is correctly rendered in your ASP.NET Core application:

[ ASP.NET CORE Getting Started Guide](../getting-started)

@@ -14,7 +14,7 @@ The Syncfusion AI AssistView supports integration with [LLM via Ollama](https://

## Getting Started With the AI AssistView control

Before integrating LLM Model, ensure that the Syncfusion AI AssistView control is correctly rendered in your Vue application:
Before integrating the LLM model, ensure that the Syncfusion AI AssistView control is correctly rendered in your ASP.NET Core application:

[ CORE Getting Started Guide](../getting-started)

@@ -1,26 +1,26 @@
---
layout: post
title: Open AI in ##Platform_Name## AI AssistView Control | Syncfusion
description: Checkout and learn about Integration of Open AI in Syncfusion ##Platform_Name## AI AssistView control of Syncfusion Essential JS 2 and more.
title: Azure Open AI in ##Platform_Name## AI AssistView Control | Syncfusion
description: Checkout and learn about Integration of Azure Open AI in Syncfusion ##Platform_Name## AI AssistView control of Syncfusion Essential JS 2 and more.
platform: ej2-asp-core-mvc
control: Open AI
control: Azure Open AI
publishingplatform: ##Platform_Name##
documentation: ug
---

# Integration of Open AI With AI AssistView control
# Integration of Azure Open AI With AI AssistView control

The Syncfusion AI AssistView supports integration with [OpenAI](https://platform.openai.com/docs/overview), enabling advanced conversational AI features in your Core applications.
The Syncfusion AI AssistView supports integration with [Azure Open AI](https://microsoft.github.io/PartnerResources/skilling/ai-ml-academy/resources/openai), enabling advanced conversational AI features in your ASP.NET Core applications.

## Getting Started With the AI AssistView control

Before integrating Open AI, ensure that the Syncfusion AI AssistView control is correctly rendered in your Vue application:
Before integrating Azure Open AI, ensure that the Syncfusion AI AssistView control is correctly rendered in your ASP.NET Core application:

[ CORE Getting Started Guide](../getting-started)

## Prerequisites

* OpenAI account to generate an API key for accessing the `OpenAI` API
* An Azure account with access to `Azure Open AI` services and a generated API key.
* [System requirements for ASP.NET MVC controls](https://ej2.syncfusion.com/aspnetmvc/documentation/system-requirements) to create an ASP.NET Core application

## Install Packages
@@ -33,35 +33,39 @@ NuGet\Install-Package Syncfusion.EJ2.AspNet.Core

```

Install the Open AI package in the application using Package Manager Console.
Install the Azure Open AI packages in the application using the Package Manager Console.

```bash

NuGet\Install-Package OpenAI
NuGet\Install-Package Azure.AI.OpenAI
NuGet\Install-Package Azure.Core

```

## Generate API Key
## Configure Azure Open AI

1. Go to [Open AI](https://platform.openai.com/docs/overview) and sign in with your Google account. If you don’t have one, create a new account.
1. Log in to the [Azure Portal](https://portal.azure.com/#home) and navigate to your Azure Open AI resource.

2. Once logged in, click on your profile icon in the top-right corner and select `API Keys` from the dropdown menu.
2. Under Resource Management, select Keys and Endpoint to retrieve your API key and endpoint URL.

3. Click the `+ Create new secret key` button. You’ll be prompted to name the key (optional). Confirm to generate the key.
3. Copy the API key, endpoint, and deployment name (e.g., gpt-4o-mini). Ensure the API version (e.g., 2024-07-01-preview) matches your resource configuration.

4. Your API key will be displayed once. Copy it and store it securely, as it won’t be shown again.
4. Store these values securely, as they will be used in your application.

> `Security Note`: Never commit the API key to version control. Use environment variables or a secret manager for production.
> `Security Note`: Never expose your API key in client-side code for production applications. Use a server-side proxy or environment variables to manage sensitive information securely.

## Integration Open AI with AI AssistView
## Integration of Azure Open AI with AI AssistView

Add the following files to your application:

* Add your generated `API Key` at the line in .cs file
* Update the following configuration values with your Azure Open AI details:

```csharp

string apiKey = 'Place your API key here';
string endpoint = "Your_Azure_OpenAI_Endpoint";
string apiKey = "Your_Azure_OpenAI_API_Key";
string deploymentName = "Your_Deployment_Name";

```

@@ -74,12 +78,12 @@ string apiKey = 'Place your API key here';
{% endhighlight %}
{% endtabs %}

![Open AI](../../images/open-ai.png)
![Azure Open AI](../../images/open-ai.png)

## Run and Test

Run the application in the browser using the following command.

Build and run the app (Ctrl + F5).

Open `https://localhost:44321` to interact with your Open AI for dynamic response.
Open `https://localhost:44321` to interact with your Azure Open AI for dynamic response.
@@ -77,4 +77,3 @@ public class ToolbarItemModel
public string iconCss { get; set; }
}
}
}
@@ -48,16 +48,16 @@ public async Task<IActionResult> GetAIResponse([FromBody] PromptRequest request)

if (string.IsNullOrEmpty(responseText?.Text))
{
_logger.LogError("OpenAI API returned no text.");
return BadRequest("No response from OpenAI.");
_logger.LogError("Gemini API returned no text.");
return BadRequest("No response from Gemini.");
}

_logger.LogInformation("OpenAI response received: {Response}", responseText?.Text);
_logger.LogInformation("Gemini response received: {Response}", responseText?.Text);
return Json(responseText?.Text);
}
catch (Exception ex)
{
_logger.LogError("Exception in OpenAI call: {Message}", ex.Message);
_logger.LogError("Exception in Gemini call: {Message}", ex.Message);
return BadRequest($"Error generating response: {ex.Message}");
}
}
@@ -83,7 +83,6 @@
streamResponse(text);
})
.catch(error => {
console.error('Error fetching AI response:', error);
assistObj.addPromptResponse('⚠️ Something went wrong while connecting to the AI service. Please try again later.');
stopStreaming = true;
});
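The error path above can be factored into a small helper that produces the user-facing fallback. The message text mirrors the `addPromptResponse` call in this sample; the `retryable` classification (network `TypeError` or 5xx status) is an illustrative assumption, not part of the AssistView API:

```javascript
// Map a failed AI request to the fallback state used in this sample.
// The message mirrors the addPromptResponse text above; treating network
// TypeErrors and 5xx responses as retryable is an assumption for illustration.
function toPromptFallback(error) {
  const retryable =
    error && (error.name === "TypeError" || error.status >= 500);
  return {
    message:
      "⚠️ Something went wrong while connecting to the AI service. Please try again later.",
    retryable: Boolean(retryable),
    stopStreaming: true,
  };
}
```

Inside the `.catch`, the handler would call `toPromptFallback(error)`, pass `message` to `assistObj.addPromptResponse`, and set `stopStreaming` accordingly.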
@@ -79,7 +79,6 @@
streamResponse(text);
})
.catch(error => {
console.error('Error fetching AI response:', error);
assistObj.addPromptResponse('⚠️ Something went wrong while connecting to the AI service. Please try again later.');
stopStreaming = true;
});
@@ -1,5 +1,6 @@
using OpenAI;

using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;
namespace WebApplication4.Pages
{
public class IndexModel : PageModel
@@ -38,25 +39,32 @@ public async Task<IActionResult> OnPostGetAIResponse([FromBody] PromptRequest re
return BadRequest("Prompt cannot be empty.");
}

string apiKey = ""; // Replace with your OpenAI API key
var openAiClient = new OpenAIClient(apiKey);
var chatClient = openAiClient.GetChatClient("gpt-4o-mini"); // Use your preferred model, e.g., "gpt-4o-mini" or "gpt-4o"
string endpoint = ""; // Replace with your Azure OpenAI endpoint
string apiKey = ""; // Replace with your Azure OpenAI API key
string deploymentName = ""; // Replace with your Azure OpenAI deployment name (e.g., gpt-4o-mini)

ChatCompletion completion = await chatClient.CompleteChatAsync(request.Prompt);
string responseText = completion.Content[0].Text;
var credential = new AzureKeyCredential(apiKey);
var client = new AzureOpenAIClient(new Uri(endpoint), credential);
var chatClient = client.GetChatClient(deploymentName);

var chatCompletionOptions = new ChatCompletionOptions();
var completion = await chatClient.CompleteChatAsync(
new[] { new UserChatMessage(request.Prompt) },
chatCompletionOptions
);
string responseText = completion.Value.Content[0].Text;
if (string.IsNullOrEmpty(responseText))
{
_logger.LogError("OpenAI API returned no text.");
return BadRequest("No response from OpenAI.");
_logger.LogError("Azure OpenAI API returned no text.");
return BadRequest("No response from Azure OpenAI.");
}

_logger.LogInformation("OpenAI response received: {Response}", responseText);
_logger.LogInformation("Azure OpenAI response received: {Response}", responseText);
return new JsonResult(responseText);
}
catch (Exception ex)
{
_logger.LogError("Exception in Gemini call: {Message}", ex.Message);
_logger.LogError("Exception in Azure OpenAI call: {Message}", ex.Message);
return BadRequest($"Error generating response: {ex.Message}");
}
}
@@ -79,4 +87,3 @@ public class ToolbarItemModel
public string iconCss { get; set; }
}
}
}
@@ -1,4 +1,7 @@
using OpenAI;
using OpenAI.Chat;
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;

namespace AssistViewDemo.Controllers
{
@@ -37,25 +40,34 @@ public async Task<IActionResult> GetAIResponse([FromBody] PromptRequest request)
return BadRequest("Prompt cannot be empty.");
}

string apiKey = ""; // Replace with your OpenAI API key
var openAiClient = new OpenAIClient(apiKey);
var chatClient = openAiClient.GetChatClient("gpt-4o-mini"); // Use your preferred model, e.g., "gpt-4o-mini" or "gpt-4o"
// Azure OpenAI configuration
string endpoint = ""; // Replace with your Azure OpenAI endpoint
string apiKey = ""; // Replace with your Azure OpenAI API key
string deploymentName = ""; // Replace with your Azure OpenAI deployment name (e.g., gpt-4o-mini)

OpenAI.Chat.ChatCompletion completion = await chatClient.CompleteChatAsync(request.Prompt);
string responseText = completion.Content[0].Text;
var credential = new AzureKeyCredential(apiKey);
var client = new AzureOpenAIClient(new Uri(endpoint), credential);
var chatClient = client.GetChatClient(deploymentName);

var chatCompletionOptions = new ChatCompletionOptions();
var completion = await chatClient.CompleteChatAsync(
new[] { new UserChatMessage(request.Prompt) },
chatCompletionOptions
);

string responseText = completion.Value.Content[0].Text;
if (string.IsNullOrEmpty(responseText))
{
_logger.LogError("OpenAI API returned no text.");
return BadRequest("No response from OpenAI.");
_logger.LogError("Azure OpenAI API returned no text.");
return BadRequest("No response from Azure OpenAI.");
}

_logger.LogInformation("OpenAI response received: {Response}", responseText);
_logger.LogInformation("Azure OpenAI response received: {Response}", responseText);
return Json(responseText);
}
catch (Exception ex)
{
_logger.LogError("Exception in OpenAI call: {Message}", ex.Message);
_logger.LogError("Exception in Azure OpenAI call: {Message}", ex.Message);
return BadRequest($"Error generating response: {ex.Message}");
}
}