Feature: Copilot Chat (microsoft#357)
- Adds frontend webapp (React)
- Adds backend SKWebApi (SK REST API Service) for the CopilotChat sample
- Adds CosmosDB connector memory store

---------

Co-authored-by: Craig Presti <146438+craigomatic@users.noreply.github.com>
Co-authored-by: Teresa Hoang <125500434+teresaqhoang@users.noreply.github.com>
Co-authored-by: Tao Chen <TaoChenOSU@users.noreply.github.com>
Co-authored-by: Gil LaHaye <gillahaye@microsoft.com>
Co-authored-by: tehoang <tehoang@microsoft.com>
Co-authored-by: Lee Miller <lemiller@microsoft.com>
Co-authored-by: Devis Lucato <dluc@users.noreply.github.com>
Co-authored-by: amsacha <amsacha@microsoft.com>
Co-authored-by: microsoftShannon <100870671+microsoftShannon@users.noreply.github.com>
Co-authored-by: Roger Barreto <19890735+RogerBarreto@users.noreply.github.com>
Co-authored-by: SergeyMenshykh <68852919+SergeyMenshykh@users.noreply.github.com>
Co-authored-by: Gina Triolo <51341242+gitri-ms@users.noreply.github.com>
Co-authored-by: Abby Harrison <54643756+awharrison-28@users.noreply.github.com>
Co-authored-by: Abby Harrison <abby.harrison@microsoft.com>
Co-authored-by: Devis Lucato <devis@microsoft.com>
16 people committed Apr 7, 2023
0 parents commit d5e9e99
Showing 63 changed files with 14,271 additions and 0 deletions.
133 changes: 133 additions & 0 deletions samples/apps/copilot-chat-app/README.md
@@ -0,0 +1,133 @@
# Copilot Chat Sample Application
> **IMPORTANT:** This learning sample is for educational purposes only and should
not be used in any production use case. It is intended to highlight concepts of
Semantic Kernel, not architectural or security design practices.

## About the Copilot
The Copilot Chat sample allows you to build your own integrated large language
model chatbot. This is an enriched intelligence app with multiple dynamic
components, including command messages, user intent detection, and memories.

The chat prompt and response will evolve as the conversation between the user
and the application proceeds. This chat experience is a chat skill containing
multiple functions that work together to construct the final prompt for each
exchange.


![UI Sample](images/UI-Sample.png)

## Dependencies

Before following these instructions, please ensure your development environment
and these components are functional:
1. [Visual Studio Code](https://code.visualstudio.com/Download)
2. [Git](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
3. [.NET 6.0](https://dotnet.microsoft.com/en-us/download/dotnet/6.0)
4. [Node.js](https://nodejs.org/en/download)
5. [Yarn](https://classic.yarnpkg.com/lang/en/docs/install)


## Running the Sample
1. You will need an [OpenAI API key](https://platform.openai.com/account/api-keys)
or an Azure OpenAI Service key for this sample.
2. You will need an application registration.
[Follow the steps to register an app here.](https://learn.microsoft.com/en-us/azure/active-directory/develop/quickstart-register-app)

1. Select Single-page application (SPA) as the platform type; the redirect
URI will be `http://localhost:3000`
2. Select `Accounts in any organizational directory and personal Microsoft Accounts`
as the supported account types for this sample.
3. Make a note of the Application (client) ID from the Azure Portal; we will
use it later.
3. The sample uses two applications: a front-end web UI and a back-end API server.
First, let's set up and verify that the back-end API server is running.

1. Navigate to `\samples\apps\copilot-chat-app\SKWebApi`
2. Update `appsettings.json` with these settings (a sample sketch of these settings appears after this list):

* If you wish to run the back-end API server without an SSL certificate,
you may change `"UseHttp": false,` to `"UseHttp": true,` to override the
default use of HTTPS.

* Under the `"CompletionConfig"` block, make the following configuration
changes to match your instance:

* `"AIService": "AzureOpenAI",` or whichever option is appropriate for
your instance.
* `"DeploymentOrModelId": "text-davinci-003",` or whichever option is
appropriate for your instance.
* `"Endpoint":` your Azure OpenAI endpoint address, e.g. `https://contoso.openai.azure.com`.
If you are using OpenAI, leave this blank.
* You will insert your Azure OpenAI (or OpenAI) key in step 4, when
building the back-end API server.

* Under the `"EmbeddingConfig"` block, make the following configuration
changes to match your instance:
* `"AIService": "AzureOpenAI",` or whichever option is appropriate
for your instance.
* `"DeploymentOrModelId": "text-embedding-ada-002",` or whichever
option is appropriate for your instance.
* You will insert your Azure OpenAI (or OpenAI) key in step 4, when
building the back-end API server.

4. Build the back-end API server by following these instructions:
1. In the terminal navigate to `\samples\apps\copilot-chat-app\SKWebApi`
2. Run the command: `dotnet user-secrets set "CompletionConfig:Key" "YOUR OPENAI KEY or AZURE OPENAI KEY"`
3. Run the command: `dotnet user-secrets set "EmbeddingConfig:Key" "YOUR OPENAI KEY or AZURE OPENAI KEY"`
4. Execute the command `dotnet build`
5. Once the build is complete, execute the command `dotnet run`
6. Test the back-end server to confirm it is running.
* Open a web browser, and navigate to `https://localhost:40443/probe`
* You should see a confirmation message: `Semantic Kernel service is up and running`

>Note: You may need to accept the locally signed certificate on your machine
in order to see this message. It is important to do this, as your browser may
need to accept the certificate before allowing the WebApp to communicate
with the backend.

>Note: You may need to acknowledge the Windows Defender Firewall prompt and allow
the app to communicate over private or public networks as appropriate.

5. Now that the back-end API server is set up and confirmed to be running, let's
proceed with setting up the front-end WebApp.
1. Navigate to `\samples\apps\copilot-chat-app\webapp`
2. Copy `.env.example` into a new file named `.env` and make the
following configuration change to match your instance
(a sample `.env` sketch appears after this list):
3. Use the Application (client) ID from the Azure Portal steps above and
paste the GUID into the `.env` file after `REACT_APP_CHAT_CLIENT_ID=`
4. Execute the command `yarn install`
5. Execute the command `yarn start`

6. Wait for the startup to complete.
7. With the back end and front end running, your web browser should automatically
launch and navigate to `https://localhost:3000`
8. Sign in with your Microsoft work or personal account details.
9. Grant permission to use your account details; this is normally just to
read your account name.
10. If you experience any errors or issues, consult the troubleshooting
section below.
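
For reference, here is a minimal sketch of the `appsettings.json` settings described in step 3. The deployment names, endpoint, and `Label` values shown are illustrative assumptions, the field names follow the `AIServiceConfig` class added in this change, and the `Key` values are left empty on purpose because keys are supplied via `dotnet user-secrets` in step 4:

```json
{
  "UseHttp": false,
  "CompletionConfig": {
    "Label": "Completion",
    "AIService": "AzureOpenAI",
    "DeploymentOrModelId": "text-davinci-003",
    "Endpoint": "https://contoso.openai.azure.com",
    "Key": ""
  },
  "EmbeddingConfig": {
    "Label": "Embedding",
    "AIService": "AzureOpenAI",
    "DeploymentOrModelId": "text-embedding-ada-002",
    "Endpoint": "https://contoso.openai.azure.com",
    "Key": ""
  }
}
```

The shipped `appsettings.json` may contain additional settings; this sketch covers only the fields mentioned above.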
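
Similarly, a minimal sketch of the webapp `.env` file described in step 5, assuming `REACT_APP_CHAT_CLIENT_ID` is the only value you need to change (the GUID below is a placeholder for your Application (client) ID):

```
REACT_APP_CHAT_CLIENT_ID=00000000-0000-0000-0000-000000000000
```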

> **CAUTION:** Each chat interaction will call OpenAI, which will consume tokens that you will be billed for.

## Troubleshooting
![](images/Cert-Issue.png)

If you are stopped at an error message similar to the one above, your browser
may be blocking the front-end access to the back end while waiting for your
permission to connect.
To resolve this, try the following:

1. Confirm the backend service is running by opening a web browser, and navigating
to `https://localhost:40443/probe`
2. You should see a confirmation message: `Semantic Kernel service is up and running`
3. If your browser asks you to acknowledge the risks of visiting an insecure
website, you must acknowledge this before the front end will be allowed to
connect to the back-end server. Acknowledge it, and navigate until you see
the message `Semantic Kernel service is up and running`.
4. Return to your original browser window, or navigate to `https://localhost:3000`,
and refresh the page. You should now successfully see the Copilot Chat
application and can interact with the prompt.

* If you continue to have trouble with SSL-based connections, you may wish to
run the back-end API server without an SSL certificate: change
`"UseHttp": false,` to `"UseHttp": true,` to override the default use of HTTPS.
43 changes: 43 additions & 0 deletions samples/apps/copilot-chat-app/SKWebApi/Config/AIServiceConfig.cs
@@ -0,0 +1,43 @@
// Copyright (c) Microsoft. All rights reserved.

// TODO: align with SK naming and expand to have all fields from both AzureOpenAIConfig and OpenAIConfig
// Or actually split this into two classes

namespace SemanticKernel.Service.Config;

#pragma warning disable CA1812 // Avoid uninstantiated internal classes - Instantiated by deserializing JSON
internal class AIServiceConfig
#pragma warning restore CA1812 // Avoid uninstantiated internal classes
{
    public const string OpenAI = "OPENAI";
    public const string AzureOpenAI = "AZUREOPENAI";

    public string Label { get; set; } = string.Empty;
    public string AIService { get; set; } = string.Empty;
    public string DeploymentOrModelId { get; set; } = string.Empty;
    public string Endpoint { get; set; } = string.Empty;
    public string Key { get; set; } = string.Empty;

    // TODO: add orgId and pass it all the way down

    public bool IsValid()
    {
        switch (this.AIService.ToUpperInvariant())
        {
            case OpenAI:
                return
                    !string.IsNullOrEmpty(this.Label) &&
                    !string.IsNullOrEmpty(this.DeploymentOrModelId) &&
                    !string.IsNullOrEmpty(this.Key);

            case AzureOpenAI:
                return
                    !string.IsNullOrEmpty(this.Endpoint) &&
                    !string.IsNullOrEmpty(this.Label) &&
                    !string.IsNullOrEmpty(this.DeploymentOrModelId) &&
                    !string.IsNullOrEmpty(this.Key);
        }

        return false;
    }
}
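
As a rough, hypothetical illustration (not code from this change), a configuration block such as `CompletionConfig` from `appsettings.json` could be bound to this class and validated in startup code along these lines; the section name follows the README, while the surrounding `Program.cs` wiring is assumed:

```csharp
// Hypothetical startup sketch: bind the "CompletionConfig" section to AIServiceConfig
// and fail fast if required fields are missing. The sample's actual wiring lives
// elsewhere in this change.
var builder = WebApplication.CreateBuilder(args);

AIServiceConfig? completionConfig =
    builder.Configuration.GetSection("CompletionConfig").Get<AIServiceConfig>();

if (completionConfig is null || !completionConfig.IsValid())
{
    throw new InvalidOperationException("The completion backend settings are missing or invalid.");
}
```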
100 changes: 100 additions & 0 deletions samples/apps/copilot-chat-app/SKWebApi/Config/ConfigExtensions.cs
@@ -0,0 +1,100 @@
// Copyright (c) Microsoft. All rights reserved.

using System.Reflection;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.AI.Embeddings;
using Microsoft.SemanticKernel.Connectors.OpenAI.TextEmbedding;
using Microsoft.SemanticKernel.Reliability;

namespace SemanticKernel.Service.Config;

internal static class ConfigExtensions
{
    public static IHostBuilder ConfigureAppSettings(this IHostBuilder host)
    {
        string? environment = Environment.GetEnvironmentVariable("ASPNETCORE_ENVIRONMENT");

        host.ConfigureAppConfiguration((ctx, builder) =>
        {
            builder.AddJsonFile("appsettings.json", false, true);
            builder.AddJsonFile($"appsettings.{environment}.json", true, true);
            builder.AddEnvironmentVariables();
            builder.AddUserSecrets(Assembly.GetExecutingAssembly(), optional: true, reloadOnChange: true);
            // For settings from Key Vault, see https://learn.microsoft.com/en-us/aspnet/core/security/key-vault-configuration?view=aspnetcore-7.0
        });

        return host;
    }

    public static void AddCompletionBackend(this KernelConfig kernelConfig, AIServiceConfig serviceConfig)
    {
        if (!serviceConfig.IsValid())
        {
            throw new ArgumentException("The provided completion backend settings are not valid");
        }

        switch (serviceConfig.AIService.ToUpperInvariant())
        {
            case AIServiceConfig.AzureOpenAI:
                kernelConfig.AddAzureOpenAITextCompletionService(serviceConfig.Label, serviceConfig.DeploymentOrModelId,
                    serviceConfig.Endpoint, serviceConfig.Key);
                break;

            case AIServiceConfig.OpenAI:
                kernelConfig.AddOpenAITextCompletionService(serviceConfig.Label, serviceConfig.DeploymentOrModelId,
                    serviceConfig.Key);
                break;

            default:
                throw new ArgumentException("Invalid AIService value in completion backend settings");
        }
    }

    public static void AddEmbeddingBackend(this KernelConfig kernelConfig, AIServiceConfig serviceConfig)
    {
        if (!serviceConfig.IsValid())
        {
            throw new ArgumentException("The provided embeddings backend settings are not valid");
        }

        switch (serviceConfig.AIService.ToUpperInvariant())
        {
            case AIServiceConfig.AzureOpenAI:
                kernelConfig.AddAzureOpenAIEmbeddingGenerationService(serviceConfig.Label, serviceConfig.DeploymentOrModelId,
                    serviceConfig.Endpoint, serviceConfig.Key);
                break;

            case AIServiceConfig.OpenAI:
                kernelConfig.AddOpenAIEmbeddingGenerationService(serviceConfig.Label, serviceConfig.DeploymentOrModelId,
                    serviceConfig.Key);
                break;

            default:
                throw new ArgumentException("Invalid AIService value in embedding backend settings");
        }
    }

    public static IEmbeddingGeneration<string, float> ToTextEmbeddingsService(this AIServiceConfig serviceConfig,
        ILogger? logger = null,
        IDelegatingHandlerFactory? handlerFactory = null)
    {
        if (!serviceConfig.IsValid())
        {
            throw new ArgumentException("The provided embeddings backend settings are not valid");
        }

        switch (serviceConfig.AIService.ToUpperInvariant())
        {
            case AIServiceConfig.AzureOpenAI:
                return new AzureTextEmbeddingGeneration(serviceConfig.DeploymentOrModelId, serviceConfig.Endpoint,
                    serviceConfig.Key, "2022-12-01", logger, handlerFactory);

            case AIServiceConfig.OpenAI:
                return new OpenAITextEmbeddingGeneration(serviceConfig.DeploymentOrModelId, serviceConfig.Key,
                    log: logger, handlerFactory: handlerFactory);

            default:
                throw new ArgumentException("Invalid AIService value in embeddings backend settings");
        }
    }
}
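
Continuing the earlier sketch, a hypothetical use of the two `KernelConfig` extension methods above when assembling a kernel configuration (again, not the sample's actual startup code; `completionConfig` and `embeddingConfig` are assumed to be `AIServiceConfig` instances bound from configuration):

```csharp
// Hypothetical wiring: register completion and embedding backends on a KernelConfig
// using the extension methods above. The resulting KernelConfig can then be handed
// to the kernel builder used by the service.
var kernelConfig = new KernelConfig();
kernelConfig.AddCompletionBackend(completionConfig);   // AIServiceConfig for completions
kernelConfig.AddEmbeddingBackend(embeddingConfig);     // AIServiceConfig for embeddings
```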
@@ -0,0 +1,19 @@
// Copyright (c) Microsoft. All rights reserved.

// TODO: replace this controller with a better health check:
// https://learn.microsoft.com/en-us/aspnet/core/host-and-deploy/health-checks?view=aspnetcore-7.0

using Microsoft.AspNetCore.Mvc;

namespace SemanticKernel.Service.Controllers;

[Route("[controller]")]
[ApiController]
public class ProbeController : ControllerBase
{
    [HttpGet]
    public ActionResult<string> Get()
    {
        return "Semantic Kernel service up and running";
    }
}
@@ -0,0 +1,82 @@
// Copyright (c) Microsoft. All rights reserved.
// Licensed under the MIT License.

using Microsoft.AspNetCore.Mvc;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Orchestration;
using SemanticKernel.Service.Model;

namespace SemanticKernel.Service.Controllers;

[ApiController]
public class SemanticKernelController : ControllerBase
{
    private readonly IServiceProvider _serviceProvider;
    private readonly IConfiguration _configuration;
    private readonly ILogger<SemanticKernelController> _logger;

    public SemanticKernelController(IServiceProvider serviceProvider, IConfiguration configuration, ILogger<SemanticKernelController> logger)
    {
        this._serviceProvider = serviceProvider;
        this._configuration = configuration;
        this._logger = logger;
    }

    /// <summary>
    /// Invoke a Semantic Kernel function on the server.
    /// </summary>
    /// <remarks>
    /// We create and use a new kernel for each request.
    /// We feed the kernel the ask received via POST from the client
    /// and attempt to invoke the function with the given name.
    /// </remarks>
    /// <param name="kernel">Semantic kernel obtained through dependency injection</param>
    /// <param name="ask">Prompt along with its parameters</param>
    /// <param name="skillName">Skill in which function to invoke resides</param>
    /// <param name="functionName">Name of function to invoke</param>
    /// <returns>Results consisting of text generated by invoked function along with the variable in the SK that generated it</returns>
    [Route("skills/{skillName}/functions/{functionName}/invoke")]
    [HttpPost]
    [ProducesResponseType(StatusCodes.Status200OK)]
    [ProducesResponseType(StatusCodes.Status400BadRequest)]
    [ProducesResponseType(StatusCodes.Status404NotFound)]
    public async Task<ActionResult<AskResult>> InvokeFunctionAsync([FromServices] Kernel kernel, [FromBody] Ask ask,
        string skillName, string functionName)
    {
        this._logger.LogDebug("Received call to invoke {SkillName}/{FunctionName}", skillName, functionName);

        string semanticSkillsDirectory = this._configuration.GetSection(SKWebApiConstants.SemanticSkillsDirectoryConfigKey).Get<string>();
        if (!string.IsNullOrWhiteSpace(semanticSkillsDirectory))
        {
            kernel.RegisterSemanticSkills(semanticSkillsDirectory, this._logger);
        }

        kernel.RegisterNativeSkills(this._logger);

        ISKFunction? function = null;
        try
        {
            function = kernel.Skills.GetFunction(skillName, functionName);
        }
        catch (KernelException)
        {
            return this.NotFound($"Failed to find {skillName}/{functionName} on server");
        }

        // Put ask's variables in the context we will use
        var contextVariables = new ContextVariables(ask.Input);
        foreach (var input in ask.Variables)
        {
            contextVariables.Set(input.Key, input.Value);
        }

        // Run function
        SKContext result = await kernel.RunAsync(contextVariables, function!);
        if (result.ErrorOccurred)
        {
            return this.BadRequest(result.LastErrorDescription);
        }

        return this.Ok(new AskResult { Value = result.Result, Variables = result.Variables.Select(v => new KeyValuePair<string, string>(v.Key, v.Value)) });
    }
}
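
For illustration, a client might call this endpoint roughly as follows. The skill and function names are placeholders, the port matches the README's default of 40443, and the JSON shape of the request body is an assumption based on how `Ask.Input` and `Ask.Variables` are used above (the `Ask` model itself is not shown in this excerpt):

```csharp
// Hypothetical client-side call to the invoke endpoint; names and payload shape are assumptions.
using System.Net.Http.Json;

var client = new HttpClient { BaseAddress = new Uri("https://localhost:40443/") };

HttpResponseMessage response = await client.PostAsJsonAsync(
    "skills/ChatSkill/functions/Chat/invoke",
    new
    {
        input = "Hello, what can you help me with?",
        variables = new[] { new { key = "userId", value = "user-123" } }
    });

Console.WriteLine(await response.Content.ReadAsStringAsync());
```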