
Simple Chat Summary Sample Learning App

Watch the Chat Summary Quick Start Video.

IMPORTANT: This learning sample is for educational purposes only and should not be used in any production use case. It is intended to highlight concepts of Semantic Kernel, not architectural or security design practices.

Running the sample

  1. You will need an OpenAI API key or an Azure OpenAI Service key for this sample.
  2. Ensure the KernelHttpServer sample is already running at http://localhost:7071. If not, follow the steps to start it here.
  3. Copy .env.example into a new file named .env.

    Note: Samples are configured to use chat completion AI models (e.g., gpt-3.5-turbo, gpt-4). See https://platform.openai.com/docs/models/model-endpoint-compatibility for chat completion model options.

  4. Run yarn install (if you have never run the sample before) and then yarn start from the command line.
  5. A browser will open automatically; otherwise, navigate to http://localhost:3000 to use the sample.

Working with Secrets: KernelHttpServer's Readme has a note on safely working with keys and other secrets.

About the Simple Chat Summary Sample

The Simple Chat Summary sample shows the power of semantic functions applied to a chat conversation.

The sample highlights the SummarizeConversation, GetConversationActionItems, and GetConversationTopics native functions in the Conversation Summary Skill. Each function calls OpenAI to review the information in the chat window and produce insights.
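The web app reaches these functions over HTTP rather than running the kernel in the browser. As a rough illustration only (the actual route, headers, and payload are defined in this sample's ChatInteraction.tsx, AISummary.tsx, and Skills.ts; the endpoint path and request shape below are assumptions), a call to SummarizeConversation through the locally running KernelHttpServer might look like this:

```typescript
// Rough sketch only: the route, headers, and payload shape here are assumptions,
// not the sample's actual API. See AISummary.tsx and Skills.ts for the real requests.
const FUNCTION_URI = 'http://localhost:7071'; // KernelHttpServer from step 2

async function summarizeConversation(chatTranscript: string): Promise<string> {
  const response = await fetch(
    // Hypothetical endpoint path for invoking a function in a skill
    `${FUNCTION_URI}/api/skills/ConversationSummarySkill/invoke/SummarizeConversation`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      // Hypothetical payload: the chat transcript as the function's input value
      body: JSON.stringify({ value: chatTranscript }),
    },
  );
  if (!response.ok) {
    throw new Error(`Kernel request failed: ${response.status}`);
  }
  // Assumed response shape: { value: string }
  const result: { value: string } = await response.json();
  return result.value;
}
```

GetConversationActionItems and GetConversationTopics would be invoked the same way, with a different function name in the request.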

The chat data can be loaded from this data file, which you can edit, or you can simply add more messages to the chat while you are on the page.

Caution

Each function calls OpenAI, which consumes tokens that you will be billed for.

[Screenshot: chat-summary sample]

Next Steps: Try a more advanced sample app

Book creator – learn how Planner and chaining of semantic functions can be used in your app.

Deeper Learning Tips

  • Try modifying the array exported as ChatThread in ChatThread.ts to alter the conversation that is supplied by default (see the sketch after this list).
  • View loadSummarySkill, loadActionItemsSkill, and loadTopicsSkill in ChatInteraction.tsx to see fetch requests that POST skills to the Semantic Kernel hosted by the Azure Function.
  • Notice how AISummary.tsx makes POST requests to the Azure Function to invoke skills that were previously added. Also take note of the Skill definition and configuration in Skills.ts; these skills were copied from the skills folder and added into the project to create a simple first-run experience.
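For the first tip above, here is a minimal sketch of the kind of array ChatThread.ts might export. The property names (author, body) are assumptions for illustration only; check ChatThread.ts for the actual shape before editing it.

```typescript
// Illustrative only: the real ChatThread.ts may use different property names.
interface ChatMessage {
  author: string; // assumed field name
  body: string;   // assumed field name
}

// Swap in your own conversation here to change the default chat content.
export const ChatThread: ChatMessage[] = [
  { author: 'Jane', body: 'Can we ship the summary feature this sprint?' },
  { author: 'Sam', body: 'Yes, if the release notes are done by Friday.' },
  { author: 'Jane', body: 'Great, let us also schedule a demo for Monday.' },
];
```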