---
title: How to recognize intents with custom entity pattern matching
titleSuffix: Azure AI services
description: In this guide, you learn how to recognize intents and custom entities from simple patterns.
author: chschrae
manager: travisw
ms.service: azure-ai-speech
ms.topic: how-to
ms.date: 1/21/2024
ms.author: chschrae
zone_pivot_groups: programming-languages-set-thirteen
ms.custom: devx-track-cpp, devx-track-csharp, mode-other, devx-track-extended-java, linux-related-content
---
The Azure AI services Speech SDK has a built-in feature that provides intent recognition with simple language pattern matching. An intent is something the user wants to do: close a window, mark a checkbox, insert some text, and so on.
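To make the idea concrete before diving into the SDK, here's a minimal sketch in plain Java (not the Speech SDK API) of how a simple language pattern can be treated as an anchored template whose `{placeholder}` captures an entity value. The pattern text and names are illustrative only.

```java
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PatternSketch {
    // Illustration only: match an utterance against a pattern such as
    // "Take me to floor {floorName}" and extract the entity value.
    static Optional<String> matchEntity(String pattern, String entityName, String utterance) {
        // Quote the literal parts of the pattern, then turn the entity
        // placeholder into a regex capture group.
        String regex = "^" + Pattern.quote(pattern)
                .replace("{" + entityName + "}", "\\E(.+)\\Q") + "$";
        Matcher m = Pattern.compile(regex, Pattern.CASE_INSENSITIVE).matcher(utterance);
        return m.matches() ? Optional.of(m.group(1)) : Optional.empty();
    }

    public static void main(String[] args) {
        System.out.println(matchEntity(
            "Take me to floor {floorName}", "floorName", "Take me to floor seven"));
        // prints Optional[seven]
    }
}
```

Because the template is anchored at both ends, the utterance has to fit the pattern as a whole; this is the sense in which pattern matching is stricter than a conversational language understanding model.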
In this guide, you use the Speech SDK to develop a console application that derives intents from speech utterances spoken through your device's microphone. You learn how to:
> [!div class="checklist"]
>
> - Create a Visual Studio project referencing the Speech SDK NuGet package
> - Create a speech configuration and get an intent recognizer
> - Add intents and patterns via the Speech SDK API
> - Add custom entities via the Speech SDK API
> - Use asynchronous, event-driven continuous recognition
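The intent and entity steps above can be sketched without the SDK. The following toy matcher, written in plain Java with made-up names, shows the shape of the data involved: each intent owns a few phrase patterns, and a "list entity" restricts a `{placeholder}` to a fixed set of values, roughly analogous to the SDK's strict list entities.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

public class IntentSketch {
    // Illustration only, not the Speech SDK API. Entity values are assumed
    // to be plain words (no regex metacharacters).
    static String recognizeIntent(Map<String, String[]> intents,
                                  Map<String, String[]> listEntities,
                                  String utterance) {
        for (var intent : intents.entrySet()) {
            for (String phrase : intent.getValue()) {
                String regex = "^" + Pattern.quote(phrase) + "$";
                for (var entity : listEntities.entrySet()) {
                    // Replace the placeholder with an alternation of the
                    // allowed values, e.g. (lobby|ground|one|two).
                    String alternatives = String.join("|", entity.getValue());
                    regex = regex.replace("{" + entity.getKey() + "}",
                                          "\\E(" + alternatives + ")\\Q");
                }
                if (Pattern.compile(regex, Pattern.CASE_INSENSITIVE)
                        .matcher(utterance).matches()) {
                    return intent.getKey();
                }
            }
        }
        return null; // no intent matched
    }

    public static void main(String[] args) {
        Map<String, String[]> intents = new LinkedHashMap<>();
        intents.put("ChangeFloors", new String[] {
            "Take me to floor {floorName}", "Go to floor {floorName}" });
        intents.put("OpenDoor", new String[] { "Open the door" });

        Map<String, String[]> entities = new LinkedHashMap<>();
        entities.put("floorName", new String[] { "lobby", "ground", "one", "two" });

        System.out.println(recognizeIntent(intents, entities, "Go to floor lobby"));  // ChangeFloors
        System.out.println(recognizeIntent(intents, entities, "Go to floor seven"));  // null
    }
}
```

In the real SDK the equivalent data lives in a pattern matching model attached to the intent recognizer; the guide below shows the actual API calls for each language.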
Use pattern matching if:
- You're only interested in matching strictly what the user said. These patterns match more aggressively than conversational language understanding (CLU).
- You don't have access to a CLU model, but still want intents.
For more information, see the pattern matching overview.
Be sure you have the following items before you begin this guide:
- An Azure AI services resource or a Unified Speech resource
- Visual Studio 2019 (any edition)
::: zone pivot="programming-language-csharp"
[!INCLUDE csharp]
::: zone-end

::: zone pivot="programming-language-cpp"
[!INCLUDE cpp]
::: zone-end

::: zone pivot="programming-language-java"
[!INCLUDE java]
::: zone-end