This repository was archived by the owner on Jun 30, 2022. It is now read-only.
docs/_docs/reference/skills/architecture.md (2 additions & 2 deletions)
@@ -23,7 +23,7 @@ Within an Enterprise, this could be creating one parent bot bringing together mu
Skills are themselves Bots, invoked remotely, and a Skill developer template (.NET, TS) is available to facilitate the creation of new Skills.
-A key design goal for Skills was to maintain the consistent Activity protocol and ensure the development experience was as close to any normal V4 SDK bot as possible. To that end, a Bot simply starts a `SkilllDialog` which abstracts the skill invocation mechanics.
+A key design goal for Skills was to maintain the consistent Activity protocol and ensure the development experience was as close to any normal V4 SDK bot as possible. To that end, a Bot simply starts a `SkillDialog` which abstracts the skill invocation mechanics.
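To make the abstraction concrete, the idea behind a skill dialog can be sketched as follows. Everything here is an illustrative stand-in, not the actual botbuilder-solutions `SkillDialog` API: the dialog forwards each user activity to a remote skill (itself a bot) and relays the skill's replies back to the conversation.

```typescript
// Minimal sketch of the SkillDialog idea: forward activities to a remote
// skill and relay its replies. All types and names are illustrative,
// not the real Bot Framework SDK surface.
interface Activity {
  type: string;
  text: string;
}

interface SkillEndpoint {
  // A remote skill is itself a bot: it accepts an activity and replies.
  invoke(activity: Activity): Activity[];
}

class SkillDialogSketch {
  constructor(private readonly skill: SkillEndpoint) {}

  // Forward the user's activity and collect the skill's responses.
  continueDialog(userActivity: Activity): Activity[] {
    return this.skill.invoke(userActivity);
  }
}

// Usage with a stub skill that echoes the utterance.
const echoSkill: SkillEndpoint = {
  invoke: (a) => [{ type: "message", text: `skill handled: ${a.text}` }],
};

const dialog = new SkillDialogSketch(echoSkill);
const replies = dialog.continueDialog({ type: "message", text: "find parking" });
console.log(replies[0].text); // -> "skill handled: find parking"
```

The point of the abstraction is that the host bot only ever talks to the dialog; the transport to the remote skill stays hidden behind it.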
## Invocation Flow
@@ -32,7 +32,7 @@ A key design goal for Skills was to maintain the consistent Activity protocol an
### Dispatcher
{:.no_toc}
-The Dispatcher plays a central role to enabling a Bot to understand how to best process a given utterance. The Dispatch through use of the [Skill CLI]({{site.baseurl}}/reference/botskills) is updated with triggering utterances for a given Skill and a new Dispatch intent is created for a given Skill. An example of a Dispatch model with a point of interest skill having been added is shown below.
+The Dispatcher plays a central role to enabling a Bot to understand how to best process a given utterance. The Dispatch through use of the [Skill CLI]({{site.baseurl}}/reference/skills/botskills) is updated with triggering utterances for a given Skill and a new Dispatch intent is created for a given Skill. An example of a Dispatch model with a point of interest skill having been added is shown below.
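The routing role described above can be sketched in a few lines. The intent names and result shape below are assumptions for illustration, not the generated Dispatch model: each registered skill contributes one dispatch-level intent, and the bot routes the utterance to whichever scored highest.

```typescript
// Sketch of dispatch-based routing: the dispatch model scores one intent
// per registered skill (plus the assistant's own intents), and the bot
// hands the utterance to the top-scoring one. Names are illustrative.
type DispatchResult = { intents: Record<string, { score: number }> };

function topIntent(result: DispatchResult): string {
  let best = "None";
  let bestScore = 0;
  for (const [intent, { score }] of Object.entries(result.intents)) {
    if (score > bestScore) {
      best = intent;
      bestScore = score;
    }
  }
  return best;
}

// A point-of-interest skill registration would add an intent along the
// lines of "l_PointOfInterest" (hypothetical name) to the model.
const result: DispatchResult = {
  intents: {
    l_PointOfInterest: { score: 0.92 },
    l_General: { score: 0.31 },
  },
};
console.log(topIntent(result)); // -> "l_PointOfInterest"
```

Routing then reduces to a switch on the top intent: skill-level intents forward to the matching skill, while the remainder stay with the assistant's own dialogs.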
docs/_docs/reference/skills/automotive.md (43 additions & 43 deletions)
@@ -63,7 +63,7 @@ An example transcript file demonstrating the Skill in action can be found [here]
## Language Understanding (LUIS)
-LUIS models for the Skill are provided in .LU file format as part of the Skill. These are currently available in English with other languages to follow.
+LUIS models for the Skill are provided in `.lu` file format as part of the Skill. These are currently available in English with other languages to follow.
The following Top Level intents are available with the main `settings` LUIS model
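For orientation, a `.lu` file pairs each intent heading with example utterances in a markdown-like plain-text format. A minimal illustrative fragment (the intent name and utterances here are examples for the vehicle-settings domain, not the shipped model):

```
# VEHICLE_SETTINGS_CHANGE
- set the temperature to 21 degrees
- turn on the heated seats
```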
@@ -131,36 +131,36 @@ The MSBot tool will outline the deployment plan including location and SKU. Ensu
> After deployment is complete, it's **imperative** that you make a note of the .bot file secret provided as this will be required for later steps. The secret can be found near the top of the execution output and will be in purple text.
-- Update your `appsettings.json` file with the newly created .bot file name and .bot file secret.
+- Update your `appsettings.json` file with the newly created `.bot` file name and `.bot` file secret.
- Run the following command and retrieve the InstrumentationKey for your Application Insights instance and update `InstrumentationKey` in your `appsettings.json` file.
```
msbot list --bot YOURBOTFILE.bot --secret YOUR_BOT_SECRET
```
```json
-{
-"botFilePath": ".//YOURBOTFILE.bot",
-"botFileSecret": "YOUR_BOT_SECRET",
-"ApplicationInsights": {
-"InstrumentationKey": "YOUR_INSTRUMENTATION_KEY"
-}
+{
+  "botFilePath": ".//YOURBOTFILE.bot",
+  "botFileSecret": "YOUR_BOT_SECRET",
+  "ApplicationInsights": {
+    "InstrumentationKey": "YOUR_INSTRUMENTATION_KEY"
 }
+}
```
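The configuration steps above boil down to three values being present and well-formed at startup. A small validation sketch, using an assumed settings shape mirroring the JSON above (this is illustrative glue code, not the Virtual Assistant's actual startup logic):

```typescript
// Sketch: validate an appsettings-style object after deployment.
// Illustrative only; field names mirror the example JSON in these docs.
interface AppSettings {
  botFilePath: string;
  botFileSecret: string;
  ApplicationInsights: { InstrumentationKey: string };
}

function checkSettings(s: AppSettings): string[] {
  const problems: string[] = [];
  if (!s.botFilePath.endsWith(".bot")) problems.push("botFilePath should point at a .bot file");
  if (s.botFileSecret.length === 0) problems.push("botFileSecret is empty");
  if (s.ApplicationInsights.InstrumentationKey.length === 0)
    problems.push("InstrumentationKey is empty");
  return problems;
}

const settings: AppSettings = {
  botFilePath: ".//YOURBOTFILE.bot",
  botFileSecret: "YOUR_BOT_SECRET",
  ApplicationInsights: { InstrumentationKey: "YOUR_INSTRUMENTATION_KEY" },
};
console.log(checkSettings(settings).length); // -> 0
```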
-- Finally, add the .bot file paths for each of your language configurations (English only at this time).
+- Finally, add the `.bot` file paths for each of your language configurations (English only at this time).
-Once you have followed the deployment instructions above, open the provided .bot file with the Bot Framework Emulator.
+Once you have followed the deployment instructions above, open the provided `.bot` file with the Bot Framework Emulator.
### Adding the Skill to an existing Virtual Assistant deployment
@@ -169,33 +169,33 @@ Follow the instructions below to add the Automotive Skill to an existing Virtual
1. Update the Virtual Assistant deployment scripts.
- Add the additional automotive skill LUIS models to the bot.recipe file located within your assistant project: `assistant/DeploymentScripts/en/bot.recipe`
- Add dispatch references to the core LUIS intents for the skill within the **assistant/CognitiveModels/en/dispatch.lu** file as shown below. Only the vehicle settings model is required for dispatch. This enables the Dispatcher to understand your new capabilities and route utterances to your skill.
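As a rough illustration of the shape such a dispatch entry takes (the intent name and utterances below are assumptions for this sketch, not the tooling's generated output), a skill-level intent in `dispatch.lu` is paired with its triggering utterances:

```
# l_Automotive
- change the temperature to 21 degrees
- turn on the defroster
```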