---
title: How to view orchestration workflow model details
description: Learn how to view details for your model and evaluate its performance.
titleSuffix: Azure AI services
author: jboback
manager: nitinme
ms.service: azure-ai-language
ms.topic: how-to
ms.date: 12/19/2023
ms.author: jboback
ms.custom: language-service-custom-classification
---
After model training is completed, you can view your model details and see how well it performs against the test set. Observing how well your model performed is called evaluation. The test set consists of data that wasn't introduced to the model during the training process.
> [!NOTE]
> If you use the **Automatically split the testing set from training data** option, your model evaluation results can differ each time you train a new model, because the test set is selected randomly from your utterances. To make sure the evaluation is calculated on the same test set every time you train a model, select the **Use a manual split of training and testing data** option when starting a training job, and define your **Testing set** when you add your utterances.
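To illustrate the manual split described above, here's a minimal sketch of utterances tagged with an explicit `dataset` value, as in a project import payload. The exact field names are an assumption based on the Language authoring import format; verify them against the current schema before relying on this shape.

```python
# Hypothetical utterances with a manual Train/Test split. The "dataset"
# field name is an assumption about the import schema, not a guarantee.
utterances = [
    {"text": "Book a flight to Paris", "intent": "BookFlight", "dataset": "Train"},
    {"text": "Reserve a seat to Cairo", "intent": "BookFlight", "dataset": "Test"},
    {"text": "What's the weather in Seattle?", "intent": "GetWeather", "dataset": "Train"},
]

# With a manual split, every training run evaluates against the same
# utterances marked "Test", so results stay comparable across runs.
test_set = [u["text"] for u in utterances if u["dataset"] == "Test"]
print(test_set)
```

Because the test set is fixed in the data itself, retraining never reshuffles it, which is what makes evaluation results repeatable.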
Before viewing a model's evaluation, you need:
- An orchestration workflow project.
- A successfully trained model.
See the project development lifecycle for more information.
On the view model details page, you can see all your models, their current training status, and the date they were last trained.
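The same model list is also available programmatically. The sketch below builds a request to the Language authoring REST API to list a project's trained models, each of which carries a label and last-trained date; the route, `api-version`, and placeholder values are assumptions, so confirm them against the current API reference.

```python
import urllib.request

# Placeholders: substitute your own resource endpoint, project name, and key.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
project = "<project-name>"

# Assumed route for listing trained models; check the authoring API reference
# for the exact path and a supported api-version before use.
url = (f"{endpoint}/language/authoring/analyze-conversations/"
       f"projects/{project}/models?api-version=2023-04-01")

request = urllib.request.Request(url, headers={
    "Ocp-Apim-Subscription-Key": "<your-key>",  # your resource key
})

# Send with urllib.request.urlopen(request) and read the JSON response,
# whose entries include each model's label and last-trained timestamp.
print(request.full_url)
```

This is useful when you want to track training status from a script rather than the portal.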
[!INCLUDE Model performance]
[!INCLUDE Evaluate model]
[!INCLUDE Load export model]
[!INCLUDE Delete model]
- As you review how your model performs, learn about the evaluation metrics that are used.
- If you're happy with your model's performance, you can deploy your model.