Model inference is a powerful technique for inspecting a process and gaining insight into what that process is capable of by analyzing the traces it generates. To test the sensitivity and generalization of model inference, we generate a large amount of traces from intent flows in multi-turn question-answering conversations. We perform both qualitative and quantitative analysis of the inferred models and draw several intuitive observations from them. The experiments indicate that, given a large amount of traces, model inference can handle the complexity of human conversations and provide insight into several patterns of QA dialogues.
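As a minimal sketch of the idea (not the repository's actual algorithm), one simple form of model inference from intent traces is to learn a first-order transition model: count how often each intent follows another across conversations, then normalize the counts into probabilities. The intent labels below are hypothetical examples:

```python
from collections import defaultdict

def infer_model(traces):
    """Infer a first-order transition model from intent traces.

    Each trace is a sequence of intent labels observed in one
    multi-turn QA conversation. Illustrative sketch only; the
    repository's inference method may differ.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for trace in traces:
        # Count each adjacent (source intent, next intent) pair.
        for src, dst in zip(trace, trace[1:]):
            counts[src][dst] += 1
    # Normalize counts into transition probabilities per source intent.
    model = {}
    for src, dsts in counts.items():
        total = sum(dsts.values())
        model[src] = {dst: n / total for dst, n in dsts.items()}
    return model

# Hypothetical intent traces from two QA dialogues.
traces = [
    ["original_question", "clarification", "answer"],
    ["original_question", "answer", "follow_up", "answer"],
]
model = infer_model(traces)
```

With these two traces, "original_question" is followed by "clarification" and "answer" with probability 0.5 each; scaling to many traces yields a transition graph that can be inspected for recurring dialogue patterns.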
forked from prdwb/Model-Inference
geshijoker/Model-Inference
About
Final project for CS520
Languages
- Jupyter Notebook 67.7%
- Python 32.3%