How to test our own dialogs,prompts #67

Closed
rjgmail88 opened this issue Feb 22, 2018 · 7 comments

@rjgmail88

@microsoftly, I looked over the code samples [1] from your README.md file, and they are really helpful for getting started. But I have a doubt about testing my own dialogs or prompts, which have a specific conversation flow. I see that in your examples you create a sample '/' dialog with prompts and test it with BotTester, but how do we test that our actual bot responds the way we want when sample data/input is provided?

[1]

    it('can test prompts', () => {
        bot.dialog('/', [(session) => {
            Prompts.text(session, 'Hi there! Tell me something you like');
        }, (session, results) => {
            session.send(`${results.response} is pretty cool.`);
            Prompts.text(session, 'Why do you like it?');
        }, (session) => session.send('Interesting. Well, that\'s all I have for now')]);

        return new BotTester(bot)
            .sendMessageToBot('Hola!', 'Hi there! Tell me something you like')
            .sendMessageToBot('The sky', 'The sky is pretty cool.', 'Why do you like it?')
            .sendMessageToBot('It\'s blue', 'Interesting. Well, that\'s all I have for now')
            .runTest();
    });
@microsoftly
Owner

The framework simulates a conversation through the bot. You can build up the conversation to get to any point with your bot. You are not limited to using one BotTester instance to move the conversation flow around. Here's an example of a test where I have a BotTester utility function that will move the conversation forward to a default state for a set of tests.
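For illustration, here is a rough sketch of what such a utility might look like, reusing the prompt flow from the README example above. advancePastGreeting is a made-up name, and the sketch assumes sendMessageToBot returns the tester so the chain can be continued, as the README's chained calls suggest:

    // Hypothetical helper: walk the bot through its opening prompts so every
    // test in the suite can start from the same, known conversation state.
    function advancePastGreeting(bot) {
        return new BotTester(bot)
            .sendMessageToBot('Hola!', 'Hi there! Tell me something you like')
            .sendMessageToBot('The sky', 'The sky is pretty cool.', 'Why do you like it?');
    }

    it('continues from the default state', () => {
        return advancePastGreeting(bot)
            .sendMessageToBot('It\'s blue', 'Interesting. Well, that\'s all I have for now')
            .runTest();
    });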

If you want to jump directly to a particular dialog, that could require putting some checks in the code to behave differently during test modes (which I wouldn't recommend).

Instead, if you break your dialogs down into their own exportable files, you could create a new bot, attach the relevant dialogs/flows, and mock the initial state to reach it. Think of it as unit testing a dialog by mocking the bot the dialog lives in.
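A minimal sketch of that idea, assuming botbuilder v3; likeDialog.js, the test file name, and the bot-tester require path are illustrative, not files from this repo:

    // likeDialog.js (hypothetical) -- the waterfall lives in its own module
    const { Prompts } = require('botbuilder');

    module.exports = [(session) => {
        Prompts.text(session, 'Hi there! Tell me something you like');
    }, (session, results) => {
        session.send(`${results.response} is pretty cool.`);
        session.endDialog();
    }];

    // likeDialog.test.js (hypothetical) -- mount the dialog on a throwaway bot
    const { UniversalBot, ConsoleConnector } = require('botbuilder');
    const { BotTester } = require('bot-tester'); // package name and export assumed
    const likeDialog = require('./likeDialog');

    it('tests the like dialog in isolation', () => {
        const bot = new UniversalBot(new ConsoleConnector().listen());
        bot.dialog('/', likeDialog); // the dialog under test becomes the root

        return new BotTester(bot)
            .sendMessageToBot('hi', 'Hi there! Tell me something you like')
            .sendMessageToBot('The sky', 'The sky is pretty cool.')
            .runTest();
    });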

Let me know if it helps! Check out the rest of the test files on the AgentHandoff project (simplify model branch). That project is why I made this framework. The cases are much more advanced and showcase the power of the BotTester framework.

@rjgmail88
Author

rjgmail88 commented Feb 22, 2018

@microsoftly, sorry, but I could not quite follow you. I think I wasn't clear in my question, so let me ask you about one of the test cases from your project. Your test case 'can handle multiple responses'
checks that if the user says Hola!, the bot sends two responses back: hello! and how are you doing?
For this test you wrote the dialog as part of the test, inside the it('.....') block. So, do you actually have a dialog in your project that responds like this? If yes, shouldn't we test that instead of re-writing the dialog code inside the it('.....') block and then testing it?

    // Test for multiple responses
    it('can handle multiple responses', () => {
        bot.dialog('/', (session) => {
            session.send('hello!');
            session.send('how are you doing?');
        });

        return new BotTester(bot)
            .sendMessageToBot('Hola!', 'hello!', 'how are you doing?')
            .runTest();
    });

@microsoftly
Owner

Ahh. I see.

As long as the dialogs are defined on the bot when you're running the tests, you'll be fine. I wrote the dialogs inline with the tests so it is clear what the expected responses are when looking at the README.

Alternatively, if you wanted to test just a single dialog (or a subset of all dialogs), you could implement your test(s) like the README does and create a new bot with those dialogs inside the tests. These tests would require more mocking, hacking, or abstraction to make each dialog functionally independent of the others so that it can be tested in isolation.

Does that clear things up?

@rjgmail88
Author

Yes, so you mean I should encapsulate the actual dialog and the unit test together?

@microsoftly
Owner

It depends on what you're looking to gain from the test.

If you want to test the dialog independent of any other bot functionality, I would follow a similar path to the tests in this repo's README (e.g. a bot defined in test scope with one or more relevant dialogs applied to it, one of them being the root).

If you want to test how your bot will actually run, you need to register all the dialogs to the bot and simulate the conversation up to the point you wish to test.

I would recommend looking at the former option as a unit test for a dialog and the latter as a component test for the bot/conversation. They both have their place in a proper testing strategy and they can complement each other, but there is a correct time and place for each. Whether either or both fit your needs is beyond what I can offer (at least with the info I have).
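As a rough sketch of the latter, component-style approach, assuming the production dialogs are registered through a single exported function (registerDialogs is a hypothetical name, and the expected replies just reuse the README example's prompts):

    const { UniversalBot, ConsoleConnector } = require('botbuilder');
    const { BotTester } = require('bot-tester'); // package name and export assumed
    const registerDialogs = require('./registerDialogs'); // hypothetical module

    it('walks the real conversation flow', () => {
        // Use the same registration code the production bot uses, so the test
        // exercises the real dialog wiring rather than a copy of it.
        const bot = new UniversalBot(new ConsoleConnector().listen());
        registerDialogs(bot);

        // Simulate the conversation from the root up to the point under test.
        return new BotTester(bot)
            .sendMessageToBot('Hola!', 'Hi there! Tell me something you like')
            .sendMessageToBot('The sky', 'The sky is pretty cool.', 'Why do you like it?')
            .runTest();
    });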

At the very least, I'd recommend making some simple bots and dialogs and testing your hypotheses on your own. If you do go down that path, I'd very much like to see the results, if you don't mind sharing.

@rjgmail88
Author

When you say,

.........you need to register all the dialogs to the bot and simulate the conversation up to the point you wish to test.

Do you mean to use the same bot object for testing and for the actual chat bot? I noticed the bot might have a different connector,
e.g. builder.ConsoleConnector() for the bot object we use for testing, and
new builder.ChatConnector() for the actual chat bot.

Is there an example I could copy?

@microsoftly
Owner

microsoftly commented Feb 23, 2018

Yes, I'm going to refer to the test file that most of the README is from.

The BotTester framework exports a TestConnector that you can use when creating bots, but you're under no obligation to do so. The BotTester framework would work with any connector; there may be unintended side effects if a ChatConnector is used, and it is significantly slower than a ConsoleConnector.
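For example, here is a sketch of keeping the connector choice at the edge so the same dialog registration runs under test and in production. The ./dialogs module, registerDialogs, and the environment variable names are assumptions for illustration:

    const builder = require('botbuilder');
    const registerDialogs = require('./dialogs'); // hypothetical shared module

    // In tests: an in-process connector is enough.
    const testBot = new builder.UniversalBot(new builder.ConsoleConnector().listen());
    registerDialogs(testBot);

    // In production: the ChatConnector that talks to the Bot Framework service.
    const prodBot = new builder.UniversalBot(new builder.ChatConnector({
        appId: process.env.MICROSOFT_APP_ID,
        appPassword: process.env.MICROSOFT_APP_PASSWORD
    }));
    registerDialogs(prodBot);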

I would recommend testing out your questions locally. You will get much faster feedback than responses from me.
