This repository has been archived by the owner on Sep 20, 2024. It is now read-only.

Conversations frequently end early when gap between replies is > 100ms #20

Closed
zachdunn opened this issue Dec 18, 2015 · 11 comments

Comments

@zachdunn

Bot conversations seem to end prematurely when dealing with multiple delayed responses. Here's an example using promises and synchronous timeouts to illustrate the point:

var Promise = require('bluebird');
//..
// Regular bot config, etc.
//..
controller.hears(['promise test'], ['direct_message'], function (bot, message) {
  // Get timeout duration from message:
  // e.g. `promise test 3000` for 3 seconds
  var timeoutDuration = parseInt(message.text.split('promise test')[1].trim()) || 0;

  bot.startConversation(message, function (err, convo) {
    convo.say('Testing ' + timeoutDuration + 'ms API response time');

    // Simulated API request
    var apiRequest = function () {
      return new Promise(function (resolve) {
        console.log('1. Making request');
        // Fake response time
        setTimeout(function () {
          console.log('2. Sending back the response');
          resolve();
        }, timeoutDuration);
      });
    };

    apiRequest().then(function () {
      console.log('3. API response received');
      convo.say('API response received');
    });

    setTimeout(function () {
      console.log('4. Synchronous timeout reached');
      convo.say('Synchronous timeout reached');
    }, timeoutDuration);
  });
});

The expected behavior is that the conversation continues until all of the convo.say calls have run. You can see the inconsistency in the conversation below:

[Screenshot, 2015-12-18 3:38 PM: conversation transcript showing the bot ending the conversation early]

The console logs always show up as expected, but when the subsequent replies lag by more than 100ms, the conversation ends early more than half the time. This feels like a race condition.

The Node logs show that when the conversation ends early, it ignores all but the first reply.

>   [Start]  8  Conversation with  [USER] in [CHANNEL]
1. Making request
2. Sending back the response
4. Synchronous timeout reached
3. API response received
>   [End]  8  Conversation with  [USER] in [CHANNEL]
[End]  8  Task for  [USER] in [CHANNEL]
[Start]  9  Task for  [USER] in [CHANNEL]
>   [Start]  9  Conversation with  [USER] in [CHANNEL]
1. Making request
>   [End]  9  Conversation with  [USER] in [CHANNEL]
[End]  9  Task for  [USER] in [CHANNEL]
2. Sending back the response
4. Synchronous timeout reached
3. API response received

Happy to be helpful with more info if needed.

@ChrisCinelli
Contributor

Are you hitting the 1 message/sec API limit?

@zachdunn
Author

I'm not sure that limit is in play here. You can see that the bot always replies to the message; the issue is that the conversation intermittently closes on the Node side (as the logs show) before the API response is sent.

Of course I may have just missed your point entirely.

@pseudozach

I face a similar issue: when I try to get some data from a datastore and then call convo.ask with that data, the conversation sometimes ends prematurely, while in other cases convo.ask succeeds. This has all the indications of a race condition.
What is the time limit for adding a new ask to a conversation before it ends?
How can we set it?
More importantly, how can we make sure the conversation doesn't end prematurely before an API request has completed?

@kylemac

kylemac commented Feb 2, 2016

+1 – I've encountered this unusual behavior as well. Any thoughts on what might be the culprit?

@zachdunn
Author

zachdunn commented Feb 2, 2016

We've found some stability improvements by wrapping the handler with Promisify. It's not a perfect solution, but it's one of the ways we've managed to keep things a bit more predictable until it's patched.

Here's a sample gist for the response handler:

'use strict';
const Promise = require('bluebird');

module.exports = function (bot, message) {
  Promise.promisifyAll(bot);
  bot.startConversationAsync(message).then(function (convo) {
    return Promise.fromCallback(function (callback) {
      // convo.ask hands back (response, convo); package them into an array
      // so .spread() can unpack them below.
      return convo.ask('What did you need?', function (a, b) {
        callback(null, [a, b]);
      });
    }).spread(function (response, convo) {
      convo.say(`Let's see what I can do for *"${response.text}"*`);
      convo.next();
      return new Promise((resolve) => {
        convo.ask(`Found something. Do you want it?`, [
          {
            pattern: bot.utterances.yes,
            callback: (response, convo) => {
              convo.say(`Great. It's all yours.`);
              resolve();
            }
          },
          {
            pattern: bot.utterances.no,
            callback: (response, convo) => {
              convo.say('OK, nevermind.');
              resolve();
            }
          },
          {
            default: true,
            callback: (response, convo) => {
              convo.repeat();
            }
          }
        ]);
      });
    }).finally(function () {
      convo.next();
    });
  }).catch(function (err) {
    bot.reply(message, err.toString());
  });
};

This is not a magic wand, and there's still some kind of race condition in play here.

@kylemac

kylemac commented Feb 2, 2016

💋 @zachdunn

@benbrown
Contributor

benbrown commented Feb 2, 2016

The issue here is that conversations proceed, at roughly 1 message per second, until there are no messages left in the queue.

If you do something asynchronous while the message queue is processing as normal, it may end prematurely because you have not yet pushed another message onto the queue.

The current best way to handle this is to not call convo.next() until after your asynchronous action has finished.
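
Something along these lines, for example (untested sketch; apiRequest() is just a stand-in for whatever asynchronous work you are doing):

controller.hears(['promise test'], ['direct_message'], function (bot, message) {
  bot.startConversation(message, function (err, convo) {
    convo.ask('Make the request?', function (response, convo) {
      // Hold the conversation open by delaying convo.next() until the
      // asynchronous work has finished and its reply has been queued.
      apiRequest().then(function () {
        convo.say('API response received');
        convo.next(); // only now let the queue advance
      });
    });
  });
});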

Another approach would be to do all the asynchronous actions before starting the conversation.
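
And the second approach, sketched the same way (again, apiRequest() is just a placeholder):

controller.hears(['promise test'], ['direct_message'], function (bot, message) {
  // Finish the asynchronous work first, then start the conversation,
  // so every convo.say() is pushed onto the queue immediately.
  apiRequest().then(function () {
    bot.startConversation(message, function (err, convo) {
      convo.say('API response received');
    });
  });
});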

@zachdunn
Author

zachdunn commented Feb 2, 2016

> If you do something asynchronous while the message queue is processing as normal, it may end prematurely because you have not yet pushed another message onto the queue.

Suddenly #20 (comment) makes a lot more sense. Is there a way to disable/adjust this timeout on conversations? Seems like there's a parallel to KeepAlive in server configurations.

Or was that achieved by your convo.next() comment?

Reason I ask -- you can imagine a bot where the workflow isn't as predictable, and the ability to specify how quickly a conversation expires (or whether it ever does without explicit commands) is needed:

1. User says something like "Let's search for hats"
2. Conversation starts
3. Bot switches into a context that's primed for hats: "What type of hat did you want?"
4. User says "baseball hats"
5. Bot fetches results from whatever leading hat-based API exists out there
6. Riveting hat conversation continues. User can ask for more information (more API calls)
7. User is done, says "OK we're done"
8. Exit hat context
9. End conversation

Happy to be wildly off base here. Would make life much simpler. 😀

@pseudozach

@benbrown I've tried both increasing the tick time from 1 sec to 2 sec and calling convo.next() inside my callback function, but the behavior is still erratic.
I think I'll try the promisify method or use some other workaround until I optimize my datastore response time.

@zachdunn you can increase the tick time to 1.5 seconds, but this still does not guarantee your call will be done before the next tick. Also, your conversation will now flow 50% slower overall.

@darrenparkinson
Contributor

It's a shame this is closed, as it seems it's still a problem. I'm doing a convo.say, then mapping over an array to make an API request for each item to build a response, and the conversation ends before I get the next response out. This happens even if there is only a single item in the array. I'm only doing this because bot.reply messages aren't guaranteed to be sent in order.

@benbrown
Contributor

@darrenparkinson the solution to your issue is to do those asynchronous calls outside of the conversation: either use createConversation, do the async work, then call convo.activate(), or do it in a convo.ask callback before calling convo.next.
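
Roughly like this (untested sketch; the trigger phrase, items array, and fetchItem() helper are placeholders for your own code):

controller.hears(['build report'], ['direct_message'], function (bot, message) {
  var items = ['first', 'second']; // whatever you are mapping over

  // Create the conversation but don't activate it yet, so the message
  // queue isn't being drained while the API requests are in flight.
  bot.createConversation(message, function (err, convo) {
    Promise.all(items.map(fetchItem)).then(function (results) {
      results.forEach(function (result) {
        convo.say('Result: ' + result);
      });
      convo.activate(); // now start delivering the queued messages
    });
  });
});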
