Conversation

@danbarbarito danbarbarito commented Apr 10, 2025

Description

The LLM returns the bot action inconsistently. When it returns something in a format the system does not expect, a Colang parsing error is raised and the application crashes.

Some examples of bot actions that were returned to me:

  • bot responds with "You can call the number at 123-456-7891"
  • bot says You can call the number at 123-456-7891
  • bot provides information on "You can call the number at 123-456-7891"

All of these caused the bot to error out and crash.

These changes ensure that the bot action is a `bot say` in the proper format. There may be some edge cases I am missing, but these changes have made my application much more stable. I have been testing it for the past 30 minutes or so and have yet to see any Colang parsing errors.
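For illustration, the normalization this PR describes could be sketched roughly like the snippet below: coerce the free-form action verbs the LLM emits (`responds with`, `says`, `provides information on`, …) into the canonical `bot say "..."` form that the Colang parser expects. The function name, verb list, and regex here are my assumptions for the sketch, not the actual patch.

```python
import re

# Verb variants observed in the examples above; the real fix may cover more.
_VERB_PATTERN = re.compile(
    r'^bot\s+(?:say|says|respond(?:s)?\s+with|provides?\s+information\s+on)\s+(.*)$',
    re.IGNORECASE,
)

def normalize_bot_action(action: str) -> str:
    """Rewrite an LLM-emitted bot action as `bot say "<message>"`."""
    match = _VERB_PATTERN.match(action.strip())
    if not match:
        # Leave actions we don't recognize untouched.
        return action
    message = match.group(1).strip()
    # Strip surrounding quotes, if any, so we can re-quote uniformly.
    if len(message) >= 2 and message[0] == message[-1] == '"':
        message = message[1:-1]
    return f'bot say "{message}"'
```

With this, all three problematic examples above collapse to the same well-formed `bot say "You can call the number at 123-456-7891"` action instead of tripping the parser.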

Related Issue(s)

Checklist

  • I've read the CONTRIBUTING guidelines.
  • I've updated the documentation if applicable.
  • I've added tests if applicable.
  • @mentions of the person or team responsible for reviewing proposed changes.

Development

Successfully merging this pull request may close these issues.

bug: Getting an error when answering questions using KB - Unexpected token Token('$END', '') at line 2, column 87
