
Report in frontend the specific server error #61

Closed
mallibus opened this issue Mar 27, 2023 · 3 comments
Labels
enhancement (New feature or request), frontend

Comments

@mallibus

After playing with the cat for a while, it looks like the context memory gets full.
The frontend message is

"Something went wrong while sending your message. Please try refreshing the page"

but the backend log reports

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 12341 tokens. Please reduce the length of the messages.

I am not yet familiar enough with the code to propose a pull request, but from the user's perspective my suggestions are:

  1. Make the frontend message more explicit (e.g. "The context memory exceeded its maximum size, please remove something"), so the user can then act on it in some way, e.g. via the endpoint mentioned here.
  2. Inform the user that the memory limit was exceeded, then automatically remove the oldest memories and retry the query. The criterion for removing memories could also be flexible, e.g. remove the oldest ones, or remove the ones closest to a given token (a rough sketch follows below).
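
For illustration only, option 2 could look something like the sketch below: drop the oldest messages until the prompt fits a token budget. It assumes tiktoken is installed and OpenAI-style message dicts with "role"/"content" keys; the function name and the 512-token headroom are placeholders, not the Cat's actual internals.

```python
import tiktoken

MAX_CONTEXT_TOKENS = 4097
REPLY_HEADROOM = 512  # leave room for the model's answer (placeholder value)

def trim_history(messages, model="gpt-3.5-turbo"):
    """Drop the oldest non-system messages until the prompt fits the budget."""
    enc = tiktoken.encoding_for_model(model)

    def prompt_tokens(msgs):
        # Rough count: content tokens only, ignoring per-message overhead.
        return sum(len(enc.encode(m["content"])) for m in msgs)

    trimmed = list(messages)
    while prompt_tokens(trimmed) > MAX_CONTEXT_TOKENS - REPLY_HEADROOM:
        # Remove the oldest non-system message first.
        for i, m in enumerate(trimmed):
            if m["role"] != "system":
                del trimmed[i]
                break
        else:
            break  # only system messages left, nothing more to drop
    return trimmed
```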
@pieroit
Member

pieroit commented Mar 27, 2023

Thanks for flagging this, @mallibus!
We plan to contain prompt size by using summarization (issue #16).
Hopefully memory management will be customizable via plugin, so you can do what you want.

I'm leaving this issue open and editing the title for your second point: better error information.

@pieroit pieroit changed the title Manage context length limit Reporto in frontend the specific server error Mar 27, 2023
@pieroit pieroit changed the title Reporto in frontend the specific server error Report in frontend the specific server error Mar 27, 2023
@pieroit pieroit added enhancement New feature or request frontend backend labels Mar 27, 2023
@pieroit pieroit removed the backend label Jun 5, 2023
@pieroit
Member

pieroit commented Jul 15, 2023

Fixed: the core now sends the error name and description via WebSocket as well.
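
For illustration, a minimal client that surfaces these details could look like the following sketch; the endpoint URL and the "type"/"name"/"description" field names are assumptions for the example, not the documented message schema.

```python
import asyncio
import json

import websockets  # pip install websockets

async def listen(url="ws://localhost:1865/ws"):
    async with websockets.connect(url) as ws:
        async for raw in ws:
            msg = json.loads(raw)
            if msg.get("type") == "error":
                # Show the specific server error instead of a generic failure message.
                print(f"Server error: {msg.get('name')}: {msg.get('description')}")
            else:
                print(msg.get("content", raw))

if __name__ == "__main__":
    asyncio.run(listen())
```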

@pieroit pieroit closed this as completed Jul 15, 2023
@pieroit
Member

pieroit commented Jul 15, 2023

The Admin already shows this info.
