
Context too Large #45

Open
slyticoon opened this issue Dec 10, 2023 · 4 comments

Comments

@slyticoon

Hello,
Thank you for creating this integration. It has been an excellent addition to my smart home.

However, I am experiencing an issue where requests to OpenAI exceed my per-request usage limit of 10,000 tokens. I know that my limits will go up the longer I use OpenAI's API, but I think it would be a nice feature to either:

  1. Be able to limit the size of the Messages object sent in a request (currently, mine is almost 4000 tokens in size, which locks me out after a few requests). Perhaps by limiting the number of messages permitted to exist in the history.

  2. Have a way to clear that messages object manually when this occurs.
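Option 1 above could be sketched roughly as follows. This is a minimal illustration, not the integration's actual code: the `MAX_HISTORY_MESSAGES` limit, the `trim_history` helper, and the assumption that the first message is a system prompt are all hypothetical.

```python
MAX_HISTORY_MESSAGES = 10  # hypothetical configurable limit


def trim_history(messages):
    """Keep the system prompt (if any) plus only the most recent messages.

    `messages` is assumed to be a list of {"role": ..., "content": ...}
    dicts in the OpenAI chat format.
    """
    if len(messages) <= MAX_HISTORY_MESSAGES:
        return messages
    # Preserve a leading system prompt, drop the oldest conversation turns.
    system = messages[:1] if messages and messages[0]["role"] == "system" else []
    keep = MAX_HISTORY_MESSAGES - len(system)
    return system + messages[len(messages) - keep:]
```

With a limit of 10, a 20-message history (one system prompt plus 19 turns) would be reduced to the system prompt plus the 9 most recent turns before each request.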

I am not sure how Home Assistant tracks the conversation_id that determines whether the message history is inserted into the messages object, but my experience is as follows:

Closing the voice assistant on a mobile device does not change the conversation ID as far as your Python program is concerned; however, it does remove the message history from the GUI. Yet when I make a new request, I see that the messages object is still almost 4,000 tokens.

Thanks again for leading the development on this integration.

@jekalmin
Owner

jekalmin commented Dec 11, 2023

Thanks for reporting an issue!

Closing the voice assistant on a mobile device does not change the conversation ID as far as your Python program is concerned; however, it does remove the message history from the GUI. Yet when I make a new request, I see that the messages object is still almost 4,000 tokens.

This is strange. On my phone, the conversation ID changes when the message history is removed from the GUI.
Does anyone experience the same?

However, I am experiencing an issue where requests to OpenAI exceed my per-request usage limit of 10,000 tokens. I know that my limits will go up the longer I use OpenAI's API, but I think it would be a nice feature to either:

  1. Be able to limit the size of the Messages object sent in a request (currently, mine is almost 4000 tokens in size, which locks me out after a few requests). Perhaps by limiting the number of messages permitted to exist in the history.
  2. Have a way to clear that messages object manually when this occurs.

As you mentioned, there should be a way to limit request tokens and to clear messages.
In addition to limiting token size, it would be better, I think, to add an option to remove old messages when the history exceeds the token limit, effectively rotating the message list. (I'm not sure whether reducing messages by having GPT summarize the message history is needed.)

Some experiments are needed to work out how to compute the request token size, which will take time.
I'm not sure yet how to clear messages, but what I can think of now is providing a function or service.

Although it will take some time to get done, this is a feature the component should support.
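The "remove old messages when the token limit is exceeded" idea above could be sketched like this. Everything here is an assumption for illustration: real token counting would use OpenAI's tiktoken library, whereas the rough "~4 characters per token" heuristic, the per-message overhead, and the `trim_to_budget` helper are invented for this sketch.

```python
def estimate_tokens(messages):
    """Very rough token estimate: ~4 characters per token, plus a small
    assumed overhead per message for role/formatting metadata."""
    per_message_overhead = 4  # assumed, not an exact OpenAI figure
    total = 0
    for m in messages:
        total += per_message_overhead + len(m.get("content", "")) // 4
    return total


def trim_to_budget(messages, max_tokens=4000):
    """Drop the oldest non-system messages until the estimate fits the
    budget, i.e. the 'circulating messages' behavior described above."""
    msgs = list(messages)
    while estimate_tokens(msgs) > max_tokens and len(msgs) > 1:
        # Index 1 skips a leading system prompt, if present.
        drop_at = 1 if msgs[0].get("role") == "system" else 0
        msgs.pop(drop_at)
    return msgs
```

A character-based estimate is deliberately conservative and imprecise; an accurate implementation would encode each message with the model's tokenizer before summing.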

@sanderkooger

I think two separate solutions are needed. One is for a conversation whose token size is too large: in that case, a shorter message history needs to be compiled before sending it to GPT.

The other is an option to clear messages, both to prevent previous context from affecting new queries and to save money.

@sanderkooger

And hi @jekalmin, I am really enjoying your work!

@jekalmin
Owner

I just released 1.0.2-beta2.

For now, I have only added the message-clearing feature.
Please try it and give feedback.

