
[Feature Request] Over 2k token message context support? #21

Closed

imesha10 opened this issue Sep 2, 2023 · 2 comments

Comments

imesha10 commented Sep 2, 2023

I've noticed that on poe.com, if I send a very large text (over 2000 tokens, I think), the Poe bot does not respond on the website.

My question and feature request: does this project support over 2000 tokens of context on GPT-4? (I think I already tried and it failed with this repo.) If not, would it be possible to support it?

snowby666 (Owner) commented

It depends on the text, to be honest. Sometimes the model can understand the given context and generate a sensible response even if the text is over 2000 tokens. Other times, though, the model may struggle to understand the context, especially if the text is very long or uses complex language. In those cases it may not generate a response that accurately addresses the prompt.

snowby666 (Owner) commented

And the token limit on poe.com is only approximate, since the devs added a pre-prompt engineering step before sending the request.
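As a workaround on the client side, one common approach is to split a long message into chunks that each stay under the limit and send them one at a time. The sketch below is not part of this repo's API; the function names, the ~2000-token limit, and the rough "4 characters per token" estimate are all assumptions for illustration:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: assumes ~4 characters per token on average."""
    return max(1, len(text) // 4)

def split_into_chunks(text: str, max_tokens: int = 2000) -> list[str]:
    """Greedily pack paragraphs into chunks that stay under max_tokens."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        candidate = (current + "\n\n" + para) if current else para
        if estimate_tokens(candidate) <= max_tokens:
            current = candidate
        else:
            if current:
                chunks.append(current)
            # a single paragraph larger than the limit is kept whole here;
            # a real implementation might split it further on sentences
            current = para
    if current:
        chunks.append(current)
    return chunks

# build a ~12.5k-character text of 5 paragraphs, then chunk it
long_text = "\n\n".join("paragraph %d: " % i + "word " * 500 for i in range(5))
chunks = split_into_chunks(long_text, max_tokens=2000)
print(len(chunks), all(estimate_tokens(c) <= 2000 for c in chunks))
```

Each chunk could then be sent as a separate message, though whether the bot keeps enough conversational memory across chunks would still depend on the server side.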

@imesha10 imesha10 closed this as completed Sep 6, 2023