Paper length exceeds maximum context length #21

Open
zhaozewang opened this issue May 22, 2023 · 3 comments

@zhaozewang

Got this error:

```
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 7328 tokens. Please reduce the length of the messages.
```

Is it possible to break the paper into multiple pieces and then query one piece at a time to avoid this issue?
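
For anyone who wants to try that, here's a minimal sketch of the chunking idea, assuming the pre-1.0 `openai` Python client (matching the `openai.error` traceback above) and `tiktoken` for token counting; the chunk size and prompt wording are illustrative, not the project's actual interface:

```python
import openai
import tiktoken

def chunk_by_tokens(text, model="gpt-3.5-turbo", chunk_tokens=3000):
    """Split text into pieces of at most chunk_tokens tokens each."""
    enc = tiktoken.encoding_for_model(model)
    tokens = enc.encode(text)
    return [enc.decode(tokens[i:i + chunk_tokens])
            for i in range(0, len(tokens), chunk_tokens)]

def query_in_pieces(paper_text, question, model="gpt-3.5-turbo"):
    """Ask the same question of each chunk and collect per-chunk answers."""
    answers = []
    for piece in chunk_by_tokens(paper_text, model):
        resp = openai.ChatCompletion.create(
            model=model,
            messages=[{"role": "user",
                       "content": f"{question}\n\nPaper excerpt:\n{piece}"}],
        )
        answers.append(resp["choices"][0]["message"]["content"])
    return answers
```

A chunk size of 3000 leaves headroom under the 4097-token limit for the question and the model's reply.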

@zhaozewang (Author)

The issue seems to be caused by the context length, not the paper length. Here's a similar problem: langchain-ai/langchain#2133.
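
One way to see the distinction is to count the tokens in the full message payload (prompt template plus paper text) rather than the paper alone; a rough sketch with `tiktoken` (a rough count: the chat format adds a few tokens of per-message overhead on top of this):

```python
import tiktoken

def count_message_tokens(messages, model="gpt-3.5-turbo"):
    # Sums content tokens only; actual usage is slightly higher
    # because each chat message carries formatting overhead.
    enc = tiktoken.encoding_for_model(model)
    return sum(len(enc.encode(m["content"])) for m in messages)
```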

@cc-zehao

Same question here.

@Chapoly1305 commented Oct 9, 2023

@cc-zehao @zhaozewang

You may consider modifying your "model_interface.py" to use gpt-4-32k instead. After making that change, it could process much longer papers than with gpt-3.5.

Available options:
gpt-3.5-turbo, gpt-4, and gpt-4-32k


REF:
https://platform.openai.com/docs/models/continuous-model-upgrades
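
For anyone else making the change: it should amount to swapping the model name wherever model_interface.py creates the chat completion. A sketch assuming the pre-1.0 openai client; the actual call site in the project may differ:

```python
import openai

# gpt-4-32k raises the context window from 4,097 tokens
# (gpt-3.5-turbo) to 32,768; access depends on your account.
response = openai.ChatCompletion.create(
    model="gpt-4-32k",  # was "gpt-3.5-turbo"
    messages=[{"role": "user", "content": "..."}],
)
```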
