
[bug] I get this error message every time, please help #15

Closed
cvenwu opened this issue Dec 16, 2022 · 2 comments

Comments

cvenwu commented Dec 16, 2022

time="2022-12-16T22:21:06+08:00" level=info msg="{\n  \"error\": {\n    \"message\": \"This model's maximum context length is 4097 tokens, however you requested 4182 tokens (182 in your prompt; 4000 for the completion). Please reduce your prompt; or completion length.\",\n    \"type\": \"invalid_request_error\",\n    \"param\": null,\n    \"code\": null\n  }\n}\n"

I often send a short piece of text and get this error back, even though the prompt is nowhere near 4000 tokens. What is going on?
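The arithmetic in the log explains why even a short prompt fails: the model's context window covers the prompt and the completion together, and the request fixes the completion budget at 4000 tokens. A minimal sketch of that check (the 4097/182/4000 numbers are taken from the error message; this is not the project's actual code):

```python
MAX_CONTEXT = 4097  # context window reported in the error message

def request_fits(prompt_tokens: int, max_tokens: int) -> bool:
    """Return True if prompt + requested completion fit in the context window."""
    return prompt_tokens + max_tokens <= MAX_CONTEXT

# The failing request from the log: 182 + 4000 = 4182 > 4097
print(request_fits(182, 4000))  # False
```

So with `max_tokens` hard-coded to 4000, any prompt longer than 97 tokens makes the request exceed the window, regardless of how short the message feels.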

cvenwu commented Dec 16, 2022

time="2022-12-16T22:29:42+08:00" level=info msg="{\n  \"error\": {\n    \"message\": \"This model's maximum context length is 4097 tokens, however you requested 4375 tokens (375 in your prompt; 4000 for the completion). Please reduce your prompt; or completion length.\",\n    \"type\": \"invalid_request_error\",\n    \"param\": null,\n    \"code\": null\n  }\n}\n"


houko commented Dec 17, 2022

The API limits a single reply to at most 4000 tokens; going over that triggers the error.
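One common workaround is to shrink the completion budget to whatever room is left in the context window instead of always requesting 4000 tokens. A sketch under that assumption (function and parameter names are illustrative, not from this repository):

```python
MAX_CONTEXT = 4097  # context window from the error message

def clamp_max_tokens(prompt_tokens: int, requested: int = 4000,
                     floor: int = 16) -> int:
    """Shrink the completion budget so prompt + completion stay in the window.

    prompt_tokens: tokens already consumed by the prompt
    requested:     the completion budget you would like (default 4000)
    floor:         minimum budget to still get a usable reply
    """
    available = MAX_CONTEXT - prompt_tokens
    return max(floor, min(requested, available))

# The two failing requests from the logs would be clamped to fit:
print(clamp_max_tokens(182))  # 3915
print(clamp_max_tokens(375))  # 3722
```

Counting `prompt_tokens` accurately requires the model's tokenizer; a rough character-based estimate also works if you leave a safety margin.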

@houko houko closed this as completed Dec 17, 2022