forked from Significant-Gravitas/AutoGPT
Possible workaround for Significant-Gravitas#796
Harun Esur committed on Apr 14, 2023
1 parent 98efd26 · commit c36f44f
Showing 2 changed files with 47 additions and 1 deletion.
```diff
@@ -18,3 +18,4 @@ Pillow
 coverage
 flake8
 numpy
+tenacity
```
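The added dependency, tenacity, is a general-purpose retrying library, so the workaround presumably wraps the failing OpenAI calls in a retry-with-backoff decorator. The commit's actual call sites are not shown here; as a rough illustration of the pattern, here is a minimal stdlib sketch of the same retry-with-randomized-exponential-backoff behavior (all names are hypothetical, not taken from the commit):

```python
import random
import time
from functools import wraps


def retry_with_backoff(max_attempts=6, base_delay=1.0, max_delay=60.0,
                       retry_on=(Exception,)):
    """Retry a function with randomized exponential backoff.

    A hand-rolled stand-in for the kind of decorator tenacity provides
    (retry a call a bounded number of times, sleeping longer after each
    failure). Parameter names are illustrative only.
    """
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except retry_on:
                    if attempt == max_attempts:
                        raise  # out of attempts: surface the last error
                    # cap the exponential delay, then jitter it
                    delay = min(max_delay, base_delay * 2 ** (attempt - 1))
                    time.sleep(random.uniform(0, delay))
        return wrapper
    return decorator
```

This helps with transient failures such as rate limits or timeouts, which is presumably the class of error the referenced issue describes.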
Didn't work for me. I still get the same error message when trying to let Auto-GPT read larger files:

```
openai.error.InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 216008 tokens (216008 in your prompt; 0 for the completion). Please reduce your prompt; or completion length.
```

(Applied this to the current master branch.)
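This failure mode is consistent with the comment: retrying cannot fix a request that is itself larger than the model's context window, so the oversized file content would need to be split before it is sent. A hedged sketch of character-based chunking (hypothetical helper, not from the commit; the chars-per-token ratio is a rough heuristic, and a real implementation would count tokens with an actual tokenizer such as tiktoken):

```python
def chunk_text(text: str, max_tokens: int = 8191,
               chars_per_token: int = 4) -> list[str]:
    """Split text into pieces that should fit within a token budget.

    Approximates token count as len(text) / chars_per_token, which is
    only a heuristic for English text; each chunk would then be sent in
    its own API request instead of one oversized prompt.
    """
    max_chars = max_tokens * chars_per_token
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
```

For example, a 216008-token file would come back as ~27 chunks at an 8191-token budget, each small enough to submit separately.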