
Increasing token limit breaks co-pilot #1

Closed
noahevers97 opened this issue Apr 25, 2023 · 7 comments
@noahevers97

When I increase the token limit of co-pilot above 1000, I get no response from ChatGPT, even after a restart. Setting the token limit below 1000 fixes the issue.

@logancyang
Owner

Thanks for reporting the issue @noahevers97! Hmm, I can't seem to reproduce this. I tried 2000 for both gpt-3.5 and gpt-4, and they are working fine for me. Do you see any console errors?

@spiritualgeek

I had mine set to 5000 and had to go back to 1000; otherwise it wouldn't work, complaining about credentials/API key. I love this concept of interacting with GPT. Thanks for the plugin!

@logancyang
Owner

> I had mine set to 5000 and had to go back to 1000 otherwise it wouldn't work, complaining about credentials/API Key. I love this concept of interacting with GPT. Thanks for the plugin!

Thanks for trying it out! Sorry about the confusing error message; that API key complaint currently covers all OpenAI errors. If you open your console, you will see the true error message, which is quite long. I suspect the 5000 token limit exceeds what is allowed for the model you are using. Feel free to copy and paste the console error message here and I can take a look.

I will add better error messages soon.
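One way to surface the true cause instead of a blanket credentials complaint is to inspect the HTTP status and the JSON error body OpenAI returns. Below is a minimal sketch of that idea; the helper name `describeOpenAIError` and the message wording are hypothetical, not the plugin's actual code, though the `{ error: { message, code, type } }` body shape follows OpenAI's documented error format.

```typescript
// Hypothetical helper: translate an OpenAI API failure into a specific
// user-facing message instead of always blaming the API key.

interface OpenAIErrorBody {
  error?: { message?: string; code?: string | null; type?: string };
}

function describeOpenAIError(status: number, body: OpenAIErrorBody): string {
  const msg = body.error?.message ?? "";
  if (status === 401) return "Invalid or missing API key.";
  if (status === 429) return "Rate limit or quota exceeded.";
  // A 400 mentioning context length is the token-limit case from this issue.
  if (status === 400 && /maximum context length|max_tokens/i.test(msg)) {
    return "Token limit is too high for this model. Lower it in settings.";
  }
  return msg || `OpenAI request failed with status ${status}.`;
}
```

With this in place, a 400 caused by an oversized token limit would tell the user to lower the setting rather than re-check their key.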

@spiritualgeek

api.openai.com/v1/chat/completions:1     Failed to load resource: the server responded with a status of 400 ()
plugin:copilot:37818 Error in streamManager.streamSSE: CustomEvent
getAIResponse @ plugin:copilot:37818

@logancyang
Owner

Basically, the issue is that the model you picked does not support a token limit as big as 5000.

I added a better error message like this; I hope it helps:

[Screenshot of the new error message, 2023-05-10]

In the meantime, please keep OpenAI's limitations in mind. You can find the API token limits in their docs: https://platform.openai.com/docs/models/overview

I'm planning to support unlimited context in the near future, please stay tuned!
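A plugin could also guard against this case up front by clamping the user's token limit to the model's context window before sending the request. The sketch below is illustrative, not the plugin's actual code; the limits reflect OpenAI's model docs around the time of this thread (gpt-3.5-turbo: 4,096 tokens; gpt-4: 8,192), and the function name is hypothetical.

```typescript
// Assumed context-window sizes per model (per OpenAI's 2023 model docs).
const MODEL_CONTEXT_LIMITS: Record<string, number> = {
  "gpt-3.5-turbo": 4096,
  "gpt-4": 8192,
};

// Clamp a user-configured token limit to the model's maximum, so a setting
// like 5000 on gpt-3.5-turbo never reaches the API as an invalid request.
function clampTokenLimit(model: string, requested: number): number {
  const max = MODEL_CONTEXT_LIMITS[model];
  if (max === undefined) return requested; // unknown model: pass through
  return Math.min(requested, max);
}
```

For example, `clampTokenLimit("gpt-3.5-turbo", 5000)` would send 4096 instead of triggering the 400 error seen above.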

@logancyang
Owner

Closing this one for now, please open a new issue with the new error messages if you still encounter problems.

@spiritualgeek

Thank you for the info. Appreciated!
