Return BadRequest when using gpt-35-turbo model #140
Comments
Hi @vincilee2! Thank you for your message! So we currently don't support gpt-35-turbo because that is a ChatCompletion model and has a different API than the normal TextCompletion ones like text-davinci-003. But we know this is high priority and we're working on bringing it in!
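To make that difference concrete, here is a rough sketch of the two request shapes against the Azure OpenAI REST API (illustration only, not Semantic Kernel code; deployment names, endpoint, and api-version are placeholders):

```python
# TextCompletion models such as text-davinci-003 take a flat prompt:
text_completion_request = {
    "prompt": "Give me the TLDR in 5 words: ...",
    "max_tokens": 100,
}
# POST {endpoint}/openai/deployments/{text-deployment}/completions?api-version=...

# ChatCompletion models such as gpt-35-turbo take role-tagged messages:
chat_completion_request = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Give me the TLDR in 5 words: ..."},
    ],
    "max_tokens": 100,
}
# POST {endpoint}/openai/deployments/{chat-deployment}/chat/completions?api-version=...
```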
We just landed #161! @vincilee2 I'd check that out to get gpt-35-turbo (ChatGPT) working for you. Closing this issue now, but feel free to raise it again if it doesn't work!
Can anyone teach me how to use gpt-35-turbo?
Hi @zijeibo64270! I'd refer to this example here for how to use the gpt-35-turbo model. There will also be another notebook example with this PR: #258
Describe the bug
Following the same code as in the README but using the gpt-35-turbo model, Azure OpenAI returned a bad request. The reason is that best_of is not supported by the gpt-35-turbo model.
To Reproduce
Run the code below, using your own Azure OpenAI endpoint and API key.
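(The original README snippet is not reproduced here. As an illustration only, the same BadRequest can be triggered with a direct call to the Azure OpenAI completions endpoint when best_of is included in the payload — a minimal sketch; the endpoint, deployment name, key, and api-version below are placeholders.)

```python
import requests

endpoint = "https://<your-resource>.openai.azure.com"  # placeholder
deployment = "gpt-35-turbo"                            # placeholder deployment name
api_key = "<your-azure-openai-key>"                    # placeholder

# Including best_of in a completions request against a gpt-35-turbo deployment
# is rejected by Azure OpenAI with a 400 BadRequest, matching the error below.
response = requests.post(
    f"{endpoint}/openai/deployments/{deployment}/completions",
    params={"api-version": "2022-12-01"},
    headers={"api-key": api_key, "Content-Type": "application/json"},
    json={"prompt": "Hello", "max_tokens": 16, "best_of": 1},
)
print(response.status_code)  # expected: 400
print(response.text)         # error message like the one under "Additional context"
```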
Expected behavior
The request should return a 200 status code.
Additional context
{"error":{"code":"BadRequest","message":"logprobs, best_of and echo parameters are not available on gpt-35-turbo model. Please remove the parameter and try again. For more details, see Azure OpenAI Service REST API reference