
NotFoundError for OpenAI with gpt-3.5-turbo model #20441

Closed
shashankdeshpande opened this issue Apr 14, 2024 · 4 comments
Labels
🤖:bug Related to a bug, vulnerability, unexpected error with an existing feature 🔌: openai Primarily related to OpenAI integrations

Comments

@shashankdeshpande (Contributor)

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

Getting an error for this code:

from langchain_openai import OpenAI
llm = OpenAI(model='gpt-3.5-turbo', temperature=0, streaming=True)
llm('how are you?')

Error Message and Stack Trace (if applicable)

NotFoundError: Error code: 404 - {'error': {'message': 'This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?', 'type': 'invalid_request_error', 'param': 'model', 'code': None}}

Description

I am getting the above error when configuring the gpt-3.5-turbo model with OpenAI.
However, this model works as expected with ChatOpenAI.
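For reference, a minimal sketch of the working ChatOpenAI equivalent (assuming OPENAI_API_KEY is set in the environment):

from langchain_openai import ChatOpenAI

chat = ChatOpenAI(model='gpt-3.5-turbo', temperature=0, streaming=True)
chat.invoke('how are you?')  # routed to v1/chat/completions, so no NotFoundError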

On the other hand, the gpt-3.5-turbo-instruct model also works as expected with OpenAI; the code is below:

from langchain_openai import OpenAI
llm = OpenAI(model='gpt-3.5-turbo-instruct',temperature=0, streaming=True)
llm('how are you?')

Here is a screenshot for reference: Screenshot 2024-04-15 at 12 02 51 AM

I believe this issue is caused by configuring a non-supported model with OpenAI instead of ChatOpenAI.

Observation 🔍
I referred to the codebase of the openai python package and observed that there is a set of models that only supports v1/chat/completions (ChatOpenAI as implemented within LangChain). Check these files for more details -

Potential Fix 🤔
Should we validate the model name against that same list when handling parameters for OpenAI, and raise an error accordingly?

I can work on this; please check and let me know.
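For illustration, a rough sketch of what such a check could look like; the prefix list here is a placeholder, not the actual list maintained in the openai package:

# Hypothetical validation helper for the completions-style OpenAI wrapper.
# The prefix list below is illustrative only.
CHAT_ONLY_MODEL_PREFIXES = ('gpt-3.5-turbo', 'gpt-4')

def validate_completions_model(model_name: str) -> None:
    # Instruct variants (e.g. gpt-3.5-turbo-instruct) still support v1/completions.
    if model_name.endswith('-instruct'):
        return
    if model_name.startswith(CHAT_ONLY_MODEL_PREFIXES):
        raise ValueError(
            f"{model_name!r} is a chat model; use ChatOpenAI from langchain_openai instead of OpenAI."
        )

validate_completions_model('gpt-3.5-turbo-instruct')  # passes
validate_completions_model('gpt-3.5-turbo')           # raises ValueError with a clear message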

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 23.4.0: Fri Mar 15 00:11:05 PDT 2024; root:xnu-10063.101.17~1/RELEASE_X86_64
Python Version: 3.11.8 (main, Feb 26 2024, 15:43:17) [Clang 14.0.6 ]

Package Information

langchain_core: 0.1.42
langchain: 0.1.16
langchain_community: 0.0.32
langsmith: 0.1.47
langchain_openai: 0.1.3
langchain_text_splitters: 0.0.1
langchainhub: 0.1.15
openai: 1.17.0

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

@shashankdeshpande (Contributor, Author)

@hwchase17 @baskaryan @efriis Can you please check this?

@SzymonSt

Bump. I have the same problem with the latest langchain-openai==0.1.3.

@baskaryan (Collaborator)

gpt-3.5 and gpt-4 are chat models, so you must use from langchain_openai import ChatOpenAI. See https://python.langchain.com/docs/integrations/chat/openai/ for more.

@shashankdeshpande (Contributor, Author)

@baskaryan Yeah, I got this. However, the problem is the lack of documentation about which models are chat models, which can cause confusion. It might be beneficial to either clarify this in the documentation or implement an explicit error message. What are your thoughts on this?
