
ChatGoogleGenerativeAI parameters are inconsistent with other LLM providers #91

Closed
AmgadHasan opened this issue Mar 26, 2024 · 1 comment

Comments

@AmgadHasan

Hi.

For ChatGoogleGenerativeAI, we pass the following parameters:

    from langchain_google_genai import ChatGoogleGenerativeAI

    chat = ChatGoogleGenerativeAI(
        google_api_key=api_key,
        max_tokens=max_tokens,
        model=model,
    )

However, for OpenAI and Anthropic, we use model_name and not model:

    from langchain_anthropic import ChatAnthropic
    from langchain_openai import ChatOpenAI

    chat = ChatAnthropic(
        temperature=0,
        anthropic_api_key=api_key,
        model_name=model,
    )
    chat = ChatOpenAI(
        openai_api_key=api_key,
        max_tokens=max_tokens,
        model_name=model,
    )

Can we make the interface of chat objects consistent across providers?
We could add a model_name parameter and deprecate model going forward, along the lines of the sketch below.
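
A minimal sketch of one way this could work, assuming Pydantic v1 (which LangChain was built on at the time); the class name here is illustrative, not the actual library source:

    import warnings

    from pydantic import BaseModel, root_validator

    class ChatModelSketch(BaseModel):
        # New canonical parameter name, matching ChatOpenAI/ChatAnthropic.
        model_name: str

        @root_validator(pre=True)
        def _accept_deprecated_model(cls, values):
            # Map the old `model` kwarg onto `model_name` and warn the caller,
            # so existing code keeps working during the deprecation window.
            if "model" in values and "model_name" not in values:
                warnings.warn(
                    "`model` is deprecated; pass `model_name` instead.",
                    DeprecationWarning,
                )
                values["model_name"] = values.pop("model")
            return values

    chat = ChatModelSketch(model="gemini-pro")       # still works, but warns
    chat = ChatModelSketch(model_name="gemini-pro")  # preferred going forward
    print(chat.model_name)

This keeps both spellings valid during a transition period instead of breaking existing callers immediately.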

@AmgadHasan
Author

There is already a PR that can solve this:
#81
