
Feature request: Vertex AI Function Calling #15073

Closed
lawwu opened this issue Dec 22, 2023 · 1 comment
Labels
🤖:enhancement (A large net-new component, integration, or chain. Use sparingly. The largest features)
Ɑ: models (Related to LLMs or chat model modules)

Comments

@lawwu
Contributor

lawwu commented Dec 22, 2023

Feature request

Google's gemini-pro supports function calling. It would be nice if LangChain supported function calling when using the VertexAI class, similar to how it supports OpenAI's version of function calling: https://cloud.google.com/vertex-ai/docs/generative-ai/multimodal/function-calling
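
For illustration, the sketch below shows the kind of interface I have in mind. The OpenAI half works today; the Vertex AI half is hypothetical and only shows the shape I'd hope for (bind(functions=...) support on ChatVertexAI does not exist yet):

from langchain.chat_models import ChatOpenAI, ChatVertexAI

# A function/tool definition in the OpenAI-style JSON schema format.
get_weather_fn = {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "City, e.g. Boston, MA"},
        },
        "required": ["location"],
    },
}

# Works today with OpenAI: the function call comes back in additional_kwargs.
openai_llm = ChatOpenAI(model="gpt-3.5-turbo-1106").bind(functions=[get_weather_fn])
print(openai_llm.invoke("What's the weather in Boston?").additional_kwargs)

# Hypothetical: the same pattern with Gemini on Vertex AI (not supported today).
vertex_llm = ChatVertexAI(model_name="gemini-pro").bind(functions=[get_weather_fn])
print(vertex_llm.invoke("What's the weather in Boston?").additional_kwargs)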

Motivation

Here's a notebook showing that, to access this functionality today, you have to use the vertexai library directly, which means we lose LangChain's standardization of input and output schemas: https://github.com/GoogleCloudPlatform/generative-ai/blob/main/gemini/function-calling/intro_function_calling.ipynb
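
For reference, here is a condensed sketch of what the notebook does with the vertexai SDK directly (based on the SDK's preview generative_models namespace at the time; the project ID is a placeholder and details may differ slightly):

import vertexai
from vertexai.preview.generative_models import (
    FunctionDeclaration,
    GenerativeModel,
    Tool,
)

vertexai.init(project="my-project", location="us-central1")  # placeholder project

# Declare the function the model is allowed to call.
get_current_weather_func = FunctionDeclaration(
    name="get_current_weather",
    description="Get the current weather in a given location",
    parameters={
        "type": "object",
        "properties": {"location": {"type": "string"}},
    },
)
weather_tool = Tool(function_declarations=[get_current_weather_func])

model = GenerativeModel("gemini-pro")
response = model.generate_content(
    "What is the weather like in Boston?",
    tools=[weather_tool],
)

# The function call comes back as a Gemini-specific structure, not a
# LangChain message/schema, which is the standardization we lose.
print(response.candidates[0].content.parts[0].function_call)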

Your contribution

I can possibly help.


dosubot bot commented Dec 22, 2023

🤖

Hello @lawwu! I'm Dosu, a friendly bot here to assist you while we wait for a human maintainer. I'm here to help with troubleshooting, answering your queries, and guiding your journey to becoming a contributor. Nice to meet you!

Thank you for your feature request. As of the current version of LangChain (v0.0.352), there is no function-calling support for Vertex AI comparable to OpenAI's. The framework does integrate with Google's gemini-pro model through the google.generativeai library, but it does not use the vertexai library.

The GoogleGenerativeAI class in LangChain does have a model field which can be set to "gemini-pro", and the _generate and _stream methods have special handling for Gemini models. However, these methods do not currently support function calling.

Here is the relevant code:

# Imports added here for readability; the original excerpt omits them.
from typing import Any, Iterator, List, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models import BaseLLM
from langchain_core.outputs import GenerationChunk, LLMResult
from langchain_core.pydantic_v1 import BaseModel, Field


class GoogleGenerativeAI(BaseLLM, BaseModel):
    """Google GenerativeAI models.

    Example:
        .. code-block:: python

            from langchain_google_genai import GoogleGenerativeAI
            llm = GoogleGenerativeAI(model="gemini-pro")
    """

    model: str = Field(
        ...,
        description="""The name of the model to use.
Supported examples:
    - gemini-pro
    - models/text-bison-001""",
    )

    @property
    def is_gemini(self) -> bool:
        """Return whether the model belongs to the Gemini family."""
        return _is_gemini_model(self.model)

    def _generate(
        self,
        prompts: List[str],
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> LLMResult:
        # Excerpt truncated: in the full method, generation_config is built
        # from stop and the model's generation parameters before this loop,
        # and the results are collected into an LLMResult afterwards.
        for prompt in prompts:
            if self.is_gemini:
                res = _completion_with_retry(
                    self,
                    prompt=prompt,
                    stream=False,
                    is_gemini=True,
                    run_manager=run_manager,
                    generation_config=generation_config,
                )

    def _stream(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> Iterator[GenerationChunk]:
        # Excerpt truncated: the loop body that yields GenerationChunk
        # objects is omitted here.
        for stream_resp in _completion_with_retry(
            self,
            prompt,
            stream=True,
            is_gemini=True,
            run_manager=run_manager,
            generation_config=generation_config,
            **kwargs,
        ):
            ...
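
To make the gap concrete, here is a minimal usage sketch of the interface above (assuming the langchain-google-genai package is installed and a Google API key is configured): plain text generation works, but there is no parameter for passing function or tool declarations.

from langchain_google_genai import GoogleGenerativeAI

llm = GoogleGenerativeAI(model="gemini-pro")

# Plain text generation works today:
print(llm.invoke("Write one sentence about the weather in Boston."))

# There is no supported equivalent of OpenAI-style function calling, e.g.:
# llm.invoke("What's the weather in Boston?", functions=[...])  # hypothetical, not supported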


Your feature request for function calling support when using the VertexAI class is a valuable suggestion. It would indeed be beneficial to have LangChain standardize the input and output schemas for this functionality. However, as a technical support rep, I'm unable to create pull requests or issues. I would recommend reaching out to the LangChain developers directly with your feature request.

Your offer to possibly help with this feature is greatly appreciated. Contributions from users like you are what help open-source projects like LangChain thrive and improve. If you have any further questions or need more information, please don't hesitate to ask.

Best regards,
Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot bot added the Ɑ: models and 🤖:enhancement labels Dec 22, 2023
efriis added a commit that referenced this issue Jan 17, 2024

  - **Description:** added support for tools on VertexAI
  - **Issue:** #15073
  - **Twitter handle:** lkuligin

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
serena-ruan pushed a commit to serena-ruan/langchain that referenced this issue Jan 18, 2024
lz-chen pushed a commit to lz-chen/langchain that referenced this issue Jan 22, 2024
dosubot bot added the stale label Mar 22, 2024
dosubot bot closed this as not planned (won't fix, can't repro, duplicate, stale) Mar 29, 2024
dosubot bot removed the stale label Mar 29, 2024