
crashes with gemini via instructor #92

@academo

Description


I tried running the basic chatbot example with Gemini: https://github.com/BrainBlend-AI/atomic-agents/blob/main/atomic-examples/quickstart/quickstart/1_basic_chatbot.py

Everything is the same except that the client is defined like this:

import instructor
import google.generativeai as genai

# ...
client = instructor.from_gemini(
    client=genai.GenerativeModel(
        model_name="models/gemini-1.5-flash-latest",
    ),
    mode=instructor.Mode.GEMINI_JSON,
)

agent = BaseAgent(
    config=BaseAgentConfig(
        client=client,
        model="models/gemini-1.5-flash-latest",
        memory=memory,
    )
)

Crash:

Traceback (most recent call last):
  File "/home/academo/repos/llm-auto-update-docs/main.py", line 68, in <module>
    response = agent.run(input_schema)
               ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/academo/repos/llm-auto-update-docs/.venv/lib/python3.12/site-packages/atomic_agents/agents/base_agent.py", line 189, in run
    response = self.get_response(response_model=self.output_schema)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/academo/repos/llm-auto-update-docs/.venv/lib/python3.12/site-packages/atomic_agents/agents/base_agent.py", line 165, in get_response
    response = self.client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/academo/repos/llm-auto-update-docs/.venv/lib/python3.12/site-packages/instructor/client.py", line 176, in create
    return self.create_fn(
           ^^^^^^^^^^^^^^^
  File "/home/academo/repos/llm-auto-update-docs/.venv/lib/python3.12/site-packages/instructor/patch.py", line 187, in new_create_sync
    response_model, new_kwargs = handle_response_model(
                                 ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/academo/repos/llm-auto-update-docs/.venv/lib/python3.12/site-packages/instructor/process_response.py", line 750, in handle_response_model
    response_model, new_kwargs = mode_handlers[mode](response_model, new_kwargs)
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/academo/repos/llm-auto-update-docs/.venv/lib/python3.12/site-packages/instructor/process_response.py", line 445, in handle_gemini_json
    "model" not in new_kwargs
AssertionError: Gemini `model` must be set while patching the client, not passed as a parameter to the create method

I suspect the implementation is too biased towards OpenAI.

I also tried removing the `model` param from the BaseAgentConfig, without success.
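A possible workaround (an untested sketch, not from this thread): the assertion fires because BaseAgent always forwards `model=...` to `client.chat.completions.create`, while instructor's Gemini modes require the model to be fixed at patch time instead. Wrapping the patched client's `create` so the `model` kwarg is dropped before the call may sidestep the check. The `drop_model_kwarg` helper and the stub client below are hypothetical, for illustration only; the stub just mimics the `client.chat.completions.create` attribute path and the assertion.

```python
import functools

def drop_model_kwarg(create_fn):
    """Hypothetical workaround: wrap a patched create() so the 'model'
    kwarg is removed before delegating, since instructor's Gemini mode
    asserts that 'model' is NOT passed at call time."""
    @functools.wraps(create_fn)
    def wrapper(*args, **kwargs):
        kwargs.pop("model", None)  # Gemini mode forbids this kwarg
        return create_fn(*args, **kwargs)
    return wrapper

# Stub standing in for instructor.from_gemini(...); the real patched
# client exposes the same client.chat.completions.create path.
class _Completions:
    def create(self, **kwargs):
        assert "model" not in kwargs, "Gemini `model` must be set while patching"
        return {"ok": True, "kwargs": kwargs}

class _Chat:
    def __init__(self):
        self.completions = _Completions()

class StubClient:
    def __init__(self):
        self.chat = _Chat()

client = StubClient()
client.chat.completions.create = drop_model_kwarg(client.chat.completions.create)

# A BaseAgent-style call that forwards model= no longer trips the assertion:
result = client.chat.completions.create(
    model="models/gemini-1.5-flash-latest", messages=[]
)
```

If this works against the real client, it is still only a band-aid; the cleaner fix would be for BaseAgent to omit the `model` kwarg when the underlying instructor mode is a Gemini one.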
