litellm might be failing to parse pydantic.BaseModel for response_format #6830

Closed
tikendraw opened this issue Nov 20, 2024 · 1 comment
@tikendraw commented:

Using litellm 1.52.10.

Code I used:

import litellm
from pydantic import BaseModel

litellm.set_verbose = True

# Models tested below: the two Gemini models work, the two Groq models error
models = [
    "gemini/gemini-1.5-pro",
    "gemini/gemini-1.5-flash",
    "groq/llama-3.1-70b-versatile",
    "groq/llama-3.2-11b-vision-preview",
]

messages = [{"role": "user", "content": "List 5 important events in the 21 century"}]


class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: list[str]


resp = litellm.completion(
    model=models[-2],
    messages=messages,
    response_format=CalendarEvent,
)

print("Received={}".format(resp))

Output: for gemini/gemini-1.5-pro -> ok

Logging Details LiteLLM-Success Call: Cache_hit=None
Looking up model=gemini/gemini-1.5-pro in model_cost_map, custom_llm_provider=gemini, call_type=completion
Looking up model=gemini/gemini-1.5-pro in model_cost_map, custom_llm_provider=gemini, call_type=completion
Received=ModelResponse(id='chatcmpl-4019d0a8-15a8-42c3-af33-044ad40073f7', created=1732110699, model='gemini-1.5-pro', object='chat.completion', system_fingerprint=None, choices=[Choices(finish_reason='stop', index=0, message=Message(content='{"date": "2001-09-11", "name": "September 11 attacks", "participants": ["Al-Qaeda", "United States"]}\n', role='assistant', tool_calls=None, function_call=None))], usage=Usage(completion_tokens=38, prompt_tokens=12, total_tokens=50, completion_tokens_details=None, prompt_tokens_details=None), vertex_ai_grounding_metadata=[], vertex_ai_safety_results=[], vertex_ai_citation_metadata=[])

Output: for gemini/gemini-1.5-flash -> ok

Received=ModelResponse(id='chatcmpl-7c5074b2-4f87-4b89-ab18-f80221306638', created=1732111031, model='gemini-1.5-flash', object='chat.completion', system_fingerprint=None, choices=[Choices(finish_reason='stop', index=0, message=Message(content='{"date": "2001-09-11", "name": "September 11 attacks", "participants": ["Al-Qaeda", "United States"] }\n', role='assistant', tool_calls=None, function_call=None))], usage=Usage(completion_tokens=39, prompt_tokens=12, total_tokens=51, completion_tokens_details=None, prompt_tokens_details=None), vertex_ai_grounding_metadata=[], vertex_ai_safety_results=[], vertex_ai_citation_metadata=[])
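Note that even on the successful Gemini calls, the structured output comes back as a JSON string in message.content rather than a parsed object, so it still has to be decoded by hand. A minimal sketch, using the content string copied from the gemini-1.5-pro log above:

```python
import json

# Content string copied verbatim from the gemini-1.5-pro response above.
content = '{"date": "2001-09-11", "name": "September 11 attacks", "participants": ["Al-Qaeda", "United States"]}\n'

# json.loads tolerates the trailing newline; the result is a plain dict,
# which could then be validated with CalendarEvent.model_validate(event).
event = json.loads(content)
print(event["name"])  # September 11 attacks
```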

Output: for groq/llama-3.1-70b-versatile -> error

Request to litellm:
litellm.completion(model='groq/llama-3.1-70b-versatile', messages=[{'role': 'user', 'content': 'List 5 important events in the 21 century'}], response_format=<class '__main__.CalendarEvent'>)


19:28:18 - LiteLLM:WARNING: utils.py:290 - `litellm.set_verbose` is deprecated. Please set `os.environ['LITELLM_LOG'] = 'DEBUG'` for debug logs.
SYNC kwargs[caching]: False; litellm.cache: None; kwargs.get('cache')['no-cache']: False
Final returned optional params: {'response_format': <class '__main__.CalendarEvent'>, 'extra_body': {}}


POST Request Sent from LiteLLM:
curl -X POST \
https://api.groq.com/openai/v1/ \
-d '{'model': 'llama-3.1-70b-versatile', 'messages': [{'role': 'user', 'content': 'List 5 important events in the 21 century'}], 'response_format': <class '__main__.CalendarEvent'>, 'extra_body': {}}'


openai.py: Received openai error - You tried to pass a `BaseModel` class to `chat.completions.create()`; You must use `beta.chat.completions.parse()` instead

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.


Provider List: https://docs.litellm.ai/docs/providers

Traceback (most recent call last):
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 854, in completion
    raise e
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 790, in completion
    self.make_sync_openai_chat_completion_request(
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 651, in make_sync_openai_chat_completion_request
    raise e
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 633, in make_sync_openai_chat_completion_request
    raw_response = openai_client.chat.completions.with_raw_response.create(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/openai/_legacy_response.py", line 356, in wrapped
    return cast(LegacyAPIResponse[R], func(*args, **kwargs))
                                      ^^^^^^^^^^^^^^^^^^^^^
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 828, in create
    validate_response_format(response_format)
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 1744, in validate_response_format
    raise TypeError(
TypeError: You tried to pass a `BaseModel` class to `chat.completions.create()`; You must use `beta.chat.completions.parse()` instead

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/main.py", line 1483, in completion
    response = groq_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/llms/groq/chat/handler.py", line 41, in completion
    return super().completion(
           ^^^^^^^^^^^^^^^^^^^
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/llms/OpenAI/openai.py", line 864, in completion
    raise OpenAIError(
litellm.llms.OpenAI.openai.OpenAIError: You tried to pass a `BaseModel` class to `chat.completions.create()`; You must use `beta.chat.completions.parse()` instead

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/t/atest/use_computer/test.py", line 110, in <module>
    resp = litellm.completion(
           ^^^^^^^^^^^^^^^^^^^
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/utils.py", line 960, in wrapper
    raise e
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/utils.py", line 849, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/main.py", line 3060, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2136, in exception_type
    raise e
  File "/home/t/atest/use_computer/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 404, in exception_type
    raise APIError(
litellm.exceptions.APIError: litellm.APIError: APIError: GroqException - You tried to pass a `BaseModel` class to `chat.completions.create()`; You must use `beta.chat.completions.parse()` instead
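For context, the TypeError originates in the OpenAI SDK's own validation, which rejects plain classes passed to chat.completions.create(); only dict-style response_format values are accepted there. A minimal sketch of that kind of check (a hypothetical simplification, not the SDK's actual code):

```python
import inspect


def validate_response_format(response_format) -> None:
    """Reject a class (e.g. a pydantic BaseModel subclass) passed as
    response_format; create() only accepts dict-style values."""
    if inspect.isclass(response_format):
        raise TypeError(
            "You tried to pass a `BaseModel` class to `chat.completions.create()`; "
            "You must use `beta.chat.completions.parse()` instead"
        )


class CalendarEvent:  # stand-in for the pydantic model in the snippet above
    pass


try:
    validate_response_format(CalendarEvent)
except TypeError as e:
    print(e)

validate_response_format({"type": "json_object"})  # dicts pass through
```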

Output: for groq/llama-3.2-11b-vision-preview -> error

Request to litellm:
litellm.completion(model='groq/llama-3.2-11b-vision-preview', messages=[{'role': 'user', 'content': 'List 5 important events in the 21 century'}], response_format=<class '__main__.CalendarEvent'>)

The full traceback is identical to the one for groq/llama-3.1-70b-versatile above, ending in:

litellm.exceptions.APIError: litellm.APIError: APIError: GroqException - You tried to pass a `BaseModel` class to `chat.completions.create()`; You must use `beta.chat.completions.parse()` instead
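A possible workaround, since the error comes from passing the BaseModel class itself: pass an OpenAI-style json_schema response_format dict instead. This is a sketch under the assumption that the provider accepts the structured-outputs dict shape (with pydantic installed, the schema could come from CalendarEvent.model_json_schema()); the schema below is written out by hand to match the model:

```python
# Hand-written JSON schema matching the CalendarEvent model above
# (pydantic's CalendarEvent.model_json_schema() would produce an
# equivalent structure).
calendar_event_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "date": {"type": "string"},
        "participants": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["name", "date", "participants"],
}

# OpenAI structured-outputs shape; whether Groq honors it is an
# assumption here, not a documented guarantee.
response_format = {
    "type": "json_schema",
    "json_schema": {"name": "CalendarEvent", "schema": calendar_event_schema},
}

# resp = litellm.completion(
#     model="groq/llama-3.1-70b-versatile",
#     messages=messages,
#     response_format=response_format,
# )
```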
@krrishdholakia (Contributor) commented:

Duplicate of #6845

@krrishdholakia krrishdholakia marked this as a duplicate of #6845 Nov 21, 2024
@krrishdholakia closed this as not planned (duplicate) Nov 21, 2024