
[Bug]: Presidio Guardrail giving litellm type validation error #15327

@yogeshkumar16

Description


What happened?

I am using OpenAI's gpt-4o model through LiteLLM.

import os

from agno.models.openai import OpenAIChat  # agno's OpenAI model wrapper

openai_model = OpenAIChat(
    id="gpt-4o",
    max_completion_tokens=8092,
    seed=42,
    temperature=0,
    top_p=1,
    api_key="placeholder",
    base_url=os.getenv("LITELLM_BASE_URL"),  # LiteLLM proxy endpoint
    timeout=300,
    default_headers={"Authorization": os.getenv("LITELLM_API_KEY")},
)
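
The traceback below comes from agno's model layer. As a point of reference, here is a hedged sketch of how such a model object is typically invoked; the Agent setup and the prompt are assumptions, since the original report only includes the model configuration:

from agno.agent import Agent

# Hypothetical call site, not taken from the report: the error surfaces when
# the agent sends a completion request through the LiteLLM proxy that has the
# Presidio guardrail enabled.
agent = Agent(model=openai_model)
agent.run("Redact any PII in: my email is jane.doe@example.com")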

I am getting this error:

    raise ModelProviderError(
agno.exceptions.ModelProviderError: litellm.types.proxy.guardrails.guardrail_hooks.presidio.PresidioAnalyzeResponseItem() argument after ** must be a mapping, not str
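
The underlying TypeError is a plain Python unpacking failure: somewhere in the Presidio guardrail hook, a str is passed where a mapping is expected when constructing PresidioAnalyzeResponseItem(**item). The standalone sketch below (hypothetical class and data, not LiteLLM's actual code path) reproduces the same message:

# Hypothetical stand-in for litellm's PresidioAnalyzeResponseItem, used only
# to illustrate how this TypeError arises.
class PresidioAnalyzeResponseItem:
    def __init__(self, **kwargs):
        self.__dict__.update(kwargs)

# Suppose the Presidio analyzer returned an error string instead of a list of
# entity dicts (hypothetical payload):
analyze_response = ["analyzer error: service unavailable"]

items = [PresidioAnalyzeResponseItem(**item) for item in analyze_response]
# TypeError: PresidioAnalyzeResponseItem() argument after ** must be a mapping, not str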

Relevant log output

Are you an ML Ops Team?

Yes

What LiteLLM version are you on?

v1.77.2-stable

Twitter / LinkedIn details

No response
