
[bug] litellm support errors for Azure OpenAI, falling through to the OpenAI callable #959

@dtam

Description

Describe the bug
When a Guard call targets an Azure OpenAI model (e.g. model="azure/<deployment_name>"), the litellm integration raises an error and the call falls through to the plain OpenAI callable instead of being routed through litellm.

To Reproduce
Steps to reproduce the behavior:

from guardrails import Guard
import os
os.environ["AZURE_API_KEY"] = "" # "my-azure-api-key"
os.environ["AZURE_API_BASE"] = "" # "https://example-endpoint.openai.azure.com"
os.environ["AZURE_API_VERSION"] = "" # "2023-05-15"

guard = Guard()

result = guard(
    model="azure/<your_deployment_name>",
    messages=[{"role":"user", "content":"How many moons does Jupiter have?"}],
)

print(f"{result.validated_output}")

Expected behavior
The call is routed through litellm (which resolves the "azure/" model prefix to its Azure OpenAI provider) rather than falling back to the OpenAI callable.

Library version:
Version (e.g. 0.5.0)


Metadata

Assignees: none

Labels: bug (Something isn't working)