
Conversation

@kaushikb11 (Contributor) commented Jul 22, 2024

… llm_api is not passed

  • Update get_llm_ask to check for LiteLLM first (see the sketch after this list)
  • Allow LiteLLM usage without explicitly passing litellm.completion
  • Maintain compatibility with other LLM providers
  • Update relevant tests to reflect new LiteLLM usage
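
A minimal sketch of the LiteLLM-first check described in the first bullet; the function signature, return value, and error handling below are simplified assumptions for illustration, not the actual Guardrails internals.

from typing import Any, Callable, Optional

def get_llm_ask(llm_api: Optional[Callable[..., Any]] = None, *args, **kwargs):
    """Return an LLM callable, preferring LiteLLM when no llm_api is given."""
    if llm_api is None:
        try:
            import litellm  # only needed for the LiteLLM fallback
        except ImportError as err:
            raise ValueError(
                "No llm_api was passed and litellm is not installed."
            ) from err
        # Route the call through litellm.completion so callers do not have
        # to pass it explicitly.
        return lambda **call_kwargs: litellm.completion(**call_kwargs)
    # An explicitly passed llm_api keeps the existing provider behavior.
    return llm_api

In the repro script below, model="azure/<your_deployment_name>" would then be handled by LiteLLM even though no llm_api argument is supplied.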

Script to reproduce the issue

from guardrails import Guard
import os
os.environ["AZURE_API_KEY"] = "" # "my-azure-api-key"
os.environ["AZURE_API_BASE"] = "" # "https://example-endpoint.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "" # "2023-05-15"

guard = Guard()

# No llm_api is passed here; with this change, the call should fall back to
# LiteLLM based on the model string.
result = guard(
    model="azure/<your_deployment_name>",
    messages=[{"role":"user", "content":"How many moons does Jupiter have?"}],
)

print(f"{result.validated_output}")

@dtam (Contributor) commented Jul 22, 2024

OK, closing this one; the other branch is the right one.

@dtam (Contributor) commented Jul 22, 2024

#958

@dtam closed this Jul 22, 2024