[bug] Trouble calling guard without using openai or azure openai APIs #979

@wesngoh

Describe the bug
I am trying to use the custom LLM wrapper so that I can add guardrails around an NVIDIA TensorRT-LLM (TRT-LLM) model served from Triton. I do not wish to use the openai/azure openai APIs for the guardrails call.

However, I'm met with the following error:
openai.OpenAIError: Ambiguous use of module client; please set 'openai.api_type' or the 'OPENAI_API_TYPE' environment variable to 'openai' or 'azure'

To Reproduce
Here's my code:

from typing import Optional

import requests

# Placeholder; the real Triton endpoint is not shown in this issue.
TRITON_SERVER_URL = "http://localhost:8000/v2/models/my_model/generate"


def my_llm_api(
    prompt: Optional[str] = None,
    instruction: Optional[str] = None,
    msg_history: Optional[list[dict]] = None,
    **kwargs,
) -> str:
    """Custom LLM API wrapper.

    At least one of prompt, instruction or msg_history should be provided.

    Args:
        prompt (str): The prompt to be passed to the LLM API
        instruction (str): The instruction to be passed to the LLM API
        msg_history (list[dict]): The message history to be passed to the LLM API
        **kwargs: Any additional arguments to be passed to the LLM API

    Returns:
        str: The output of the LLM API
    """
    # Prepare the input for the LLM API
    inputs = {
        "prompt": prompt
    }

    # Make a request to the Triton server
    response = requests.post(TRITON_SERVER_URL, json=inputs)
    response_data = response.json()

    # Extract the output from the response
    llm_output = response_data.get("outputs", {}).get("text", "")

    return llm_output
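
(The snippet below fills in the Guard construction the issue omits; a bare Guard with no validators is an assumption on my part, but it is enough to reach the error.)

from guardrails import Guard

# Assumption: the issue does not show how `guard` was built; an empty
# Guard (no validators attached) still routes through get_llm_ask and
# reproduces the error.
guard = Guard()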

validated_response = guard(
    my_llm_api,
    prompt="Can you generate a list of 10 things that are not food?",
)
print(validated_response)

Expected behavior
The guard call should use my custom wrapper without ever touching the openai client.

Full error:

File "/usr/local/lib/python3.10/site-packages/guardrails/integrations/langchain/guard_runnable.py", line 15, in _validate
generate-1       |     response = self.guard.validate(input)
generate-1       |   File "/usr/local/lib/python3.10/site-packages/guardrails/guard.py", line 1080, in validate
generate-1       |     return self.parse(llm_output=llm_output, *args, **kwargs)
generate-1       |   File "/usr/local/lib/python3.10/site-packages/guardrails/guard.py", line 951, in parse
generate-1       |     return self._execute(  # type: ignore # streams are supported for parse
generate-1       |   File "/usr/local/lib/python3.10/site-packages/guardrails/guard.py", line 774, in _execute
generate-1       |     return guard_context.run(
generate-1       |   File "/usr/local/lib/python3.10/site-packages/guardrails/utils/telemetry_utils.py", line 347, in wrapped_func
generate-1       |     return func(*args, **kwargs)
generate-1       |   File "/usr/local/lib/python3.10/site-packages/guardrails/guard.py", line 751, in __exec
generate-1       |     return self._exec(
generate-1       |   File "/usr/local/lib/python3.10/site-packages/guardrails/guard.py", line 805, in _exec
generate-1       |     api = get_llm_ask(llm_api, *args, **kwargs)
generate-1       |   File "/usr/local/lib/python3.10/site-packages/guardrails/llm_providers.py", line 801, in get_llm_ask
generate-1       |     if llm_api == get_static_openai_create_func():
generate-1       |   File "/usr/local/lib/python3.10/site-packages/guardrails/utils/openai_utils/v1.py", line 19, in get_static_openai_create_func
generate-1       |     return openai.completions.create
generate-1       |   File "/usr/local/lib/python3.10/site-packages/openai/_utils/_proxy.py", line 20, in __getattr__
generate-1       |     proxied = self.__get_proxied__()
generate-1       |   File "/usr/local/lib/python3.10/site-packages/openai/_utils/_proxy.py", line 55, in __get_proxied__
generate-1       |     return self.__load__()
generate-1       |   File "/usr/local/lib/python3.10/site-packages/openai/_module_client.py", line 60, in __load__
generate-1       |     return _load_client().completions
generate-1       |   File "/usr/local/lib/python3.10/site-packages/openai/__init__.py", line 294, in _load_client
generate-1       |     raise _AmbiguousModuleClientUsageError()
generate-1       | openai.OpenAIError: Ambiguous use of module client; please set `openai.api_type` or the `OPENAI_API_TYPE` environment variable to `openai` or `azure`
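
The trace shows why openai is touched at all: get_llm_ask compares the custom callable against get_static_openai_create_func(), and that helper dereferences openai.completions.create, which forces the openai package to construct its module-level client. A minimal sketch of the trigger, independent of guardrails (the ambiguous environment is an assumption):

import openai

# Assumption: an environment where the openai module cannot tell whether
# to build an OpenAI or AzureOpenAI client (e.g. OPENAI_API_TYPE unset
# while Azure-style variables are visible). Merely dereferencing the
# lazy module-client proxy then raises openai.OpenAIError, even though
# no request is ever sent.
create_func = openai.completions.create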

Library version:
guardrails-ai==0.5.2

The documentation on using a custom LLM wrapper is a little thin, and I don't understand why openai is called even though I never asked to use those APIs. I'd appreciate it if anyone who has found a workaround could share it.
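
One workaround suggested by the error message itself is to disambiguate the module client before guardrails compares against it. A hedged sketch (the dummy key is an assumption; with a custom callable the OpenAI endpoint should never actually be called):

import os

# Set these before openai/guardrails are imported. Assumption: with a
# custom LLM callable the openai client is only compared against, never
# invoked, so a dummy key is enough to let client construction succeed.
os.environ.setdefault("OPENAI_API_TYPE", "openai")
os.environ.setdefault("OPENAI_API_KEY", "sk-dummy")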
