
Conversation

@steven10a
Collaborator

Fixes a reported bug where using a sync client caused a TypeError: object ChatCompletion can't be used in 'await' expression error when LLM-based guardrails are executed.

  • Root cause: the guardrail LLM client is initialized as the same class as the user's client, but our guardrails always run asynchronously.
  • Fix: wrap the blocking sync methods in a background executor so that all guardrails continue to execute asynchronously (see the sketch below).
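
The helper itself is not reproduced in this conversation, so the following is only a minimal sketch of the executor-wrapping pattern described above, assuming an asyncio event loop; the name invoke_openai_callable, its signature, and the use of the default thread pool are illustrative assumptions, not the PR's exact code.

import asyncio
import functools
import inspect
from typing import Any, Callable


async def invoke_openai_callable(method: Callable[..., Any], **kwargs: Any) -> Any:
    """Await an OpenAI SDK method from either a sync or an async client."""
    if inspect.iscoroutinefunction(method):
        # AsyncOpenAI / AsyncAzureOpenAI methods are coroutine functions; await them directly.
        return await method(**kwargs)
    # OpenAI / AzureOpenAI methods block, so hand them to the default thread-pool
    # executor and await the resulting future instead.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, functools.partial(method, **kwargs))

With such a wrapper, either kind of client can be called the same way, e.g. await invoke_openai_callable(client.chat.completions.create, model=..., messages=...).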

Changes:

  • Added a helper method _invoke_openai_callable that handles sync and async traffic from all API endpoints we support
  • All LLM-based checks now use the new helper
  • Added unit tests to catch this in the future

Copilot AI review requested due to automatic review settings October 16, 2025 17:39

Copilot AI left a comment

Pull Request Overview

This PR fixes a bug where using synchronous OpenAI clients caused TypeError: object ChatCompletion can't be used in 'await' expression errors in LLM-based guardrails. The root cause was that guardrail LLMs were initialized as the same class as the actual client, but guardrails always execute asynchronously. The fix wraps synchronous methods in a background executor to allow uniform async execution.

Key changes:

  • Added _invoke_openai_callable helper to handle both sync and async OpenAI SDK methods
  • Updated all LLM-based checks to use the new helper for calling OpenAI APIs
  • Added unit tests to verify sync client support

Reviewed Changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 3 comments.

Summary per file:

  • src/guardrails/checks/text/llm_base.py: Implements _invoke_openai_callable and _request_chat_completion helpers; updates run_llm to support sync/async clients
  • src/guardrails/checks/text/prompt_injection_detection.py: Updates _call_prompt_injection_detection_llm to use _invoke_openai_callable
  • src/guardrails/checks/text/hallucination_detection.py: Updates hallucination_detection to use _invoke_openai_callable
  • tests/unit/checks/test_llm_base.py: Adds test for sync client support in run_llm (an illustrative sketch follows this list)
  • tests/unit/checks/test_prompt_injection_detection.py: Adds test for sync response handling in prompt injection detection
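
A stripped-down version of the kind of regression test added here might look as follows; the stub class and the call to invoke_openai_callable refer to the sketch earlier in this thread, not to the repository's actual run_llm or _invoke_openai_callable signatures, and are assumptions for illustration only.

import asyncio
from typing import Any


class _FakeSyncCompletions:
    """Stub for a sync client's completions resource: create() blocks and returns a value."""

    def create(self, **kwargs: Any) -> str:
        # A real test would return a ChatCompletion-like object here.
        return "flagged"


def test_sync_create_can_be_awaited() -> None:
    completions = _FakeSyncCompletions()
    result = asyncio.run(invoke_openai_callable(completions.create, model="test-model"))
    assert result == "flagged"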

else:
    try:
        from openai import AsyncAzureOpenAI, AzureOpenAI  # type: ignore
    except Exception:  # pragma: no cover - optional dependency

Copilot AI Oct 16, 2025

The bare except Exception is too broad. Consider catching ImportError specifically since this is handling an optional dependency import.

Suggested change
-    except Exception:  # pragma: no cover - optional dependency
+    except ImportError:  # pragma: no cover - optional dependency

Comment on lines +41 to +42
def create(self, **kwargs: Any) -> Any:
    _ = kwargs

Copilot AI Oct 16, 2025

The unused variable assignment _ = kwargs is unnecessary. If the intention is to indicate kwargs are intentionally unused, this can be omitted as Python allows unused parameters.

Collaborator Author

This is just in a test, I think it's fine


class _SyncResponses:
    def parse(self, **kwargs: Any) -> Any:
        _ = kwargs

Copilot AI Oct 16, 2025

The unused variable assignment _ = kwargs is unnecessary. If the intention is to indicate kwargs are intentionally unused, this can be omitted as Python allows unused parameters.

Suggested change
-        _ = kwargs

Collaborator Author

This is just in a test, I think it's fine

@gabor-openai gabor-openai self-requested a review October 16, 2025 17:43
Collaborator

@gabor-openai gabor-openai left a comment

Thank you!

@gabor-openai gabor-openai merged commit 515bd41 into main Oct 16, 2025
3 checks passed
@gabor-openai gabor-openai deleted the dev/steven/sync_client_bug branch October 16, 2025 17:44