`acty-openai` provides an `OpenAIExecutor` implementation for Acty. It adapts
`langchain-openai` chat models to the Acty executor interface and emits the
shared telemetry attributes used across the wider Acty stack.
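The adapter idea can be sketched with stand-in types. Everything below is a hypothetical illustration: `ExecutorResult`, the `Executor` protocol, and the stub model are made-up stand-ins, not the real Acty or langchain-openai APIs.

```python
import asyncio
from dataclasses import dataclass
from typing import Any, Protocol


@dataclass
class ExecutorResult:
    # Hypothetical result type; the real Acty result shape may differ.
    output: str


class Executor(Protocol):
    # Hypothetical executor interface an engine might call.
    async def execute(self, payload: dict[str, Any]) -> ExecutorResult: ...


class StubChatModel:
    """Stand-in for a langchain-openai chat model (not the real ChatOpenAI)."""

    async def ainvoke(self, messages: list[dict[str, str]]) -> str:
        return f"echo: {messages[-1]['content']}"


class StubOpenAIExecutor:
    """Illustrative adapter: maps an engine-style payload onto a chat model call."""

    def __init__(self, model: StubChatModel) -> None:
        self._model = model

    async def execute(self, payload: dict[str, Any]) -> ExecutorResult:
        text = await self._model.ainvoke(payload["messages"])
        return ExecutorResult(output=text)


async def demo() -> str:
    executor = StubOpenAIExecutor(StubChatModel())
    result = await executor.execute(
        {"messages": [{"role": "user", "content": "hello"}]}
    )
    return result.output


print(asyncio.run(demo()))  # echo: hello
```

The point of the adapter shape is that the engine only ever sees the executor protocol; the chat model stays an implementation detail behind it.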
Install with pip:

```shell
pip install acty-openai
```

For local development:

```shell
pip install -e .[dev]
```

A minimal end-to-end example:

```python
import asyncio

from acty import ActyEngine, EngineConfig
from acty_openai import OpenAIExecutor
from langchain_openai import ChatOpenAI


async def main() -> None:
    model = ChatOpenAI(model="gpt-4o-mini")
    engine = ActyEngine(
        executor=OpenAIExecutor(model=model),
        config=EngineConfig(primer_workers=1, follower_workers=1),
    )
    try:
        payload = {
            "messages": [{"role": "user", "content": "hello"}],
        }
        submission = await engine.submit_group("demo", payload, [])
        if submission.primer is not None:
            result = await submission.primer
            print(result.output)
    finally:
        await engine.close()


asyncio.run(main())
```

- If you do not pass a model explicitly, `OpenAIExecutor` requires `langchain-openai`.
- The executor can attach OpenTelemetry span attributes when telemetry is enabled.
- This package depends directly on both `acty` and `acty-core` because it imports both at runtime.
- Tests live under `tests/`.
- The repo includes unit tests plus an Acty engine integration test for shared telemetry behavior.
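The telemetry note above can be illustrated with a small sketch. The helper and its attribute names are assumptions: the names loosely follow the OpenTelemetry GenAI semantic conventions, but the actual attribute set `acty-openai` emits may differ.

```python
def span_attributes(
    model_name: str, input_tokens: int, output_tokens: int
) -> dict[str, object]:
    # Hypothetical helper: builds the kind of shared attribute dict an
    # executor could attach to an OpenTelemetry span. Names loosely follow
    # the OTel GenAI semantic conventions; the real package may differ.
    return {
        "gen_ai.system": "openai",
        "gen_ai.request.model": model_name,
        "gen_ai.usage.input_tokens": input_tokens,
        "gen_ai.usage.output_tokens": output_tokens,
    }


attrs = span_attributes("gpt-4o-mini", 12, 34)
print(attrs["gen_ai.request.model"])  # gpt-4o-mini
```

Keeping the attribute set in one place like this is what makes "shared telemetry attributes across the stack" testable: an integration test can assert every executor emits the same keys.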