acty-openai

acty-openai provides an OpenAIExecutor implementation for Acty. It adapts langchain-openai chat models to the Acty executor interface and emits the shared telemetry attributes used across the wider Acty stack.
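To illustrate the adapter pattern described above, here is a minimal, self-contained sketch of an executor that wraps an async chat model. All names here (`SketchOpenAIExecutor`, `ExecutionResult`, the `execute` method) are hypothetical stand-ins, not the actual acty-core interface:

```python
import asyncio
from dataclasses import dataclass
from typing import Any, Protocol


class ChatModel(Protocol):
    """Minimal stand-in for a langchain-openai chat model (hypothetical)."""

    async def ainvoke(self, messages: list[dict[str, Any]]) -> Any: ...


@dataclass
class ExecutionResult:
    """Hypothetical result type handed back to the engine."""

    output: str


class SketchOpenAIExecutor:
    """Sketch of adapting a chat model to an engine-facing executor interface."""

    def __init__(self, model: ChatModel) -> None:
        self.model = model

    async def execute(self, payload: dict[str, Any]) -> ExecutionResult:
        # Forward the payload's messages to the model and wrap the reply.
        response = await self.model.ainvoke(payload["messages"])
        return ExecutionResult(output=str(response))


class EchoModel:
    """Fake model used only to exercise the sketch."""

    async def ainvoke(self, messages: list[dict[str, Any]]) -> str:
        return f"echo: {messages[-1]['content']}"


result = asyncio.run(
    SketchOpenAIExecutor(EchoModel()).execute(
        {"messages": [{"role": "user", "content": "hello"}]}
    )
)
print(result.output)  # echo: hello
```

The real OpenAIExecutor follows the same shape but targets the actual Acty executor interface and a real ChatOpenAI model.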

Install

pip install acty-openai

For local development:

pip install -e .[dev]

Usage

import asyncio

from acty import ActyEngine, EngineConfig
from acty_openai import OpenAIExecutor
from langchain_openai import ChatOpenAI


async def main() -> None:
    model = ChatOpenAI(model="gpt-4o-mini")
    engine = ActyEngine(
        executor=OpenAIExecutor(model=model),
        config=EngineConfig(primer_workers=1, follower_workers=1),
    )
    try:
        payload = {
            "messages": [{"role": "user", "content": "hello"}],
        }
        submission = await engine.submit_group("demo", payload, [])
        if submission.primer is not None:
            result = await submission.primer
            print(result.output)
    finally:
        await engine.close()


asyncio.run(main())

Notes

  • If you do not pass a model explicitly, OpenAIExecutor requires langchain-openai to be installed.
  • The executor can attach OpenTelemetry span attributes when telemetry is enabled.
  • This package depends directly on both acty and acty-core because it imports both at runtime.
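The span-attribute note above can be sketched with the standard OpenTelemetry API. The attribute keys and the helper name here are assumptions for illustration, not the package's actual telemetry schema; the import guard keeps the code working when OpenTelemetry is not installed:

```python
from typing import Any

try:
    # Optional dependency: fall back to a no-op when OpenTelemetry is absent.
    from opentelemetry import trace

    _tracer = trace.get_tracer("acty_openai_sketch")
except ImportError:
    _tracer = None


def record_attributes(attrs: dict[str, Any]) -> dict[str, Any]:
    """Attach telemetry attributes to a span when tracing is available."""
    if _tracer is not None:
        with _tracer.start_as_current_span("execute") as span:
            for key, value in attrs.items():
                span.set_attribute(key, value)
    return attrs


# Hypothetical attribute keys, modeled loosely on semantic-convention style.
record_attributes({"gen_ai.request.model": "gpt-4o-mini"})
```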

Development

  • Tests live under tests/.
  • The repo includes unit tests plus an Acty engine integration test covering shared telemetry behavior.
