🪡 orra

🦸 Instantly orchestrate multi-agent workflows as services with Orra.

Orra's Python SDK and Local Development Environment turn agent workflows into deployable services, with built-in workflow enhancements that make multi-agent orchestrations work reliably in production.

Why Orra?

You've got a base agent workflow up and running. But you want to ship it with a smaller LLM model to save costs. Or you want to ensure the quality of its outputs. Or maybe you want to hit the ground running by plugging in pre-built agents straight away.

That's where Orra comes in. 🚀

Built on LangGraph's powerful agent runtime, Orra is focused on making it super simple to deploy your agents as services and get them production-ready. We take care of all the complex, behind-the-scenes stuff, so you can concentrate on creating awesome user experiences. ⚡️⚡️

Mix and match agents

Orra allows you to combine any agents - including off-the-shelf ones like GPT Researcher with custom agents built with LangChain, CrewAI, and more.

Workflow enhancements (in the works)

Orra bakes in enhancements that enable reliable, repeatable execution of complex multi-agent, service-based workflows by:

  • Offering pre-built data and API integrations as part of the SDK.
  • Standardizing flow control between agent services.
  • Enhancing tool prompting via integrated LLM fine-tuning.
  • Evaluating agent-service outputs to ensure correctness and quality.
  • Monitoring costs across LLMs and tools.
  • Offering pre-built open-source agents to get you up and running fast.

We're just getting started

We're still ironing out the details of our Local Development Environment.

You can try out the latest by installing a local version of Orra.

(Check out the Dependabot example for a demo of a real-world agent service-based workflow)

What does Orra look like?

It just takes a few lines of code to orchestrate a service-based workflow using Orra:

```python
from typing import Optional, Any
from orra import Orra

app = Orra(schema={"source": Optional[str], "researched": Optional[str]})


@app.step
def investigate(state: dict) -> Any:
    return {**state, "source": "hello world"}


@app.step
def research_topic(state: dict) -> Any:
    result = {}  # Call your agent here
    return {**state, "researched": result}
```

**That's it!** You now have a `/workflow` endpoint plus an endpoint for each step.
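Under the hood, each step is just a function from a state dict to a new state dict, so a workflow run amounts to applying the steps in order. A minimal sketch of that data flow in plain Python (no Orra imports, with a stubbed agent call in place of a real one):

```python
from functools import reduce

def investigate(state: dict) -> dict:
    # First step: populate the "source" field.
    return {**state, "source": "hello world"}

def research_topic(state: dict) -> dict:
    # Second step: a stubbed "agent" that just echoes the source.
    result = f"researched: {state['source']}"
    return {**state, "researched": result}

# A workflow run is each step applied in order to the shared state.
initial = {"source": None, "researched": None}
final = reduce(lambda state, step: step(state), [investigate, research_topic], initial)
print(final)
# {'source': 'hello world', 'researched': 'researched: hello world'}
```

Because steps only read and return state, they compose without any shared mutable globals.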

Try Orra locally

This is a basic Hello World example to get you familiar with Orra.

Requirements: Poetry, plus a local clone of the Orra repository.

1. Create a new Orra project:

   ```shell
   poetry new orra-app
   cd orra-app
   ```

2. Install the Orra SDK and CLI locally from the cloned repository:

   ```shell
   poetry add /path/to/repo/libs/orra
   poetry add /path/to/repo/libs/cli
   ```

3. Create a `main.py` file in the `orra-app` directory, and copy in the content of this example:

   ```shell
   touch main.py
   ```

4. Run your Orra project using the Orra CLI:

   ```shell
   poetry run python -m orra_cli run
   ```

5. Your Orra project is now running, and you can access it via HTTP endpoints! 🚀

   ```
   poetry run python -m orra_cli run
     ✔ Compiling Orra application workflow... Done!
     ✔ Prepared Orra application step endpoints...Done!
     ✔ Preparing Orra application workflow endpoint... Done!
     ✔ Starting Orra application... Done!

     Orra development server running!
     Your API is running at:     http://127.0.0.1:1430

   INFO:     Started server process [33823]
   INFO:     Waiting for application startup.
   INFO:     Application startup complete.
   INFO:     Orra running on http://127.0.0.1:1430 (Press CTRL+C to quit)
   ```
6. Execute your workflow as a service by sending a POST request to the `/workflow` endpoint:

   ```shell
   curl -X POST \
     -H "Content-Type: application/json" \
     -d '{"source": null, "researched": null}' \
     http://127.0.0.1:1430/workflow
   ```

   Outputs:

   ```json
   {
     "researched": "'hello world' is a common phrase used in programming to demonstrate the basic syntax of a programming language. It is believed to have originated from the book \"The C Programming Language\" by Brian Kernighan and Dennis Ritchie.",
     "source": "hello world"
   }
   ```
7. Execute individual steps by sending a POST request to the `/workflow/step_name` endpoint (e.g. `/workflow/investigate`):

   ```shell
   curl -X POST \
     -H "Content-Type: application/json" \
     -d '{"source": null, "researched": null}' \
     http://127.0.0.1:1430/workflow/investigate
   ```

   Outputs:

   ```json
   {
     "researched": null,
     "source": "hello world"
   }
   ```

This is a great way to test orchestrated steps individually.
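Since each step takes and returns a plain dict, you can also unit-test a step without HTTP at all. A sketch (the step body is copied from the example above; the test name is our own):

```python
def investigate(state: dict) -> dict:
    # Step from the Hello World example: populate the "source" field.
    return {**state, "source": "hello world"}

def test_investigate_sets_source_only():
    state = {"source": None, "researched": None}
    out = investigate(state)
    assert out["source"] == "hello world"
    assert out["researched"] is None  # other fields are untouched
    assert state["source"] is None    # the input state is not mutated

test_investigate_sets_source_only()
```

This complements the per-step HTTP endpoints: fast, in-process checks during development, endpoint calls for integration testing.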

🎉 You're all set! 🎉
