🦸 Instantly orchestrate multi-agent workflows as services with Orra.
Orra's Python SDK and Local Development Environment let you build agent service-based workflows with deployments and workflow enhancements, so multi-agent orchestrations work seamlessly in production.
You've got a base agent workflow up and running. But you want to ship it with a smaller LLM to save costs. Or you want to ensure the quality of its outputs. Or maybe you want to hit the ground running by plugging in some pre-built agents straight away.
That's where Orra comes in. 🚀
Built on LangGraph's powerful agent runtime, Orra focuses on making it super simple to deploy your agents as services and get them production-ready. We take care of all the complex, behind-the-scenes stuff, so you can concentrate on creating awesome user experiences. ⚡️⚡️
Orra allows you to combine any agents - including off-the-shelf ones like GPT Researcher - with custom agents built with LangChain, CrewAI, and more.
Orra bakes in enhancements that enable reliable, repeatable execution of complex multi-agent, service-based workflows by:
- Offering pre-built data and API integrations as part of the SDK.
- Standardizing flow control between agent services.
- Enhancing tool prompting via integrated LLM fine-tuning.
- Evaluating agent-service outputs to ensure correctness and quality.
- Monitoring costs across LLMs and tools.
- Offering pre-built open-source agents to get you up and running fast.
We're still ironing out the details of our Local Development Environment.
You can try out the latest by installing a local version of Orra.
(Check out the Dependabot example for a demo of a real-world agent service-based workflow)
It just takes a few lines of code to orchestrate a service-based workflow using Orra:
```python
from typing import Optional, Any

from orra import Orra

app = Orra(schema={"source": Optional[str], "researched": Optional[str]})


@app.step
def investigate(state: dict) -> Any:
    return {**state, "source": "hello world"}


@app.step
def research_topic(state: dict) -> Any:
    result = {}  # Call your agent here
    return {**state, "researched": result}

# **** That's it! You now have a `/workflow` endpoint plus an endpoint for each step. ****
```
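In a real workflow, the `research_topic` step would hand off to an agent rather than return an empty result. Here's a minimal sketch of what that might look like, assuming a hypothetical `run_research_agent` helper that wraps whichever agent you use (GPT Researcher, a LangChain chain, a CrewAI crew, and so on) - it continues the example above, so `app` is already defined:

```python
from typing import Any


# Hypothetical helper — swap in your agent of choice here.
def run_research_agent(topic: str) -> str:
    # Placeholder: a real implementation would invoke the agent
    # and return its findings as text.
    return f"Research findings about: {topic}"


@app.step
def research_topic(state: dict) -> Any:
    # The agent's output is merged back into the shared workflow state.
    result = run_research_agent(state["source"])
    return {**state, "researched": result}
```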
This is a basic Hello World example to get you familiar with Orra.
Requirements:
- Poetry installed.
- Clone this repository.
- Create a new Orra project:

```shell
poetry new orra-app
cd orra-app
```
- Install the Orra SDK and CLI locally from the cloned repository:

```shell
poetry add /path/to/repo/libs/orra
poetry add /path/to/repo/libs/cli
```
- Create a `main.py` file in the `orra-app` directory, and copy in the content of this example:

```shell
touch main.py
```
- Run your Orra project using the Orra CLI:

```shell
poetry run python -m orra_cli run
```
- Your Orra project is now running, and you can access it via HTTP endpoints! 🚀

```shell
poetry run python -m orra_cli run
✔ Compiling Orra application workflow... Done!
✔ Prepared Orra application step endpoints... Done!
✔ Preparing Orra application workflow endpoint... Done!
✔ Starting Orra application... Done!
Orra development server running!
Your API is running at: http://127.0.0.1:1430
INFO: Started server process [33823]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Orra running on http://127.0.0.1:1430 (Press CTRL+C to quit)
```
- Execute your workflow as a service by sending a POST request to the `/workflow` endpoint:

```shell
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"source": null, "researched": null}' \
  http://127.0.0.1:1430/workflow
```
Outputs:

```json
{
  "researched": "'hello world' is a common phrase used in programming to demonstrate the basic syntax of a programming language. It is believed to have originated from the book \"The C Programming Language\" by Brian Kernighan and Dennis Ritchie.",
  "source": "hello world"
}
```
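You can send the same request from Python instead of curl. Here's a minimal sketch using the `requests` library (not part of the Orra SDK - install it separately), assuming the dev server is running locally:

```python
import requests

# Initial workflow state — every schema field starts out empty.
payload = {"source": None, "researched": None}

response = requests.post("http://127.0.0.1:1430/workflow", json=payload)
response.raise_for_status()

print(response.json())  # e.g. {"source": "hello world", "researched": "..."}
```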
- Execute individual steps by sending a POST request to the `/workflow/step_name` endpoint (e.g. `/workflow/investigate`):

```shell
curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"source": null, "researched": null}' \
  http://127.0.0.1:1430/workflow/investigate
```
Outputs:

```json
{
  "researched": null,
  "source": "hello world"
}
```
This is a great way to test orchestrated steps individually.
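For example, you could write a quick smoke test against a single step endpoint. Here's a minimal sketch using `pytest` and `requests` (both assumed to be installed, with the dev server running locally):

```python
import requests

BASE_URL = "http://127.0.0.1:1430"


def test_investigate_step_sets_source():
    # Call only the `investigate` step, leaving the rest of the workflow untouched.
    state = {"source": None, "researched": None}
    response = requests.post(f"{BASE_URL}/workflow/investigate", json=state)
    response.raise_for_status()

    result = response.json()
    assert result["source"] == "hello world"
    assert result["researched"] is None  # later steps have not run
```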
🎉 You're all set! 🎉