This repository provides a template for creating a Python agent that can be used with the BeeAI Platform.
BeeAI agents are Python-based services that can be run locally or deployed to the BeeAI Platform. Each agent exposes its functionality through the ACP (Agent Communication Protocol), which is implemented via the ACP SDK.
In this template, you'll find:
- A basic agent implementation
- Docker configuration to simplify building the agent
- Project structure for building your own agents
```
├── src/                  # Source code directory
│   └── beeai_agents/     # Python package directory
│       ├── __init__.py   # Package initialization (empty)
│       └── agent.py      # Agent implementations (main file you'll modify)
├── Dockerfile            # Container configuration to build the agent
├── pyproject.toml        # Package metadata and dependencies
├── uv.lock               # Dependency lock file (generated by UV)
└── README.md             # Project documentation
```
Key files to focus on:
- `agent.py`: This is where you'll implement your agent logic
- `pyproject.toml`: Update this if you need to add dependencies
- `Dockerfile`: Modify if you need special build configuration
- BeeAI Platform installed
- Python 3.11 or higher
- UV package manager for dependency management
- Set up your project. Use this repository as a template or fork it to create your own agent.
- Install dependencies with `uv sync`.
- Implement your agent by modifying the source code in `src/beeai_agents/agent.py`.
Here's an example of the included template agent:
```python
import os
from collections.abc import AsyncGenerator

from acp_sdk.models import Message, MessagePart, Metadata
from acp_sdk.server import Context, RunYield, RunYieldResume, Server

server = Server()

@server.agent(metadata=Metadata(ui={"type": "hands-off"}))
async def example_agent(input: list[Message], context: Context) -> AsyncGenerator[RunYield, RunYieldResume]:
    """Polite agent that greets the user"""
    hello_template: str = os.getenv("HELLO_TEMPLATE", "Ciao %s!")
    yield MessagePart(content=hello_template % str(input[-1]))
```
Modify this file to implement your own agent's logic. Here are some key points to consider when creating your agent:
- The function name (example_agent above) is used as the unique id for the agent in the platform. You can override this in the metadata.
- The docstring is used as the agent's description in the platform UI. You can also override this in the metadata.
- The `@server.agent()` decorator registers your function as an agent and can customize its appearance and behavior
- Your agent receives messages in the `input` list, with the most recent message at the end
- Return responses using `yield MessagePart(content="text")` or simply `yield "text"`
- Access conversation context through the `context` parameter
Tip
You can define multiple agents in the same service by creating additional decorated functions.
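For example, here is a minimal sketch of a second agent registered on the same `server` instance; the name `echo_agent` and its behavior are illustrative only, and it reuses the imports from the template snippet above:

```python
@server.agent(name="echo_agent")  # illustrative second agent, not part of the template
async def echo_agent(input: list[Message], context: Context) -> AsyncGenerator[RunYield, RunYieldResume]:
    """Echoes the user's latest message back"""
    # Yielding a plain string is equivalent to yielding a MessagePart with that content
    yield f"You said: {input[-1]}"
```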
To create the most engaging and helpful interface for your users, define the following metadata in your agent decorator. This information shapes how your agent is presented in the GUI.
```python
@server.agent(
    name="chat",
    description=(
        "Conversational agent with memory, supporting real-time search, "
        "Wikipedia lookups, and weather updates through integrated tools"
    ),
    metadata=Metadata(
        ui={
            "type": "chat",
            "user_greeting": "Hello! I'm your AI assistant. How can I help you today?"
        },  # type: ignore[call-arg]
        framework="BeeAI",
        recommended_models=["llama3.3:70b-instruct-fp16"],
        author={
            "name": "John Smith",
            "email": "jsmith@example.com",
            "url": "https://example.com"
        },
        dependencies=[
            {"type": "tool", "name": "Weather"},
            {"type": "tool", "name": "Wikipedia"},
            {"type": "tool", "name": "Google Search"}
        ],
    ),
)
```
Note
The example above highlights the components that directly impact the GUI experience. For the complete metadata specification, see the ACP Agent Detail documentation.
- Update project metadata and dependencies in the `pyproject.toml` file. After updating, synchronize with `uv sync`.
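For example, adding a dependency to `pyproject.toml` might look like the sketch below; `httpx` is used purely as an illustration and the other field values are placeholders, not the template's actual contents:

```toml
[project]
name = "beeai-agents"
version = "0.0.1"
requires-python = ">=3.11"
dependencies = [
    "acp-sdk",  # SDK used by the agent server
    "httpx",    # example of a newly added dependency
]
```

Running `uv sync` afterwards regenerates `uv.lock` with the new dependency.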
To test your agent locally:
```sh
# Start the agent server
uv run server
```
This starts a local HTTP server on http://127.0.0.1:8000 by default. You'll see output similar to:
```
INFO: Started server process [86448]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
```
Your agents are now available at http://localhost:8000. You can verify they are running with the BeeAI CLI:
```sh
# List available agents
beeai list

# Run the example agent
beeai run example_agent "Your Name"
```
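If you want to inspect the raw agent manifest, you can also query the local server's ACP `/agents` endpoint directly (the same endpoint the platform reads agent configuration from, as noted below); the exact response shape depends on your ACP SDK version:

```sh
# Should return a JSON listing that includes example_agent
curl http://127.0.0.1:8000/agents
```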
There are two ways to add your agent to the BeeAI Platform:
When running agents locally with `uv run server`, they are automatically registered with the BeeAI Platform. In this mode:
- The BeeAI Platform will communicate with your local server
- You manage the agent's lifecycle (starting/stopping)
- Changes are immediately available without redeployment
To share your agent with others or deploy it to the BeeAI Platform:
- Push your agent code to a GitHub repository
- Add the agent to BeeAI using: `beeai add https://github.com/your-username/your-repo-name`
The BeeAI Platform will automatically:
- Clone your repository
- Build a Docker image
- Start the agent container
- Extract agent configuration directly from the `/agents` endpoint
- Register the agent in the platform
- For stable versions, use Git tags (e.g., `agents-v0.0.1`)
- When adding a tagged version: `beeai add https://github.com/your-username/your-repo-name@agents-v0.0.1`
- To update: remove the old version (`beeai remove <agent-name>`) and add the new one
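For example, publishing a tagged version could look like this (the tag name simply follows the `agents-v0.0.1` convention above):

```sh
# Tag the current commit and push the tag to GitHub
git tag agents-v0.0.1
git push origin agents-v0.0.1
```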
To check the status of your agents:
```sh
# List all agents and their status
beeai list
```
- Local agents: View logs directly in the terminal where you ran `uv run server`
- Managed agents: Use `beeai logs <agent-id>` to view logs
- BeeAI server logs: Check `/opt/homebrew/var/log/beeai-server.log` (default location)
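For example, to follow the BeeAI server log in real time (assuming the default Homebrew location above):

```sh
tail -f /opt/homebrew/var/log/beeai-server.log
```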