cycls is a zero-config framework for building and publishing AI agents. With a single decorator and one command, you can deploy your code as a web application complete with a front-end UI and an OpenAI-compatible API endpoint.
cycls is an anti-framework. We treat the boilerplate, config files, and infrastructure that surround modern applications as a bug to be eliminated. A developer's focus is the most valuable resource, and context-switching is its greatest enemy.
Our zero-config approach makes your Python script the single source of truth for the entire application. When your code is all you need, you stay focused, iterate faster, and ship with confidence.
This philosophy has a powerful side-effect: it makes development genuinely iterative. The self-contained nature of an agent encourages you to 'build in cycles'—starting simple and adding complexity without penalty. This same simplicity also makes cycls an ideal target for code generation. Because the entire application can be expressed in one file, LLMs can write, modify, and reason about cycls agents far more effectively than with traditional frameworks. It's a seamless interface for both human and machine.
- ✨ Zero-Config Deployment: No YAML or Dockerfiles. cycls infers your dependencies and APIs directly from your Python code.
- 🚀 One-Command Push to Cloud: Go from local code to a globally scalable, serverless application with a single `agent.deploy()`.
- 💻 Instant Local Testing: Run `agent.local()` to spin up a local server with hot-reloading for rapid iteration and debugging.
- 🤖 OpenAI-Compatible API: Automatically serves a streaming `/chat/completions` endpoint.
- 🌐 Automatic Web UI: Get a clean, interactive front-end for your agent out of the box, with no front-end code required.
- 🔐 Built-in Authentication: Secure your agent for production with a simple `auth=True` flag that enables JWT-based authentication.
- 📦 Declarative Dependencies: Define all your `pip`, `apt`, or local file dependencies directly in Python.
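Because the API is OpenAI-compatible, any client that can consume OpenAI's streaming format should work against a cycls agent. As a minimal sketch, here is how a client might reassemble a streamed reply; the sample payload below follows OpenAI's server-sent-event chunk format, which we assume cycls mirrors:

```python
import json

# Sample SSE lines in OpenAI's streaming chunk format (assumption: the cycls
# /chat/completions endpoint emits the same shape, ending with "[DONE]").
sse_lines = [
    'data: {"choices": [{"delta": {"content": "hi"}}]}',
    'data: {"choices": [{"delta": {"content": " there"}}]}',
    "data: [DONE]",
]

def collect_content(lines):
    """Join the delta content of each streamed chunk into the full reply."""
    text = []
    for line in lines:
        payload = line.removeprefix("data: ")
        if payload == "[DONE]":
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        if delta.get("content"):
            text.append(delta["content"])
    return "".join(text)

print(collect_content(sse_lines))  # → hi there
```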
```bash
pip install cycls
```

Create a file `main.py`. This simple example creates an agent that streams back the message "hi".
```python
import cycls

# Initialize the agent
agent = cycls.Agent()

# Decorate your function to register it as an agent
@agent()
async def hello(context):
    yield "hi"

agent.local()
```

Run it from your terminal:

```bash
python main.py
```

This will start a local server. Open your browser to http://127.0.0.1:8000 to interact with your agent.
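The `yield` in `hello` is an ordinary async generator: each yielded chunk becomes a streamed token in the UI and API. The pattern can be exercised on its own, with no server and no cycls installed (passing `context=None` here is purely for illustration):

```python
import asyncio

async def hello(context):
    # Yield chunks exactly as a cycls handler would; each one is streamed.
    for word in ["hi", " there"]:
        yield word

async def main():
    # Consume the generator the way a client consumes the stream.
    chunks = []
    async for chunk in hello(context=None):
        chunks.append(chunk)
    return "".join(chunks)

print(asyncio.run(main()))  # → hi there
```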
This example creates a more advanced agent that calls the OpenAI API. It will be deployed to the cloud with authentication enabled.
```python
# deploy.py
import cycls

# Initialize the agent with dependencies and API keys
agent = cycls.Agent(
    pip=["openai"],
    keys=["ak-<token_id>", "as-<token_secret>"]
)

# A helper function to call the LLM
async def llm(messages):
    # Import inside the function: 'openai' is only needed at runtime in the container.
    import openai
    client = openai.AsyncOpenAI(api_key="sk-...")  # Your OpenAI key
    model = "gpt-4o"
    response = await client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=1.0,
        stream=True,
    )

    # Yield the content from the streaming response
    async def event_stream():
        async for chunk in response:
            content = chunk.choices[0].delta.content
            if content:
                yield content

    return event_stream()

# Register the function as an agent named "cake" and enable auth
@agent("cake", auth=True)
async def cake_agent(context):
    # The context object contains the message history
    return await llm(context.messages)

# Deploy the agent to the cloud
agent.deploy(prod=True)
```

Run the deployment command from your terminal:

```bash
python deploy.py
```

After a few moments, your agent will be live and accessible at a public URL like https://cake.cycls.ai.
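The `context.messages` that `cake_agent` forwards to `llm` is the conversation history. A minimal sketch of its shape, assuming cycls uses the standard OpenAI chat format of role/content dicts (the helper below is hypothetical, not part of cycls):

```python
# Assumed shape of context.messages (standard OpenAI chat message format).
messages = [
    {"role": "user", "content": "Bake me a cake"},
    {"role": "assistant", "content": "Preheat the oven to 180C..."},
    {"role": "user", "content": "Make it chocolate"},
]

def last_user_message(messages):
    """Return the content of the most recent user turn."""
    return next(m["content"] for m in reversed(messages) if m["role"] == "user")

print(last_user_message(messages))  # → Make it chocolate
```

Because this list is already in OpenAI's format, it can be passed to `client.chat.completions.create` unchanged, which is exactly what the `llm` helper above does.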
This project is licensed under the MIT License.