Acontext is a context data platform that:
- Stores contexts & artifacts, using Postgres and S3
- Observes agent tasks and user feedback, and offers a nice Dashboard
- Enables agent self-learning by collecting experiences (or SOPs)
We're building it because we believe Acontext can help you:
- Build a more scalable agent product
- Improve your agent success rate and reduce running steps
so that your agent can be more stable and provide greater value to your users.
How Does Acontext Learn for Your Agents?
How to Start It? π
π means a document link
We have an acontext-cli to help you run a quick proof of concept. Download it first in your terminal:
curl -fsSL https://install.acontext.io | sh
You should have Docker installed and an OpenAI API key to start an Acontext backend on your computer:
acontext docker up
π Acontext requires an LLM provider and an embedding provider. We support the OpenAI and Anthropic SDK formats for LLMs, and the OpenAI and jina.ai embedding API formats.
Once it's done, you can access the following endpoints:
- Acontext API Base URL: http://localhost:8029/api/v1
- Acontext Dashboard: http://localhost:3000/
Dashboard of Success Rate and other Metrics
We maintain Python and TypeScript SDKs. The snippets below use Python.
pip install acontext # for Python
npm i @acontext/acontext # for TypeScript
from acontext import AcontextClient

client = AcontextClient(
    base_url="http://localhost:8029/api/v1",
    api_key="sk-ac-your-root-api-bearer-token",
)
client.ping()
# yes, the default api_key is sk-ac-your-root-api-bearer-token
Acontext can manage your sessions and artifacts.
Save Messages π
Acontext offers persistent storage for message data.
from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from your environment
session = client.sessions.create()
messages = [{"role": "user", "content": "Hello, how are you?"}]
r = openai_client.chat.completions.create(model="gpt-4.1", messages=messages)
print(r.choices[0].message.content)

# persist both sides of the exchange
client.sessions.send_message(session_id=session.id, blob=messages[0])
client.sessions.send_message(session_id=session.id, blob=r.choices[0].message)
π We support the Anthropic SDK as well.
π We support multi-modal message storage.
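If your agent is built on the Anthropic SDK, you can store its responses the same way. Below is a minimal sketch; the format="anthropic" value mirrors the format="openai" flag used later in this guide and is an assumption, so check the SDK reference for the exact identifier.
import anthropic

anthropic_client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from your environment

r = anthropic_client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
# format="anthropic" is an assumption; see the SDK docs for the supported format identifiers
client.sessions.send_message(session_id=session.id, blob=r, format="anthropic")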
Load Messages π
Obtain your session messages:
r = client.sessions.get_messages(session.id)
new_msg = r.items
new_msg.append({"role": "user", "content": "Hello again"})
r = openai_client.chat.completions.create(model="gpt-4.1", messages=new_msg)
print(r.choices[0].message.content)
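Putting save and load together, a simple loop can persist both sides of every exchange. This is a minimal sketch built only from the calls shown above; the helper name chat_turn is just for illustration.
def chat_turn(user_text: str) -> str:
    # load the stored history, call the LLM, then persist both new messages
    history = list(client.sessions.get_messages(session.id).items)
    user_msg = {"role": "user", "content": user_text}
    r = openai_client.chat.completions.create(model="gpt-4.1", messages=history + [user_msg])
    client.sessions.send_message(session_id=session.id, blob=user_msg)
    client.sessions.send_message(session_id=session.id, blob=r.choices[0].message)
    return r.choices[0].message.content

print(chat_turn("Summarize our conversation so far."))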
You can view sessions in your local Dashboard.
Artifacts π
Create a disk for your agent to store and read artifacts using file paths:
from acontext import FileUpload

disk = client.disks.create()

file = FileUpload(
    filename="todo.md",
    content=b"# Sprint Plan\n\n## Goals\n- Complete user authentication\n- Fix critical bugs"
)
artifact = client.disks.artifacts.upsert(
    disk.id,
    file=file,
    file_path="/todo/"
)

print(client.disks.artifacts.list(
    disk.id,
    path="/todo/"
))

result = client.disks.artifacts.get(
    disk.id,
    file_path="/todo/",
    filename="todo.md",
    with_public_url=True,
    with_content=True
)
print(f"File content: {result.content.raw}")
print(f"Download URL: {result.public_url}")
You can view artifacts in your local Dashboard.
Observe π
For every session, Acontext will launch a background agent to track the task progress and user feedback.
You can use the SDK to retrieve the current state of the agent session.
from acontext import AcontextClient

# Initialize client
client = AcontextClient(
    base_url="http://localhost:8029/api/v1", api_key="sk-ac-your-root-api-bearer-token"
)

# Create a session
session = client.sessions.create()

# Conversation messages
messages = [
    {"role": "user", "content": "I need to write a landing page of iPhone 15 pro max"},
    {
        "role": "assistant",
        "content": "Sure, my plan is below:\n1. Search for the latest news about iPhone 15 pro max\n2. Init Next.js project for the landing page\n3. Deploy the landing page to the website",
    },
    {
        "role": "user",
        "content": "That sounds good. Let's first collect the message and report to me before any landing page coding.",
    },
    {
        "role": "assistant",
        "content": "Sure, I will first collect the message then report to you before any landing page coding.",
    },
]

# Send messages in a loop
for msg in messages:
    client.sessions.send_message(session_id=session.id, blob=msg, format="openai")

# Wait for task extraction to complete
client.sessions.flush(session.id)

# Display extracted tasks
tasks_response = client.sessions.get_tasks(session.id)
print(tasks_response)

for task in tasks_response.items:
    print(f"\nTask #{task.order}:")
    print(f"  ID: {task.id}")
    print(f"  Title: {task.data['task_description']}")
    print(f"  Status: {task.status}")

    # Show progress updates if available
    if "progresses" in task.data:
        print(f"  Progress updates: {len(task.data['progresses'])}")
        for progress in task.data["progresses"]:
            print(f"    - {progress}")

    # Show user preferences if available
    if "user_preferences" in task.data:
        print("  User preferences:")
        for pref in task.data["user_preferences"]:
            print(f"    - {pref}")
You can view each session's task statuses in the Dashboard:
A Task Demo
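If you want a quick programmatic summary instead of opening the Dashboard, you can group the extracted tasks by status. A minimal sketch; the exact status values depend on your Acontext version.
from collections import Counter

tasks_response = client.sessions.get_tasks(session.id)
status_counts = Counter(task.status for task in tasks_response.items)
for status, count in status_counts.items():
    print(f"{status}: {count}")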
Acontext can gather a bunch of sessions and learn skills (SOPs) on how to call tools for certain tasks.
Learn Skills into a Space π
A Space can store skills, experiences, and memories in a Notion-like system.
# Step 1: Create a Space for skill learning
space = client.spaces.create()
print(f"Created Space: {space.id}")

# Step 2: Create a session attached to the space
session = client.sessions.create(space_id=space.id)

# ... push the agent working context
The learning happens in the background and is not real-time (delay around 10-30s).
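For example, you can replay an agent's working context into the attached session with the same send_message call used in the Observe section; a minimal sketch with made-up messages:
working_context = [
    {"role": "user", "content": "Add OAuth login to the web app"},
    {"role": "assistant", "content": "Plan:\n1. Register the OAuth app\n2. Add the callback route\n3. Store tokens securely"},
]
for msg in working_context:
    client.sessions.send_message(session_id=session.id, blob=msg, format="openai")

# flush waits for task extraction; Space learning itself may still take 10-30s
client.sessions.flush(session.id)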
You can view every Space in the Dashboard:
A Space Demo
Search Skills from a Space π
To search for skills in a Space and use them in the next session:
result = client.spaces.experience_search(
    space_id=space.id,
    query="I need to implement authentication",
    mode="fast"
)
Acontext supports fast and agentic search modes. The former uses embeddings to match skills; the latter uses a Notion Agent to explore the entire Space and tries to cover every skill needed.
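A common pattern is to feed the retrieved skills into the system prompt of the next run. A minimal sketch; rendering the result with str() is an assumption, so inspect the result object for the exact schema.
skills_text = str(result)  # assumption: the search result renders usefully as text

next_messages = [
    {"role": "system", "content": f"Relevant skills from previous sessions:\n{skills_text}"},
    {"role": "user", "content": "I need to implement authentication"},
]
r = openai_client.chat.completions.create(model="gpt-4.1", messages=next_messages)
print(r.choices[0].message.content)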
Star Acontext on GitHub to support the project and receive instant notifications ❤️
Join the community for support and discussions: