87 changes: 87 additions & 0 deletions README.md
# runtimeuse

Run AI agents inside sandboxes over WebSocket. Two packages:

| Package | Language | Role | Install |
| ---------------------------------------------------------- | ---------- | ------------------------------------------ | ------------------------------- |
| [`runtimeuse`](./packages/runtimeuse) | TypeScript | Agent runtime (runs inside the sandbox) | `npm install runtimeuse` |
| [`runtimeuse-client`](./packages/runtimeuse-client-python) | Python | Client (connects from outside the sandbox) | `pip install runtimeuse-client` |

## Quick Start

### 1. Start the runtime (inside a sandbox)

```bash
npx -y runtimeuse@latest
```

This starts a WebSocket server on port 8080 using the OpenAI agent handler by default. Use `--agent claude` for Claude.
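If you launch the runtime from a supervising process, you generally want to wait for port 8080 to accept connections before pointing a client at it. A minimal sketch of that polling step (the helper name and strategy are our own, not part of runtimeuse):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until a TCP port accepts connections, or the timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.2)  # port not accepting yet; retry shortly
    return False
```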

### 2. Connect from Python

```python
import asyncio
import json
from runtimeuse_client import RuntimeUseClient, InvocationMessage, ResultMessageInterface

async def main():
    client = RuntimeUseClient(ws_url="ws://localhost:8080")

    invocation = InvocationMessage(
        message_type="invocation_message",
        source_id="my-run-001",
        preferred_model="gpt-4.1",
        system_prompt="You are a helpful assistant.",
        user_prompt="What is 2 + 2?",
        output_format_json_schema_str=json.dumps({
            "type": "json_schema",
            "schema": {
                "type": "object",
                "properties": {"answer": {"type": "string"}},
            },
        }),
        secrets_to_redact=[],
        agent_env={},
    )

    async def on_result(result: ResultMessageInterface):
        print(result.structured_output)

    await client.invoke(
        invocation=invocation,
        on_result_message=on_result,
        result_message_cls=ResultMessageInterface,
    )

asyncio.run(main())
```

### 3. Or use the runtime programmatically (TypeScript)

```typescript
import { RuntimeUseServer, openaiHandler } from "runtimeuse";

const server = new RuntimeUseServer({ handler: openaiHandler, port: 8080 });
await server.start();
```

## How It Works

```
Python Client ──WebSocket──> Runtime (in sandbox) ──> AgentHandler
                                                       ├── openai (default)
                                                       └── claude
```

1. The client sends an `InvocationMessage` over WebSocket
2. The runtime downloads files and runs pre-commands (if any)
3. The `AgentHandler` executes the agent with the given prompts and model
4. Intermediate `AssistantMessage`s stream back to the client
5. Files in the artifacts directory are auto-detected and uploaded via presigned URL handshake
6. A final `ResultMessage` with structured output is sent back
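The per-message routing in the steps above can be sketched as a loop that dispatches each incoming JSON frame by its `message_type` field. This is illustrative only, not the client's actual implementation, and the `assistant_message` / `result_message` type strings are assumptions modeled on `invocation_message`:

```python
import json

def dispatch(frames, handlers):
    """Route raw JSON frames to handlers keyed by message_type.

    Returns the parsed result message once one arrives.
    """
    for frame in frames:
        msg = json.loads(frame)
        kind = msg.get("message_type")
        if kind in handlers:
            handlers[kind](msg)
        if kind == "result_message":
            return msg  # the final message ends the loop
    return None
```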

See the [runtime README](./packages/runtimeuse/README.md) and [client README](./packages/runtimeuse-client-python/README.md) for full API docs.

## License

BSL-1.0
8 changes: 8 additions & 0 deletions packages/runtimeuse-client-python/.gitignore
__pycache__/
.pytest_cache/
venv/
.venv/
.env/
env/
.env
scratch_cli*.py
149 changes: 149 additions & 0 deletions packages/runtimeuse-client-python/README.md
# runtimeuse (Python)

Python client library for communicating with a [runtimeuse](https://github.com/getlark/runtimeuse) agent runtime over WebSocket.

Handles the WebSocket connection lifecycle, message dispatch, artifact upload handshake, cancellation, and structured result parsing -- so you can focus on what to do with agent results rather than wire protocol details.

## Installation

```bash
pip install runtimeuse-client
```

## Quick Start

Start the runtime inside any sandbox, then connect from outside:

```python
import asyncio
from runtimeuse_client import RuntimeUseClient, InvocationMessage, ResultMessageInterface

async def main():
    # Start the runtime in a sandbox (Sandbox stands in for your
    # provider-specific sandbox API)
    sandbox = Sandbox.create()
    sandbox.run("npx -y runtimeuse")
    ws_url = sandbox.get_url(8080)

    # Connect and invoke
    client = RuntimeUseClient(ws_url=ws_url)

    invocation = InvocationMessage(
        message_type="invocation_message",
        source_id="my-run-001",
        system_prompt="You are a helpful assistant.",
        user_prompt="Do the thing and return the result.",
        output_format_json_schema_str='{"type":"json_schema","schema":{"type":"object"}}',
        secrets_to_redact=["sk-secret-key"],
        agent_env={"API_KEY": "sk-secret-key"},
    )

    async def on_result(result: ResultMessageInterface):
        print(f"Success: {result.structured_output.get('success')}")
        print(f"Output: {result.structured_output}")

    await client.invoke(
        invocation=invocation,
        on_result_message=on_result,
        result_message_cls=ResultMessageInterface,
    )

asyncio.run(main())
```

For local development without a sandbox, connect directly:

```python
client = RuntimeUseClient(ws_url="ws://localhost:8080")
```

## Usage

### RuntimeUseClient

Manages the WebSocket connection to the agent runtime and runs the message loop: sends an invocation, iterates the response stream, and dispatches typed messages to your callbacks.

```python
client = RuntimeUseClient(ws_url="ws://localhost:8080")

await client.invoke(
    invocation=invocation,
    on_result_message=on_result,
    result_message_cls=ResultMessageInterface,
    on_assistant_message=on_assistant,       # optional
    on_artifact_upload_request=on_artifact,  # optional -- return (presigned_url, content_type)
    on_error_message=on_error,               # optional
    is_cancelled=check_cancelled,            # optional -- async () -> bool
    timeout=300,                             # optional -- seconds
)
```

### Artifact Upload Handshake

When the agent runtime requests an artifact upload, provide a callback that returns a presigned URL and content type. The client sends the response back automatically.

```python
from runtimeuse_client import ArtifactUploadRequestMessageInterface

async def on_artifact(request: ArtifactUploadRequestMessageInterface) -> tuple[str, str]:
    presigned_url = await my_storage.create_presigned_url(request.filename)
    content_type = guess_content_type(request.filename)
    return presigned_url, content_type
```
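The `guess_content_type` helper above is yours to supply; one way to back it is the standard library's `mimetypes` module (the storage call stays whatever your object store provides, and `my_storage` is hypothetical):

```python
import mimetypes

def guess_content_type(filename: str) -> str:
    """Map a filename to a MIME type, defaulting to a binary stream."""
    content_type, _ = mimetypes.guess_type(filename)
    return content_type or "application/octet-stream"
```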

### Cancellation

Pass an `is_cancelled` callback to cancel a running invocation. When it returns `True`, the client sends a cancel message to the runtime and raises `CancelledException`.

```python
from runtimeuse_client import CancelledException

async def check_cancelled() -> bool:
    return await db.is_run_cancelled(run_id)

try:
    await client.invoke(
        invocation=invocation,
        on_result_message=on_result,
        result_message_cls=ResultMessageInterface,
        is_cancelled=check_cancelled,
    )
except CancelledException:
    print("Run was cancelled")
```
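The predicate does not have to hit a database; any async callable returning a bool works. For example, a wall-clock deadline (our own sketch, not a runtimeuse API):

```python
import time

def make_deadline_check(seconds: float):
    """Return an async predicate that reports True once the deadline passes."""
    deadline = time.monotonic() + seconds

    async def check_cancelled() -> bool:
        return time.monotonic() > deadline

    return check_cancelled
```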

### Custom Result Types

Subclass `ResultMessageInterface` to add domain-specific fields:

```python
from runtimeuse_client import ResultMessageInterface

class MyResultMessage(ResultMessageInterface):
    custom_score: float | None = None

await client.invoke(
    invocation=invocation,
    on_result_message=handle_my_result,
    result_message_cls=MyResultMessage,
)
```

## API Reference

### Message Types

| Class | Description |
| ----------------------------------------- | ------------------------------------------------------ |
| `InvocationMessage` | Sent to the runtime to start an agent invocation |
| `ResultMessageInterface` | Structured result from the agent |
| `AssistantMessageInterface` | Intermediate assistant text messages |
| `ArtifactUploadRequestMessageInterface` | Runtime requesting a presigned URL for artifact upload |
| `ArtifactUploadResponseMessageInterface` | Response with presigned URL sent back to runtime |
| `ErrorMessageInterface` | Error from the agent runtime |
| `CancelMessage` | Sent to cancel a running invocation |
| `CommandInterface` | Pre/post invocation shell command |
| `RuntimeEnvironmentDownloadableInterface` | File to download into the runtime before invocation |

### Exceptions

| Class | Description |
| -------------------- | ------------------------------------------- |
| `CancelledException` | Raised when `is_cancelled()` returns `True` |
@@ -0,0 +1,61 @@
import os
import asyncio
import json
from pydantic import BaseModel

from src.runtimeuse_client import (
    AssistantMessageInterface,
    ErrorMessageInterface,
    RuntimeUseClient,
    InvocationMessage,
    ResultMessageInterface,
    CommandInterface,
)


class Answer(BaseModel):
    answer: str


async def main():
    client = RuntimeUseClient(ws_url="ws://localhost:8080")
    invocation = InvocationMessage(
        message_type="invocation_message",
        source_id="my-source",
        preferred_model="gpt-5.4",
        pre_agent_invocation_commands=[
            CommandInterface(
                command="echo 'Hello, world!'",
                cwd=os.getcwd(),
                env={},
            )
        ],
        system_prompt="You are a helpful assistant.",
        user_prompt="Search the web to find the answer to the question: 'What is the population of France grouped by region? Once you find the answer, run a python script to compute the sum of the total population of France.'",
        output_format_json_schema_str=json.dumps(
            {"type": "json_schema", "schema": Answer.model_json_schema()}
        ),
        secrets_to_redact=[],
        agent_env={},
    )

    async def on_result(result: ResultMessageInterface):
        print(f"Result: {result.structured_output}")

    async def on_assistant_message(message: AssistantMessageInterface):
        print(f"Assistant message: {message.text_blocks}")

    async def on_error_message(message: ErrorMessageInterface):
        print(f"Error message: {message.error}")

    await client.invoke(
        invocation=invocation,
        on_result_message=on_result,
        result_message_cls=ResultMessageInterface,
        on_assistant_message=on_assistant_message,
        on_error_message=on_error_message,
    )


if __name__ == "__main__":
    asyncio.run(main())
18 changes: 18 additions & 0 deletions packages/runtimeuse-client-python/pyproject.toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[project]
name = "runtimeuse-client"
version = "0.3.0"
description = "Client library for AI agent runtime communication over WebSocket"
readme = "README.md"
license = "BSL-1.0"
requires-python = ">=3.10"
dependencies = [
"pydantic>=2.0",
"websockets>=12.0",
]

[tool.hatch.build.targets.wheel]
packages = ["src/runtimeuse_client"]
@@ -0,0 +1,32 @@
from .client import RuntimeUseClient
from .transports import Transport, WebSocketTransport
from .exceptions import CancelledException
from .types import (
    AgentRuntimeMessageInterface,
    RuntimeEnvironmentDownloadableInterface,
    CommandInterface,
    InvocationMessage,
    ResultMessageInterface,
    AssistantMessageInterface,
    ArtifactUploadRequestMessageInterface,
    ArtifactUploadResponseMessageInterface,
    ErrorMessageInterface,
    CancelMessage,
)

__all__ = [
    "RuntimeUseClient",
    "Transport",
    "WebSocketTransport",
    "CancelledException",
    "AgentRuntimeMessageInterface",
    "RuntimeEnvironmentDownloadableInterface",
    "CommandInterface",
    "InvocationMessage",
    "ResultMessageInterface",
    "AssistantMessageInterface",
    "ArtifactUploadRequestMessageInterface",
    "ArtifactUploadResponseMessageInterface",
    "ErrorMessageInterface",
    "CancelMessage",
]