TinyClaw

TinyClaw is an edge fact node runtime for turning raw device signals into stable, observable business facts.

The current first-class profile is occupancy-camera-v1: a camera-driven occupancy node that detects people, applies debounce and cooldown, models device health, and exposes the result over MCP.

What TinyClaw Is

TinyClaw is not just a camera wrapper and not just an MCP adapter.

It sits between the physical edge world and northbound protocol consumers:

  • Southbound: camera frames, detector outputs, device timing, runtime health
  • Middle layer: semantic convergence, state transitions, heartbeat, recent events
  • Northbound: MCP tools, resources, notifications, stdio, and streamable HTTP

In short:

  • MCP is the northbound protocol
  • TinyClaw is the semantic runtime that decides what facts are worth publishing

Current Profile

Current default profile:

  • occupancy-camera-v1

Current published facts:

  • occupancy.changed
  • device.heartbeat
  • device.health_changed

Current MCP surface:

  • Tools:
    • tinyclaw.get_occupancy_status
    • tinyclaw.capture_frame
    • tinyclaw.get_config
    • tinyclaw.get_device_health
    • tinyclaw.set_config
    • tinyclaw.reset_state
  • Resources:
    • tinyclaw://scene/{scene_id}/state
    • tinyclaw://scene/{scene_id}/stats
    • tinyclaw://scene/{scene_id}/events/recent
    • tinyclaw://scene/{scene_id}/health
  • Notifications:
    • occupancy.changed
    • device.heartbeat
    • device.health_changed

Compatibility:

  • tinyclaw.get_occupancy is still accepted as an alias for tinyclaw.get_occupancy_status
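Since MCP is JSON-RPC 2.0 under the hood, calling one of the tools above amounts to sending a `tools/call` request. A minimal sketch of that payload, assuming the `scene_id` argument (the tool's actual parameters are not documented here):

```python
import json

# JSON-RPC 2.0 request for an MCP "tools/call" invocation.
# The tool name comes from the list above; "scene_id" is a
# hypothetical argument, not confirmed by this README.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "tinyclaw.get_occupancy_status",
        "arguments": {"scene_id": "lobby"},  # hypothetical argument
    },
}
print(json.dumps(request))
```

The same envelope works for the alias `tinyclaw.get_occupancy`, since the server accepts both names.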

Architecture

TinyClaw V0.3 is structured in five layers:

  1. Input Plugin: brings raw world signals into the node. The current concrete implementation is a camera frame source.
  2. Semantic Profile: owns profile-specific convergence. For occupancy-camera-v1, that includes detection-to-candidate translation, debounce, cooldown, transition generation, and profile snapshots.
  3. Node Runtime: owns lifecycle, health, heartbeat, runtime snapshots, and loop execution.
  4. TinyClaw Server: owns MCP tools/resources/notifications, recent events, and publishing.
  5. Transport Adapter: maps the MCP server onto stdio or streamable HTTP.
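The layer split above can be sketched as a few interfaces. This is an illustrative shape only; the class and method names are assumptions, not TinyClaw's actual internal API:

```python
from dataclasses import dataclass
from typing import Iterator, Protocol


@dataclass
class Fact:
    kind: str       # e.g. "occupancy.changed" or "device.heartbeat"
    payload: dict


class InputPlugin(Protocol):
    """Layer 1: brings raw world signals (e.g. camera frames) into the node."""
    def frames(self) -> Iterator[object]: ...


class SemanticProfile(Protocol):
    """Layer 2: converges raw signals into zero or more stable facts."""
    def converge(self, frame: object) -> list[Fact]: ...


class NodeRuntime:
    """Layer 3: drives the loop, pulling frames through the profile and
    handing resulting facts to a publish callback (layer 4, the server)."""

    def __init__(self, source: InputPlugin, profile: SemanticProfile, publish):
        self.source = source
        self.profile = profile
        self.publish = publish

    def run_once(self) -> int:
        published = 0
        for frame in self.source.frames():
            for fact in self.profile.converge(frame):
                self.publish(fact)  # server exposes it via tools/resources/notifications
                published += 1
        return published
```

The transport adapter (layer 5) then only sees the server, never the runtime or the profile, which is what makes stdio and streamable HTTP interchangeable.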


Why It Is Different From a Simple MCP Adapter

A simple device MCP adapter usually exposes raw actions like:

  • capture a frame
  • run a detector
  • read device status

TinyClaw goes one level higher. It publishes stabilized facts:

  • occupancy transitions
  • device heartbeat
  • health transitions
  • recent event context

That means it does more than protocol translation:

  • state memory
  • debounce/cooldown
  • health modeling
  • event buffering
  • runtime configuration
  • multi-transport exposure
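Debounce and cooldown are the core of "stabilized facts": a transition is only published after the detector agrees for several consecutive ticks, and once published, further flips are suppressed for a cooldown window. A minimal sketch of that state machine, with illustrative thresholds (not TinyClaw's actual values or class names):

```python
class OccupancyDebouncer:
    """Sketch of debounce + cooldown over per-tick boolean detections.
    Thresholds and the event shape are illustrative assumptions."""

    def __init__(self, debounce_ticks: int = 3, cooldown_ticks: int = 5):
        self.debounce_ticks = debounce_ticks
        self.cooldown_ticks = cooldown_ticks
        self.occupied = False
        self._streak = 0    # consecutive ticks disagreeing with current state
        self._cooldown = 0  # ticks remaining before another flip is allowed

    def update(self, detected: bool):
        """Return an occupancy.changed payload on a stable flip, else None."""
        if self._cooldown > 0:
            self._cooldown -= 1
            return None
        if detected != self.occupied:
            self._streak += 1
            if self._streak >= self.debounce_ticks:
                self.occupied = detected
                self._streak = 0
                self._cooldown = self.cooldown_ticks
                return {"event": "occupancy.changed", "occupied": detected}
        else:
            self._streak = 0  # agreement resets the pending flip
        return None
```

A single noisy frame never flips the state, and a burst of flapping right after a transition is absorbed by the cooldown.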

Install

cd /path/to/tinyclaw
.venv/bin/pip install -e ".[dev]"

After an editable install, the project no longer depends on PYTHONPATH=src.

Run

Start the MCP server over stdio:

tinyclaw-mcp --config config/occupancy.yaml

Start the MCP server over streamable HTTP:

tinyclaw-mcp --config config/occupancy.yaml --transport http --host 127.0.0.1 --port 8000

Run the local demo loop without MCP:

tinyclaw --demo --config config/occupancy.yaml

Demo Paths

TinyClaw currently ships with three demo layers:

  1. demo-e2e: pure deterministic internal driving for service semantics.
  2. demo-frames: semi-real visual input via fixture frames plus a mock detector.
  3. demo-real-camera: live device demonstration with a real camera and optional fault injection.

E2E Demo

.venv/bin/tinyclaw demo-e2e \
  --config config/occupancy.yaml \
  --scenario fixtures/e2e_happy_path.json \
  --time-scale 0.01 \
  --heartbeat-interval 2 \
  --dump-events /tmp/tinyclaw_e2e_events.json
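The scenario file drives the node deterministically. Its actual schema is not documented in this README; a plausible shape, purely as an illustration of what a timestamped detection script might look like:

```python
import json

# Hypothetical scenario shape; the real schema of
# fixtures/e2e_happy_path.json may differ.
scenario = json.loads("""
{
  "profile": "occupancy-camera-v1",
  "steps": [
    {"t": 0.0, "detections": 0},
    {"t": 1.0, "detections": 1},
    {"t": 5.0, "detections": 0}
  ]
}
""")
for step in scenario["steps"]:
    print(step["t"], step["detections"])
```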

Frame Demo

.venv/bin/tinyclaw demo-frames \
  --config config/occupancy.yaml \
  --frames fixtures/frames/happy_path \
  --speed 100 \
  --heartbeat-interval 2 \
  --dump-events /tmp/tinyclaw_frame_events.json

Real Camera Demo

.venv/bin/tinyclaw demo-real-camera \
  --config config/demo_real_camera.yaml \
  --duration 30 \
  --dump-events /tmp/tinyclaw_real_events.json \
  --heartbeat-interval 2 \
  --simulate-fault camera_timeout \
  --no-display


Development

Run tests:

pytest -q

Run lint:

ruff check .

Minimal remote MCP client smoke example:

python examples/http_client_smoke.py
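A smoke client against the streamable HTTP transport boils down to posting the MCP `initialize` handshake. A stdlib-only sketch, assuming the server from the Run section is listening on 127.0.0.1:8000 and a conventional `/mcp` endpoint path (the actual path and protocol version string are assumptions):

```python
import json
import urllib.request


def build_initialize_body() -> bytes:
    """JSON-RPC 2.0 'initialize' request per the MCP handshake."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 0,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # example version string
            "capabilities": {},
            "clientInfo": {"name": "smoke", "version": "0.0.1"},
        },
    }).encode()


def smoke(url: str = "http://127.0.0.1:8000/mcp") -> int:
    # Only works against a running tinyclaw-mcp server; endpoint path assumed.
    req = urllib.request.Request(
        url,
        data=build_initialize_body(),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status
```

A non-error status from `smoke()` is enough to confirm the transport is up; real clients would continue the handshake and then list or call tools.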

Current Status

What is already done:

  • first-class occupancy-camera-v1 profile
  • runtime/server/transport split
  • stdio and streamable HTTP transports
  • health model and heartbeat
  • recent events buffer
  • deterministic, frame-based, and real-camera demos

What is not done yet:

  • multi-profile runtime
  • multi-device orchestration
  • package/manifest-based declarative assembly
  • long-term persistence
  • cloud registry or package marketplace

Next Direction

The next planned product shape is a declarative node assembly platform:

  • device_package.yaml
  • node_manifest.yaml
  • registry
  • assembler
  • validator

That work builds on the current V0.3 runtime split instead of replacing it.
