Let AI Control Real Robots: From Code to Physical World in One Line
🧠 One `pip install` away from giving your AI agent a real body.

RobotClaw is a Python robotics framework that bridges AI agents with physical robots. It turns high-level AI commands into real-world servo movements, enabling your AI to walk, wave, dance, and interact with the physical world.
Built for the OpenClaw AI agent platform. Compatible with LOBOT LX series bus servos. Designed for makers, researchers, and anyone who wants to bring robots to life.
```bash
pip install robotclaw
```

```python
# Your AI agent says: "Make the robot wave"
# RobotClaw makes it happen in the real world:
from robotclaw import Robot, DEFAULT_CONFIG

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    robot.go_home(time_ms=2000)            # Stand up
    robot.set_joint("left_hip_pitch",      # Move a joint
                    position=350,
                    time_ms=500)
```

👤 User: "Return the robot to its home position"
🧠 AI: → `robot.go_home(2000)` ✅

👤 User: "Play the walking motion at 2x speed"

🧠 AI: → `player.play(walk_clip, speed=2.0)` ✅

👤 User: "Check the voltage of all servos"

🧠 AI: → `robot.scan()` → reports voltage status ✅
No robotics PhD required. If you can write Python, you can control a robot.
| Feature | Description |
|---|---|
| 🧠 AI-Native | First-class integration with OpenClaw: AI agents control robots via natural language |
| 🔌 Plug & Play | Connect USB, pip install, and you're controlling servos in 3 lines of code |
| 🎬 Motion Teaching | Record human demonstrations, save as JSON, replay at any speed |
| 🤖 High-Level API | Think in joints (left_hip_pitch), not raw servo IDs |
| 🛠️ CLI Tools | robotclaw-scan for diagnostics, robotclaw-teach for interactive teaching |
| ⚡ Thread-Safe | Mutex-protected serial communication, safe to call from multiple threads |
| 🐍 Pure Python | Zero compiled dependencies: runs anywhere Python runs |
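The thread-safety row above comes down to one idea: every command is a write-then-read transaction on a single shared serial port, so a lock must make each transaction atomic. Here is a minimal sketch of that pattern with invented names and a stand-in for the port; it is not RobotClaw's actual internals:

```python
import threading

class SafeBus:
    """Sketch: one lock guards every transaction on the shared port."""

    def __init__(self, port):
        self._port = port               # stand-in for a real serial handle
        self._lock = threading.Lock()
        self.log = []                   # records completed commands

    def move(self, servo_id, position):
        # The lock makes the write+ack pair atomic, so two threads can't
        # interleave bytes of different commands on the wire.
        with self._lock:
            self.log.append((servo_id, position))

bus = SafeBus("COM3")
threads = [threading.Thread(target=bus.move, args=(i, 500)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(bus.log))  # 4: every command recorded exactly once
```

The same structure works with a real pyserial port in place of the log list; only the critical section changes.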
```bash
pip install robotclaw
```

```python
from robotclaw import ServoBus

bus = ServoBus()
bus.connect("COM3", baudrate=115200)

# Move servo to position 500 in 1 second
bus.move(servo_id=1, position=500, time_ms=1000)

# Read current position
pos = bus.read_position(servo_id=1)
print(f"Servo position: {pos}")

bus.disconnect()
```

```python
from robotclaw import Robot, DEFAULT_CONFIG

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    # All 10 joints to home position
    robot.go_home(time_ms=2000)

    # Control individual joints by name
    robot.set_joint("left_hip_pitch", position=350, time_ms=500)
    robot.set_joint("right_knee", position=600, time_ms=500)

    # Read all joint positions at once
    positions = robot.get_positions()
    print(positions)
```

No programming needed: just physically move the robot and record:
```python
from robotclaw import Robot, DEFAULT_CONFIG
from robotclaw.recorder import MotionRecorder, MotionPlayer

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    # Step 1: Record a motion by posing the robot
    recorder = MotionRecorder(robot)
    recorder.start_recording("wave_hand")

    robot.unload_all()  # Release servos so you can pose by hand
    input("Pose the robot, then press Enter to capture...")
    recorder.capture_frame()
    input("Next pose, then press Enter...")
    recorder.capture_frame()

    clip = recorder.finish_recording()
    recorder.save(clip, "motions/wave_hand.json")

    # Step 2: Replay at any speed
    robot.load_all()
    player = MotionPlayer(robot)
    player.play(clip, speed=1.5)  # 1.5x speed playback
```

```bash
# Discover all connected servos
robotclaw-scan --port COM3

# Interactive motion teaching terminal
robotclaw-teach --port COM3
```

RobotClaw is designed from the ground up for AI-driven robotics. It ships with a built-in OpenClaw skill that lets AI agents understand and control the robot through natural language:
```python
from robotclaw.openclaw_skill import OpenClawSkill

# Register with your AI agent
skill = OpenClawSkill(port="COM3")

# Now your AI can:
# - Move joints by name
# - Play recorded motions
# - Read sensor data (position, voltage, temperature)
# - Perform diagnostic scans
# - Chain complex movement sequences
```

Use Cases:
- 🎓 Education: Students learn robotics through conversation with AI
- 🏭 Prototyping: Rapidly test robot behaviors via natural language
- 🎮 Entertainment: AI-controlled robot performances and interactions
- 🔬 Research: Quickly iterate on movement patterns without manual coding
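To make the agent loop concrete, here is a toy sketch of how an agent might map parsed intents onto RobotClaw calls. Everything here is invented for illustration: the intent names, the dispatcher, and the stub robot. A real OpenClaw agent would emit structured tool calls against the actual skill:

```python
# Illustrative only: a toy intent-to-action dispatcher with a stub robot.
class StubRobot:
    """Records calls instead of driving hardware."""
    def __init__(self):
        self.calls = []

    def go_home(self, time_ms):
        self.calls.append(("go_home", time_ms))

    def set_joint(self, name, position, time_ms):
        self.calls.append(("set_joint", name, position, time_ms))

# Map parsed intents to robot actions (hypothetical intent names)
ACTIONS = {
    "go_home": lambda r, a: r.go_home(a.get("time_ms", 2000)),
    "set_joint": lambda r, a: r.set_joint(
        a["name"], a["position"], a.get("time_ms", 500)),
}

def dispatch(robot, intent, args):
    """Route one parsed intent to the matching robot call."""
    ACTIONS[intent](robot, args)

robot = StubRobot()
dispatch(robot, "go_home", {})
dispatch(robot, "set_joint", {"name": "left_hip_pitch", "position": 350})
print(robot.calls)
```

Swapping `StubRobot` for a real `Robot` instance is the whole integration surface: the dispatcher never needs to know whether it is talking to hardware.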
| Component | Specification |
|---|---|
| Servos | LOBOT LX-16A / LX-224 / LX-225 bus servos |
| Interface | USB-to-TTL serial adapter |
| Baudrate | 115200 (direct) or 9600 (controller board) |
| Power | 6–8.4 V DC, ≥5 A recommended for 10 servos |
| Default Config | 10 servos, 5 per leg (biped robot) |
💡 Tip: More servo types and robot configurations coming soon. PRs welcome!
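The default config targets the 10-servo biped above, but `robot_config.py` exposes `JointConfig`, `LegConfig`, and `RobotConfig` for custom builds. The real fields aren't documented here, so the following is a guessed-at sketch of the shape using plain dataclasses, not the actual API:

```python
from dataclasses import dataclass, field

# Hypothetical shapes: field names are guesses, not RobotClaw's real ones.
@dataclass
class JointConfig:
    name: str
    servo_id: int
    home: int = 500      # neutral position in servo ticks (0-1000 range)

@dataclass
class RobotConfig:
    joints: list = field(default_factory=list)

    def joint_by_name(self, name):
        """Look up a joint by its human-readable name."""
        return next(j for j in self.joints if j.name == name)

cfg = RobotConfig(joints=[
    JointConfig("left_hip_pitch", servo_id=1),
    JointConfig("right_knee", servo_id=7, home=600),
])
print(cfg.joint_by_name("right_knee").servo_id)  # 7
```

The point of the indirection is the one the High-Level API row promises: code addresses `left_hip_pitch`, and only the config knows it maps to servo ID 1.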
```
src/robotclaw/
├── __init__.py          # Public API: ServoBus, Robot, DEFAULT_CONFIG
├── servo_bus.py         # LOBOT LX protocol (the low-level driver)
├── robot_config.py      # JointConfig, LegConfig, RobotConfig
├── robot.py             # High-level robot controller
├── cli.py               # CLI: robotclaw-scan, robotclaw-teach
├── openclaw_skill.py    # OpenClaw AI agent skill adapter
├── SKILL.md             # OpenClaw skill descriptor
└── recorder/            # Motion teaching subsystem
    ├── motion_data.py   # Keyframe & MotionClip data models
    ├── recorder.py      # Record motions from physical teaching
    └── player.py        # Play back motions with speed control
```
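Since the recorder saves clips as JSON, the `Keyframe` and `MotionClip` models in `motion_data.py` presumably serialize to something along these lines. This is a speculative sketch of the shape, not the project's actual schema:

```python
import json
from dataclasses import dataclass, asdict

# Speculative data model: field names are illustrative, not the real schema.
@dataclass
class Keyframe:
    positions: dict      # joint name -> servo position
    duration_ms: int     # time to move into this frame

@dataclass
class MotionClip:
    name: str
    frames: list

    def to_json(self):
        """Serialize the clip to a JSON string."""
        return json.dumps({"name": self.name,
                           "frames": [asdict(f) for f in self.frames]})

clip = MotionClip("wave_hand", [
    Keyframe({"left_shoulder": 700}, 500),
    Keyframe({"left_shoulder": 300}, 500),
])
data = json.loads(clip.to_json())
print(data["name"], len(data["frames"]))  # wave_hand 2
```

A keyframe format like this makes speed scaling trivial: playback at `speed=1.5` just divides each `duration_ms` by 1.5.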
- LOBOT LX bus servo protocol driver
- High-level joint-based robot API
- Motion recording & playback system
- OpenClaw AI agent integration
- CLI diagnostic & teaching tools
- Visual motion editor (web UI)
- Inverse kinematics engine
- Support for more servo protocols (Dynamixel, Feetech)
- ROS 2 bridge
- Reinforcement learning integration
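For a taste of what the planned inverse kinematics engine involves, here is a standalone two-link planar IK solver using the textbook law-of-cosines method. It is unrelated to any RobotClaw API; link lengths and angles are in arbitrary units and radians:

```python
import math

def ik_2link(x, y, l1, l2):
    """Solve joint angles for a 2-link planar arm reaching (x, y).

    Uses the law of cosines; returns (shoulder, elbow) in radians for
    the elbow-down solution. Raises ValueError if the target is out
    of reach.
    """
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Fully extended along x: both angles come out zero
s, e = ik_2link(2.0, 0.0, 1.0, 1.0)
print(round(s, 6), round(e, 6))  # 0.0 0.0
```

An IK layer like this would sit between high-level goals ("put the foot here") and the existing joint-level `set_joint` calls.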
We welcome contributions from robotics enthusiasts, AI researchers, and Python developers!
```bash
# Clone and install in development mode
git clone https://github.com/RobotBase/robotclaw.git
cd robotclaw
pip install -e ".[dev]"

# Run tests
python -m pytest tests/ -v
```

See the CHANGELOG for recent updates.
MIT License. Use it freely in your projects, commercial or otherwise.

Built with ❤️ by the RobotBase team
Making AI-powered robotics accessible to everyone