
# 🦾 RobotClaw

**Let AI Control Real Robots: From Code to Physical World in One Line**



> 🧠 One `pip install` away from giving your AI agent a real body.

RobotClaw is a Python robotics framework that bridges AI agents with physical robots. It turns high-level AI commands into real-world servo movements, enabling your AI to walk, wave, dance, and interact with the physical world.

Built for the OpenClaw AI agent platform. Compatible with LOBOT LX series bus servos. Designed for makers, researchers, and anyone who wants to bring robots to life.

```bash
pip install robotclaw
```

## 🎬 What Can You Do?

```python
# Your AI agent says: "Make the robot wave"
# RobotClaw makes it happen in the real world:

from robotclaw import Robot, DEFAULT_CONFIG

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    robot.go_home(time_ms=2000)          # Stand up
    robot.set_joint("left_hip_pitch",    # Move a joint
                    position=350,
                    time_ms=500)
```

```
🤖 User:  "Return the robot to its home position"
🧠 AI:    → robot.go_home(2000) ✅

🤖 User:  "Play the walking motion at 2x speed"
🧠 AI:    → player.play(walk_clip, speed=2.0) ✅

🤖 User:  "Check the voltage of all servos"
🧠 AI:    → robot.scan() → reports voltage status ✅
```

No robotics PhD required. If you can write Python, you can control a robot.


## ✨ Why RobotClaw?

| Feature | Description |
|---------|-------------|
| 🧠 AI-Native | First-class integration with OpenClaw: AI agents control robots via natural language |
| 🔌 Plug & Play | Connect USB, `pip install`, and you're controlling servos in 3 lines of code |
| 🎬 Motion Teaching | Record human demonstrations, save as JSON, replay at any speed |
| 🤖 High-Level API | Think in joints (`left_hip_pitch`), not raw servo IDs |
| 🛠️ CLI Tools | `robotclaw-scan` for diagnostics, `robotclaw-teach` for interactive teaching |
| ⚡ Thread-Safe | Mutex-protected serial communication |
| 🐍 Pure Python | Zero compiled dependencies; runs anywhere Python runs |

## 🚀 Quick Start

### 1. Install

```bash
pip install robotclaw
```

### 2. Connect & Control

```python
from robotclaw import ServoBus

bus = ServoBus()
bus.connect("COM3", baudrate=115200)

# Move servo to position 500 in 1 second
bus.move(servo_id=1, position=500, time_ms=1000)

# Read current position
pos = bus.read_position(servo_id=1)
print(f"Servo position: {pos}")

bus.disconnect()
```

### 3. Build a Walking Robot

```python
from robotclaw import Robot, DEFAULT_CONFIG

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    # All 10 joints to home position
    robot.go_home(time_ms=2000)

    # Control individual joints by name
    robot.set_joint("left_hip_pitch", position=350, time_ms=500)
    robot.set_joint("right_knee", position=600, time_ms=500)

    # Read all joint positions at once
    positions = robot.get_positions()
    print(positions)
```

### 4. Teach Your Robot New Moves

No programming needed: just physically move the robot and record.

```python
from robotclaw import Robot, DEFAULT_CONFIG
from robotclaw.recorder import MotionRecorder, MotionPlayer

with Robot(config=DEFAULT_CONFIG, port="COM3") as robot:
    # Step 1: Record a motion by posing the robot
    recorder = MotionRecorder(robot)
    recorder.start_recording("wave_hand")

    robot.unload_all()  # Release servos so you can pose by hand
    input("Pose the robot, then press Enter to capture...")
    recorder.capture_frame()

    input("Next pose, then press Enter...")
    recorder.capture_frame()

    clip = recorder.finish_recording()
    recorder.save(clip, "motions/wave_hand.json")

    # Step 2: Replay at any speed
    robot.load_all()
    player = MotionPlayer(robot)
    player.play(clip, speed=1.5)  # 1.5x speed playback
```
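Because a recorded clip is saved as plain JSON, it can be versioned and edited by hand. The schema is not documented here, so the field names below are illustrative assumptions rather than RobotClaw's actual format; the snippet also shows the usual way speed scaling works, dividing each frame's duration by the speed factor:

```python
import json

# Hypothetical clip layout: "frames", "positions", "time_ms" are assumed names,
# not RobotClaw's documented schema.
clip = {
    "name": "wave_hand",
    "frames": [
        {"positions": {"left_hip_pitch": 350, "right_knee": 600}, "time_ms": 500},
        {"positions": {"left_hip_pitch": 500, "right_knee": 500}, "time_ms": 800},
    ],
}

def scaled_durations(clip, speed):
    """Playback at `speed`x divides each frame's duration by the speed factor."""
    return [round(frame["time_ms"] / speed) for frame in clip["frames"]]

print(json.dumps(clip, indent=2))
print(scaled_durations(clip, speed=2.0))  # [250, 400]
```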

### 5. CLI Tools

```bash
# Discover all connected servos
robotclaw-scan --port COM3

# Interactive motion teaching terminal
robotclaw-teach --port COM3
```

## 🧠 AI Integration: The Killer Feature

RobotClaw is designed from the ground up for AI-driven robotics. It ships with a built-in OpenClaw skill that lets AI agents understand and control the robot through natural language:

```python
from robotclaw.openclaw_skill import OpenClawSkill

# Register with your AI agent
skill = OpenClawSkill(port="COM3")

# Now your AI can:
# - Move joints by name
# - Play recorded motions
# - Read sensor data (position, voltage, temperature)
# - Perform diagnostic scans
# - Chain complex movement sequences
```

**Use Cases:**

- 🎓 **Education**: Students learn robotics through conversation with AI
- 🏭 **Prototyping**: Rapidly test robot behaviors via natural language
- 🎮 **Entertainment**: AI-controlled robot performances and interactions
- 🔬 **Research**: Quickly iterate on movement patterns without manual coding
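Conceptually, a skill like this boils down to mapping user intent onto the library's API calls. The toy dispatcher below illustrates that idea only; a real OpenClaw skill would let the LLM choose the tool call rather than match keywords, and `make_dispatcher`, `StubRobot`, and the keyword table are all hypothetical:

```python
# Toy intent-to-action dispatcher (illustration only, not the OpenClaw skill).
def make_dispatcher(robot):
    commands = {
        "home": lambda: robot.go_home(time_ms=2000),
        "scan": lambda: robot.scan(),
    }
    def dispatch(utterance):
        for keyword, action in commands.items():
            if keyword in utterance.lower():
                return action()
        return "unknown command"
    return dispatch

# Usage with a stub standing in for the real hardware:
class StubRobot:
    def go_home(self, time_ms): return f"homing in {time_ms} ms"
    def scan(self): return "10 servos OK"

dispatch = make_dispatcher(StubRobot())
print(dispatch("Send the robot home"))  # homing in 2000 ms
print(dispatch("scan all servos"))      # 10 servos OK
```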

## 🔧 Supported Hardware

| Component | Specification |
|-----------|---------------|
| Servos | LOBOT LX-16A / LX-224 / LX-225 bus servos |
| Interface | USB-to-TTL serial adapter |
| Baudrate | 115200 (direct) or 9600 (controller board) |
| Power | 6–8.4 V DC, ≥5 A recommended for 10 servos |
| Default Config | 10 servos, 5 per leg (biped robot) |

> 💡 **Tip:** More servo types and robot configurations are coming soon. PRs welcome!
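The ≥5 A guideline is consistent with a simple supply budget: roughly 0.5 A of average headroom per servo under load. That per-servo figure is an assumption for illustration; check your servo's datasheet, since stall current is considerably higher:

```python
# Back-of-envelope supply sizing. The per-servo current is an assumed
# average under load, not a datasheet value; stall draw is much higher.
SERVO_COUNT = 10
LOAD_CURRENT_A = 0.5  # assumed average draw per servo under load

budget_a = SERVO_COUNT * LOAD_CURRENT_A
print(f"Recommended supply: >= {budget_a:.0f} A")  # Recommended supply: >= 5 A
```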


๐Ÿ“ Architecture

src/robotclaw/
โ”œโ”€โ”€ __init__.py          # Public API: ServoBus, Robot, DEFAULT_CONFIG
โ”œโ”€โ”€ servo_bus.py         # LOBOT LX protocol โ€” the low-level driver
โ”œโ”€โ”€ robot_config.py      # JointConfig, LegConfig, RobotConfig
โ”œโ”€โ”€ robot.py             # High-level robot controller
โ”œโ”€โ”€ cli.py               # CLI: robotclaw-scan, robotclaw-teach
โ”œโ”€โ”€ openclaw_skill.py    # OpenClaw AI agent skill adapter
โ”œโ”€โ”€ SKILL.md             # OpenClaw skill descriptor
โ””โ”€โ”€ recorder/            # Motion teaching subsystem
    โ”œโ”€โ”€ motion_data.py   # Keyframe & MotionClip data models
    โ”œโ”€โ”€ recorder.py      # Record motions from physical teaching
    โ””โ”€โ”€ player.py        # Play back motions with speed control

๐Ÿ—บ๏ธ Roadmap

  • LOBOT LX bus servo protocol driver
  • High-level joint-based robot API
  • Motion recording & playback system
  • OpenClaw AI agent integration
  • CLI diagnostic & teaching tools
  • ๐Ÿ”œ Visual motion editor (web UI)
  • ๐Ÿ”œ Inverse kinematics engine
  • ๐Ÿ”œ Support for more servo protocols (Dynamixel, Feetech)
  • ๐Ÿ”œ ROS 2 bridge
  • ๐Ÿ”œ Reinforcement learning integration

๐Ÿค Contributing

We welcome contributions from robotics enthusiasts, AI researchers, and Python developers!

# Clone and install in development mode
git clone https://github.com/RobotBase/robotclaw.git
cd robotclaw
pip install -e ".[dev]"

# Run tests
python -m pytest tests/ -v

See the CHANGELOG for recent updates.


## 📄 License

MIT License. Use it freely in your projects, commercial or otherwise.


**Built with ❤️ by the RobotBase team**
*Making AI-powered robotics accessible to everyone*
