OpenClaw Embodiment SDK

Hardware abstraction layer for connecting physical devices to AI agent runtimes.

A Python SDK that bridges sensors, cameras, microphones, BLE devices, and robotic hardware into the OpenClaw agent loop. Built for the Pamir Distiller, Even G2 glasses, and Reachy robotics -- but the HAL interface works with anything.
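To make the HAL idea concrete, here is a minimal sketch of what a device contract could look like. The class names (`HALDevice`, `MockThermometer`) and the `read()` signature are illustrative assumptions, not the SDK's real API:

```python
# Hypothetical sketch -- the real SDK's interfaces may differ.
from abc import ABC, abstractmethod
from typing import Any


class HALDevice(ABC):
    """Minimal device contract: anything the agent loop can poll."""

    @abstractmethod
    def read(self) -> dict[str, Any]:
        """Return the latest reading as a plain dict."""


class MockThermometer(HALDevice):
    """Fake sensor, useful for testing without hardware attached."""

    def read(self) -> dict[str, Any]:
        return {"sensor": "thermometer", "celsius": 21.5}


device = MockThermometer()
print(device.read())  # {'sensor': 'thermometer', 'celsius': 21.5}
```

Because every device reduces to the same `read()`-style surface, the agent loop never needs to know whether it is talking to GPIO, BLE, or a robot joint.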

Install

git clone https://github.com/mmartoccia/openclaw-embodiment.git
cd openclaw-embodiment
pip install -e .         # editable install from source

Architecture

237 tests passing · 0 grain errors · 10 modules · 3 HAL profiles
hal/ -- hardware abstraction
context/ -- sensor context builder
discovery/ -- world model + anomaly detection
transport/ -- BLE, WiFi, STT bridge
triggers/ -- audio + motion wake
profiles/ -- device profiles (iOS, Distiller, G2)
core/ -- base classes + interfaces
cli/ -- diagnostic commands
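The modules above compose in one direction: HAL devices produce raw readings, and the context builder merges them into a single snapshot the agent consumes. A hedged sketch of that merge step, with illustrative names (`build_context` and the reading shape are assumptions, not the SDK's real API):

```python
# Hypothetical sketch of the context-builder step; illustrative only.
import time


def build_context(readings: list[dict]) -> dict:
    """Merge raw sensor readings into one timestamped context snapshot."""
    return {
        "timestamp": time.time(),
        "sensors": {r["sensor"]: r for r in readings},
    }


ctx = build_context([
    {"sensor": "mic", "level_db": -32.0},
    {"sensor": "imu", "accel": [0.0, 0.0, 9.81]},
])
print(sorted(ctx["sensors"]))  # ['imu', 'mic']
```

Keying the snapshot by sensor name means a later reading from the same sensor simply replaces the earlier one, which is the behavior an always-on ambient agent usually wants.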

Agent Governance

This repo is built with agentic tooling: autonomous agents contribute code. That means it needs rules that human-only projects don't:

AGENTS.md -- contribution protocol for autonomous agents
grain -- anti-slop linter, runs on every commit

Supported Hardware

Pamir Distiller -- ARM SBC with e-ink display, mic, speaker, GPIO. Always-on ambient agent.
Even G2 Glasses -- smart glasses with camera, display, BLE. First-person agent perception.
Reachy -- robotic platform. Full body HAL with joint control, force feedback, camera.
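One way a profile system for these three devices might look is a capability table that the rest of the SDK queries at startup. The table contents and the `capabilities()` helper below are assumptions drawn from the hardware descriptions above, not the SDK's real profile format:

```python
# Hypothetical capability table for the three HAL profiles; illustrative only.
PROFILES = {
    "distiller": {"display": "e-ink", "mic": True, "speaker": True, "gpio": True},
    "g2": {"camera": True, "display": "hud", "transport": "ble"},
    "reachy": {"joints": True, "force_feedback": True, "camera": True},
}


def capabilities(profile: str) -> dict:
    """Look up a profile, failing loudly on an unknown device name."""
    try:
        return PROFILES[profile]
    except KeyError:
        raise ValueError(f"unknown profile: {profile!r}") from None


print(capabilities("g2")["transport"])  # ble
```

Centralizing capabilities like this lets higher layers (triggers, context, discovery) feature-gate themselves per device instead of hard-coding device checks.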