Menlo

Introduction

A full-stack software platform for building and operating a robot labor force.

Menlo Platform lets developers build a robot labor force, turning software into physical labor.

It is a full-stack platform that handles orchestration, telemetry, and safety — everything between an AI agent and physical execution. Define a workflow, deploy it to a robot, collect real-world telemetry, and iterate. The full cycle compresses from weeks to minutes.

The platform is natively integrated with Asimov hardware and runs an identical stack on virtual robots in the browser simulator, so you can develop and validate before touching hardware.

What's included

Platform UI — A web dashboard for creating and managing robots, launching the browser-based simulator, teleoperating robots with keyboard or joystick controls, and monitoring live telemetry (joint positions, IMU, mode, temperatures).

REST API — Programmatic access to every platform capability: create robots, join live WebRTC sessions, send velocity and semantic commands, read telemetry snapshots. Authenticated with API keys.
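As a rough illustration of the request shape, the sketch below builds an authenticated velocity-command request with only the standard library. The base URL, endpoint path, payload fields, and bearer-token auth scheme are all assumptions for illustration; consult the Menlo API reference for the real paths and headers.

```python
import json
import urllib.request

# Placeholder values — not real Menlo endpoints or credentials.
API_BASE = "https://api.menlo.example"  # hypothetical base URL
API_KEY = "mk_live_your_key_here"       # your API key

def build_velocity_command(robot_id: str, vx: float, vy: float, yaw: float):
    """Construct (but do not send) an authenticated velocity-command request.

    Endpoint path and payload field names are hypothetical.
    """
    payload = json.dumps({"vx": vx, "vy": vy, "yaw": yaw}).encode()
    return urllib.request.Request(
        url=f"{API_BASE}/robots/{robot_id}/commands/velocity",
        data=payload,
        method="POST",
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_velocity_command("robot-123", vx=0.5, vy=0.0, yaw=0.1)
print(req.get_method(), req.full_url)
```

Sending the request would be a single `urllib.request.urlopen(req)` call (or the equivalent in your HTTP client of choice); the same header and payload structure applies to the other capabilities listed above.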

CLI — The menlo command-line tool for managing robots, checking status, and joining sessions from your terminal.

Simulator (Uranus) — A browser-based MuJoCo digital twin. Runs the same agent and control stack as a physical robot, so code validated in simulation deploys directly to hardware.

Training (Cyclotron) — A sim-to-real locomotion training pipeline with domain randomization tuned to measured Asimov actuator models. Trained policies deploy through the Agent Platform.

Robot types

Virtual — Simulated robot running in the browser. No hardware required. Create one instantly from the Platform UI.

Physical — An Asimov humanoid robot, connected over WebRTC. Controlled today via the Asimov API; Platform UI integration is coming soon.
