
Reachy Mini: A New Robotics Platform from HF and Pollen Robotics

An open-source robot for AI prototyping, starting at $299, with vision, speech, and motion via Hugging Face.

Michal Pogoda-Rosikon


Jul 11, 2025 · 8 min read


Developers building AI systems often face a gap between software models and physical interaction: training vision or speech models is straightforward, but testing them in real-world, embodied scenarios requires accessible hardware. Reachy Mini, a collaboration between Hugging Face and Pollen Robotics announced on July 9, 2025, aims to bridge this gap by providing an open-source robot platform for prototyping human-robot interactions.

Use Cases

Prototyping Conversational Agents: Combine speech recognition models with the robot's microphones and speaker for natural dialogues, adding expressive gestures via head movements and antennas to enhance engagement.

Vision-Based Interactions: Use the wide-angle camera for object detection or facial recognition, enabling applications like interactive demos where the robot responds to visual cues in real time.

Educational Tools: As an assembly kit with beginner-friendly SDKs, it supports teaching AI concepts - students can code behaviors in Python or Scratch.

Research and Business Experimentation: Validate multimodal pipelines in low-stakes environments. Businesses can prototype customer-facing bots or internal tools for AI-assisted workflows.

Hardware Overview

The robot stands 28 cm tall and weighs 1.5 kg, designed for desktop use with expressive hardware for lifelike interaction. Core components include a head with 6 degrees of freedom, full base rotation, two animated antennas, a 5W speaker, and a wide-angle camera.

Two configurations

Reachy Mini Lite: compute on a host machine (Mac/Linux), wired connectivity, $299.

Reachy Mini (full): onboard Raspberry Pi 5 compute, WiFi and battery, accelerometer, $449.

Software Integration

Control starts with an open-source Python SDK for hardware access and AI model chaining. Extensions for JavaScript and Scratch are planned, alongside a simulation SDK for virtual testing. Over 15 pre-built behaviors on the Hugging Face Hub provide starting points.
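The idea of pre-built, composable behaviors can be sketched as a simple named registry that chains behaviors in sequence. This is a minimal illustration of the pattern, assuming nothing about the real SDK: the decorator, registry, and behavior names are all invented for the example, and motor/audio calls are replaced by descriptive strings.

```python
from typing import Callable

# Hypothetical behavior registry, mirroring the idea of pre-built
# behaviors that can be looked up by name and composed.
BEHAVIORS: dict[str, Callable[[], str]] = {}

def behavior(name: str):
    """Register a function under a behavior name."""
    def wrap(fn: Callable[[], str]) -> Callable[[], str]:
        BEHAVIORS[name] = fn
        return fn
    return wrap

@behavior("greet")
def greet() -> str:
    # On real hardware this would move the head and play audio.
    return "head_nod + say('hello')"

@behavior("track")
def track() -> str:
    return "pan_head_toward(last_detection)"

def run_sequence(names: list[str]) -> list[str]:
    """Chain registered behaviors by name, skipping unknown ones."""
    return [BEHAVIORS[n]() for n in names if n in BEHAVIORS]

print(run_sequence(["greet", "track", "dance"]))
# ["head_nod + say('hello')", 'pan_head_toward(last_detection)']
```

A registry like this is also a natural fit for behaviors downloaded from a hub: each package only needs to register its entry points under a name.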


Written by

Michal Pogoda-Rosikon

Co-founder @ bards.ai

Bridging AI research and real product engineering. Writing about what works once the demo ends.
