For years, robotics has been the domain of big tech, deep research labs, and prohibitively expensive hardware.
Now, Hugging Face, a company best known for revolutionizing open-source AI tools, is aiming to do the same for robotics. Last week, the New York-based company, founded by three French entrepreneurs, launched “Reachy Mini,” a $299 desktop humanoid robot.
Hugging Face is hoping Reachy Mini could mark a turning point in making robotics accessible to the masses. Executives believe the robot’s low price, modular design, and integration with Hugging Face’s vast AI model ecosystem could do for robotics what the Raspberry Pi did for computing.
“We’re thrilled to finally share what we’ve been building,” Hugging Face co-founder Thomas Wolf posted on LinkedIn. “Reachy Mini is our first hardware product—and it’s just the beginning.”
A Foundational AI Player
Hugging Face is best known for its open-source platform that has become a central hub for sharing, training, and deploying machine learning models.
Founded in 2016 by Clément Delangue, Julien Chaumond, and Thomas Wolf, the company initially started as a chatbot app before pivoting to become a foundational player in the AI community. Hugging Face gained widespread recognition with the release of its Transformers library, which simplified access to state-of-the-art natural language processing models.
Today, it hosts thousands of models across various modalities—language, vision, audio, and more—enabling researchers, developers, and enterprises to build and collaborate in the open. The company’s mission centers on democratizing AI through transparency, accessibility, and community-driven innovation.
In April 2025, Hugging Face made its first major foray into hardware by acquiring Bordeaux-based Pollen Robotics—an open-source robotics startup founded in 2016 by Matthieu Lapeyre and Pierre Rouanet.
The deal brought Pollen’s flagship humanoid platform, Reachy 2, under Hugging Face’s umbrella. A modular robot with two 7‑DoF arms, Reachy 2 is used in research labs at institutions like Cornell and Carnegie Mellon and initially sold for around $70,000.
By merging Pollen’s open-hardware ethos with Hugging Face’s best-in-class open-source AI model ecosystem, the acquisition enables a vertically integrated AI-to-robotics stack, accelerating community-driven robotics innovation.
What Is ‘Reachy Mini’?
Reachy Mini is a compact, highly expressive desktop robot designed for developers, educators, researchers, and curious tinkerers. It features a motorized head with a wide-angle camera for vision tasks, microphones and a speaker for voice interaction, and a pair of animated antennas for expressive gestures. The entry-level version tethers to an external computer over USB, while a wireless version carries an onboard Raspberry Pi 5 capable of running AI models locally; both are designed to be programmed in Python out of the box.
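Motion scripting in Python can stay very simple. As a minimal, standard-library-only sketch (the `nod_trajectory` helper below is hypothetical, not part of any official SDK), a smooth head “nod” reduces to sampling a half-sine pitch profile:

```python
import math

def nod_trajectory(amplitude_deg=15.0, duration_s=1.0, rate_hz=50):
    """Generate a smooth pitch trajectory for a head 'nod':
    the head dips by `amplitude_deg` and returns, following a
    half-sine profile sampled at `rate_hz`."""
    steps = int(duration_s * rate_hz)
    return [amplitude_deg * math.sin(math.pi * i / (steps - 1))
            for i in range(steps)]

# Each value would be streamed to the head's pitch joint at 50 Hz.
traj = nod_trajectory()
```

The same pattern, interpolating joint targets over time, underlies most scripted gestures, whatever the actual SDK calls look like.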
Starting at $299, Reachy Mini is a stark departure from the norm. Comparable educational and research-grade robots typically cost thousands, if not tens of thousands, of dollars. Hugging Face has instead prioritized accessibility, aligning with its open-source ethos and community-driven development model.
According to Hugging Face’s blog, the robot is modular and fully open-source, encouraging community hacks, upgrades, and experimentation. All design files are hosted on GitHub, allowing anyone to 3D-print parts, contribute code, or even build their own Reachy Mini from scratch.
A Trojan AI Horse
Hugging Face’s entry into robotics is a strategic extension of its larger mission: to put powerful, open-source AI into as many hands as possible. Reachy Mini comes pre-configured to run a suite of Hugging Face-hosted AI models, from natural language understanding to vision and control systems.
"What's truly disruptive here is the ease with which Hugging Face's entire model hub can be deployed on Reachy Mini," notes Superhuman in its July 10 newsletter. "You can create a talking, seeing, moving assistant in your living room, controlled by code as simple as a Jupyter notebook."
TechCrunch describes the robot as “a charming and genuinely useful way to introduce robotics and AI to a wider audience,” calling attention to how easily it interfaces with models like Transformers, Whisper (for speech recognition), and even open-source vision models like YOLOv8.
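The Whisper integration TechCrunch describes takes only a few lines in practice. In the sketch below, the commented-out Transformers `pipeline` call is standard library usage; `parse_command` and the action names it returns are illustrative placeholders, not actual Reachy Mini API:

```python
# Speech-to-action sketch: Whisper (via the Transformers pipeline) turns
# audio into text, then a small keyword parser maps the text to a coarse
# robot behavior. Uncomment the pipeline lines on a machine with the
# transformers package installed:
#
# from transformers import pipeline
# asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")
# transcript = asr("command.wav")["text"]

def parse_command(transcript: str) -> str:
    """Map a free-form transcript to a placeholder robot action."""
    text = transcript.lower()
    if "look" in text and "left" in text:
        return "turn_head_left"
    if "look" in text and "right" in text:
        return "turn_head_right"
    if "hello" in text:
        return "wave_antennas"
    return "idle"

print(parse_command("Hey Reachy, look to your left!"))  # → turn_head_left
```

A real assistant would replace the keyword parser with a language model, but the glue code stays about this short.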
VentureBeat’s coverage emphasizes that this is no toy: “It’s a real robot,” the article states, “capable of running large vision models, responding to commands, and interfacing with other devices.”
The Stakes
The robotics industry has long been caught in a tension between groundbreaking innovation and market inaccessibility. Advanced robots are expensive to build, difficult to program, and often locked behind proprietary systems. That’s left robotics primarily in the hands of universities, tech giants, and well-funded startups.
Hugging Face wants to flip that script.
By removing the barriers of cost, complexity, and closed platforms, Reachy Mini opens up new educational, creative, and research possibilities. A high school robotics team can now experiment with real-time vision-based navigation. A solo developer can prototype a physical AI assistant. Even casual enthusiasts can explore reinforcement learning in the real world.
As Robots & Startups wrote: “Reachy Mini is not just a robot. It’s an invitation to the future of embodied AI.”
The other big differentiator is community. Hugging Face has long leaned into community-powered innovation, and Reachy Mini is no exception. The company is launching a dedicated Hugging Face Space for developers to share code, demos, and experiments with Reachy Mini. It’s also offering plug-and-play integrations with Gradio for interactive demos and AutoTrain for fine-tuning models.
So far, little Reachy is generating big buzz.
Hugging Face co-founder Delangue noted on LinkedIn that in the first 24 hours after Reachy Mini was announced, the company received $500,000 in pre-orders. He later reported that “2,000+ robots” had sold within a few days.
In an interview with TechCrunch, Delangue said he believes the company's commitment to open source is now resonating with the robotics community.
“I feel like it’s really important for the future of robotics to be open source, instead of being closed source, black box, [and] concentrated in the hands of a few companies,” Delangue told TechCrunch. “I think it’s quite a scary world to have like millions of robots in people’s homes controlled by one company, with customers, users, not really being able to control them, understand them. I would much rather live in a place, or in a world, or in a country, where everyone can have some control over the robots.”