SamuRoid: 22-DOF Embodied AI & ROS Humanoid

Open-source bipedal robot powered by Raspberry Pi 4B. Features an IK-based gait, 30 kgf.cm bus servos, and DeepSeek LLM integration for AI research.

Overview
SamuRoid is a multi-modal, 22-DOF bionic humanoid based on Raspberry Pi 4B. It integrates AI vision, voice interaction, and LLM reasoning, providing a powerful platform for research and high-end robotics hacking.
Mechanical & Actuators
Structure: Aluminum alloy with 22 Degrees of Freedom (Head 2, Shoulder 2, Arm 4, Hand 2, Leg 10, Foot 2).
Servos: XRS300 high-voltage serial bus servos (30kgf.cm @ 12V), enabling 30+ preset actions like kicking, dancing, and gesturing.
Electronic System
Brain: Raspberry Pi 4B (4GB) + PWR.ROSBOT.X driver board.
Connectivity: WiFi, BLE 5.0, Gigabit Ethernet.
Sensing: Robot-Eye 4.0 (1080P), MPU6050 6-axis IMU, and USB high-precision Mic.
Expansion: Fully broken-out GPIOs compatible with 40+ modular sensors.
Software & AI Architecture
Control: ROS Melodic on Ubuntu 18.04. Features IK-based gait and Inverted Pendulum algorithm for dynamic balance.
Vision: OpenCV-powered facial recognition, color tracking, and automatic ball targeting.

Project Details: SamuRoid Technical Deep-Dive

1. Project Vision

SamuRoid is an advanced 22-DOF bipedal humanoid platform designed for the intersection of Embodied AI and Real-time Bipedal Locomotion. By bridging Large Language Models (LLMs) like DeepSeek with the ROS (Robot Operating System) ecosystem, SamuRoid transforms high-level semantic intent into precise physical movements.

2. Mechanical Design & Kinematics

The chassis is constructed from high-strength Aluminum Alloy with a static spray finish, ensuring structural rigidity for dynamic balancing.

  • Degrees of Freedom (22 DOF) Breakdown:
    • Head: 2 DOF (Pan/Tilt for vision tracking)
    • Shoulders/Arms: 2 DOF (Shoulder) + 4 DOF (Arms) + 2 DOF (Hands)
    • Lower Body: 10 DOF (Legs) + 2 DOF (Feet)
  • Actuation: Powered by XRS300 High-Voltage Serial Bus Servos.
    • Stall Torque: ≥30kgf.cm @ 12V.
    • Feedback: Real-time position, temperature, and load monitoring via the serial protocol.
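The wire-level details of the XRS300 serial protocol aren't reproduced here, so the sketch below uses a generic half-duplex bus-servo frame (double 0x55 header, ID, length, command, little-endian payload, inverted-sum checksum) purely as an illustration. The field order, command byte, port, and baud rate are assumptions to be checked against the XRS300 documentation.

```python
def checksum(payload):
    # Inverted low byte of the byte sum -- a common scheme on hobby bus servos
    return (~sum(payload)) & 0xFF

def build_position_packet(servo_id, position, time_ms=500):
    """Hypothetical frame: 0x55 0x55, id, length, cmd, pos lo/hi, time lo/hi, checksum.
    This layout is illustrative, not the verified XRS300 protocol."""
    body = [servo_id, 7, 0x01,                        # id, payload length, 'move' cmd
            position & 0xFF, (position >> 8) & 0xFF,  # target position, little-endian
            time_ms & 0xFF, (time_ms >> 8) & 0xFF]    # move duration in ms
    return bytes([0x55, 0x55] + body + [checksum(body)])

# Usage on the Pi (port and baud are assumptions):
# import serial
# bus = serial.Serial('/dev/ttyAMA0', 115200, timeout=0.1)
# bus.write(build_position_packet(3, 2048, time_ms=1000))
```

The same frame format, with a different command byte, would carry the position/temperature/load queries mentioned above.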

3. Electronic Architecture & Sensing

The system adopts a master-slave control architecture to balance high-level AI reasoning and low-level motor control.

  • Compute Brain: Raspberry Pi 4B (4GB RAM) running Ubuntu 18.04.
  • Motion Controller: PWR.ROSBOT.X dedicated driver board. It handles DC-DC power management (SY8120ABC), audio amplification, and acts as a hardware abstraction layer for the 22 servos.
  • Sensing Suite:
    • IMU: MPU6050 6-axis IMU (3-axis gyroscope + 3-axis accelerometer) for attitude estimation and gait stabilization.
    • Vision: 1080P 120° Wide-angle USB camera (Robot-Eye 4.0).
    • Audio: High-precision AEC (Acoustic Echo Cancellation) Microphone + 1W Speaker for voice interaction.
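As a concrete example of the low-level sensing path, here is a minimal sketch of reading the MPU6050 over I2C and deriving a pitch angle from the accelerometer. The register addresses come from the MPU6050 datasheet; the bus number and I2C address (0x68) are the usual defaults and should be verified on the actual driver board.

```python
import math

# MPU6050 register map (from the datasheet)
PWR_MGMT_1   = 0x6B   # power management: write 0 to wake from sleep
ACCEL_XOUT_H = 0x3B   # first of six accelerometer output bytes

def to_int16(hi, lo):
    """Combine two register bytes into a signed 16-bit count."""
    v = (hi << 8) | lo
    return v - 65536 if v & 0x8000 else v

def accel_pitch(ax, ay, az):
    """Pitch angle in degrees from raw accelerometer counts (robot at rest)."""
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))

# On the robot (bus 1, address 0x68 -- verify against the PWR.ROSBOT.X wiring):
# from smbus2 import SMBus
# with SMBus(1) as bus:
#     bus.write_byte_data(0x68, PWR_MGMT_1, 0)
#     d = bus.read_i2c_block_data(0x68, ACCEL_XOUT_H, 6)
#     ax, ay, az = to_int16(d[0], d[1]), to_int16(d[2], d[3]), to_int16(d[4], d[5])
#     print("pitch: %.1f deg" % accel_pitch(ax, ay, az))
```

In practice the accelerometer estimate would be fused with the gyroscope (e.g. a complementary filter) before feeding the gait stabilizer.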

4. Software Stack & Locomotion Algorithms

SamuRoid is fully integrated with ROS Melodic. The codebase is open-source and supports both C++ and Python development.

  • Locomotion Engine: Implementation of Inverse Kinematics (IK) combined with the Linear Inverted Pendulum Model (LIPM). This ensures the Center of Mass (CoM) remains stable during dynamic gait transitions.
  • Vision Pipeline: OpenCV-based modules for face recognition, color tracking, and QR code localization.
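The LIPM reduces the robot to a point mass at constant height z_c, with CoM acceleration proportional to the offset between the CoM and the zero-moment point (ZMP): ẍ = (g/z_c)(x − p). A minimal sketch of this model, including the instantaneous capture point used to decide where to step (the CoM height here is an assumed value, not SamuRoid's measured one):

```python
import math

G = 9.81  # gravity, m/s^2

def lipm_step(x, v, zmp, z_com, dt):
    """One Euler step of the LIPM: CoM acceleration = (g / z_com) * (x - zmp)."""
    a = (G / z_com) * (x - zmp)
    return x + v * dt, v + a * dt

def capture_point(x, v, z_com):
    """Where to place the ZMP (foot) so the CoM comes to rest over it."""
    return x + v * math.sqrt(z_com / G)

# Sketch: CoM moving at 0.3 m/s; stepping to the capture point brings it to rest.
z = 0.25                       # assumed CoM height for a robot of this size
x, v = 0.0, 0.3
target = capture_point(x, v, z)
for _ in range(1000):          # simulate 1 s at 1 kHz
    x, v = lipm_step(x, v, target, z, dt=0.001)
# x converges to target and v decays toward zero
```

The same capture-point idea underlies push recovery: a disturbance changes v, which moves the capture point, which dictates the next footstep.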

5. Embodied AI: Integrating LLMs

The defining feature of SamuRoid is its Multimodal AI Integration.

By connecting to DeepSeek and Doubao LLM APIs, the robot performs semantic parsing of natural language. Instead of hard-coded commands, the robot can understand intent:

  • Input: "I am tired, show me some fun."
  • Process: LLM interprets "tired" -> selects "Dance" action group -> triggers ROS Action Server.
  • Feedback: Real-time status report via the integrated voice system.
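A minimal sketch of the parsing step, assuming the LLM is prompted to reply with JSON naming one of the robot's preset action groups. The group names here are hypothetical placeholders, not SamuRoid's actual action library, and the dispatch to ROS is only indicated in a comment.

```python
import json

# Hypothetical action-group names -- the real ones live in SamuRoid's action library
ACTION_GROUPS = {"dance", "wave", "kick", "bow", "stand"}

SYSTEM_PROMPT = ('You control a humanoid robot. Reply with JSON only, e.g. '
                 '{"action": "dance", "say": "Here we go!"}. action must be one of: '
                 + ", ".join(sorted(ACTION_GROUPS)))

def parse_intent(llm_reply):
    """Validate the LLM's JSON reply; fall back to a safe standing pose on any error."""
    try:
        data = json.loads(llm_reply)
    except (ValueError, TypeError):
        return {"action": "stand", "say": "Sorry, I did not catch that."}
    if not isinstance(data, dict):
        return {"action": "stand", "say": ""}
    action = data.get("action")
    if action not in ACTION_GROUPS:
        action = "stand"
    return {"action": action, "say": data.get("say", "")}

# The validated action would then be dispatched to a ROS action server, e.g. via
# actionlib.SimpleActionClient -- server and message names depend on the SamuRoid stack.
```

Validating on the robot side is the important part: the LLM's free-form output never drives the servos directly, only a whitelisted action group does.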


6. Technical Specifications Summary

  • Dimensions: 190.98 × 141.6 × 389.81 mm
  • Weight: 2.3 kg
  • Battery: 12V 3000mAh Li-po (60A discharge protection)
  • Communication: Dual-band WiFi (2.4G/5G), Bluetooth 5.0, PS2 Wireless Controller.

  • 1 × Main controller: Raspberry Pi 4B development board
  • 1 × Driver board: PWR.ROSBOT.X
  • 1 × Camera: Robot-Eye 4.0 (1080P, 120° wide-angle, 2-megapixel)
  • 1 × Display: 0.96-inch OLED
  • 1 × IMU: 9-axis sensor (3-axis gyroscope + 3-axis accelerometer + 3-axis magnetometer)


  • Stop Fighting PID Loops, Start Winning: 22-DOF Humanoid Platform for RoboCup 2026

    alisa.wu • 03/21/2026 at 03:58 • 0 comments

    SamuRoid for RoboCup teams

    Anyone who has ever attempted to build a bipedal robot for RoboCup knows the "Development Purgatory." You spend six months just trying to keep the thing upright, wrestling with Inverted Pendulum models and PID tuning, only to realize you haven't even touched the soccer logic or AI vision.

    We’ve been there. And we decided to fix the hardware barrier.

    Enter SamuRoid: The 22-DOF Shortcut to the Pitch

    We are officially announcing the SamuRoid platform’s optimization for the RoboCup Singapore Open 2026. This isn't just a "toy" robot; it's an open-architecture development beast designed to get you straight into high-level strategy and AI deployment.

    • 22 Degrees of Freedom (DOF): Full human-like range of motion for complex maneuvers and rapid recovery from falls.
    • XRS300 High-Torque Servos (≥30kgf.cm): The muscle behind the movement. High precision, high heat dissipation, and the strength needed for explosive kicks.
    • Onboard Intelligence: Powered by Raspberry Pi 4B, integrated with an MPU6050 6-axis gyro for real-time self-balancing and gait correction.


    We know the Hackaday community loves control. That’s why SamuRoid is built on Ubuntu + ROS Melodic.

    • OpenCV Integration: Ready-to-use autonomous ball tracking and QR code localization.
    • Kinematics: Pre-configured inverse kinematics (IK) so you can focus on where the foot goes, not the trigonometry of how it gets there.
    • LLM Ready: Compatible with DeepSeek/Doubao APIs for team-robot interaction.
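The core of a color-based ball tracker is an HSV threshold followed by a centroid, whose horizontal offset then drives the pan servo. The sketch below implements that core with NumPy (in the real pipeline, cv2.inRange and cv2.moments do the same job on camera frames); the HSV bounds are placeholders to be tuned for the actual ball and lighting.

```python
import numpy as np

def ball_centroid(hsv, lo, hi):
    """Centroid of pixels within an HSV range -- the core of cv2.inRange + cv2.moments.
    hsv: H x W x 3 uint8 array; lo, hi: length-3 bounds. Returns (cx, cy) or None."""
    mask = np.all((hsv >= np.asarray(lo)) & (hsv <= np.asarray(hi)), axis=2)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                      # ball not in view
    return float(xs.mean()), float(ys.mean())

def steering_error(cx, frame_width):
    """Normalized horizontal offset in [-1, 1] to feed the head-pan servo loop."""
    return (cx - frame_width / 2.0) / (frame_width / 2.0)

# Placeholder HSV bounds for an orange ball -- tune against the real camera:
ORANGE_LO, ORANGE_HI = (5, 120, 120), (20, 255, 255)
```

A real pipeline would add morphological filtering and a minimum-area check so that noise pixels cannot masquerade as the ball.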

    Why this matters for RoboCup 2026 (Science Centre Singapore)

    RoboCup 2026 rewards autonomous decision making and team collaboration. If you are still debugging your walking gait in March, you've already lost.

    SamuRoid provides a "Competition-Ready" baseline. You get a stable walk, a powerful kick, and a vision system that works out of the box—giving you a year of head-start to develop custom offensive and defensive algorithms.

    🛠 Get Involved / Dev Access

    We are looking for RoboCup teams, University labs, and hardcore roboticists to push this platform to its limits.

    We are offering a 5% "RoboCup Season" discount and dedicated technical support for registered teams.

    👉 [Check out the Full Specs & Schematics on our Site]
    https://www.xiaorgeek.net/products/samuroid-ai-humanoid-robot-with-raspberry-pi-integrated-multimodal-ai-model-large-language-models-vision-interactive-voice-based-ros-xiaorgeek

    Are you heading to Singapore in 2026? Let’s talk gait optimization in the comments.




  • Can a Raspberry Pi Humanoid Robot Run OpenClaw AI Agents?

    alisa.wu • 03/10/2026 at 14:19 • 0 comments

    Recently, AI agents like OpenClaw have been gaining a lot of attention in the developer community.

    We started experimenting with running OpenClaw on a Raspberry Pi 4B, and surprisingly, it works quite well on edge hardware.

    That sparked an interesting idea:

    What happens if we combine AI agents with robotics?

    Our humanoid robot SamuRoid is powered by a Raspberry Pi 4B, which makes it an interesting platform for developers who want to explore AI + robotics on edge devices.

    Although we haven't officially integrated OpenClaw with SamuRoid yet, the hardware platform already allows developers to experiment with their own AI stack.

    For makers and AI hackers, this could open up some fun possibilities.

    Imagine building your own:

    • AI-powered humanoid robot
    • Running OpenClaw agents locally
    • Powered by Raspberry Pi edge computing

    Since SamuRoid already runs on Raspberry Pi, developers in the maker community could potentially try installing OpenClaw themselves and experiment with AI-driven robot behaviors.

    We would love to see what the community can build with it.

    If you're interested in exploring the hardware platform:

    SamuRoid humanoid robot
    https://www.xiaorgeek.net/products/samuroid-ai-humanoid-robot-with-raspberry-pi-integrated-multimodal-ai-model-large-language-models-vision-interactive-voice-based-ros-xiaorgeek

    To celebrate Pi Day, we are also running a $31.4 discount promotion on SamuRoid for a limited time.

    If you're a maker who enjoys experimenting with AI agents + robotics, this might be a fun project to explore.

    Looking forward to seeing what the community builds.

  • Project Idea: SamuRoid – A Humanoid Robot Platform for Embodied AI and ROS Development

    alisa.wu • 03/06/2026 at 08:23 • 0 comments

    We recently started working on a humanoid robotics project called SamuRoid. The main goal is to explore how an affordable humanoid robot platform can be used for embodied AI experiments, robotics education, and ROS development.

    Most humanoid robots today are either research-grade systems that cost tens of thousands of dollars, or simple toy robots that are difficult to extend. With SamuRoid, we are trying to build something in between — a capable but accessible humanoid robot platform for developers, students, and robotics enthusiasts.

    The robot is built around a Raspberry Pi 4B running Ubuntu 18.04 and ROS Melodic. Using the ROS framework allows us to integrate different modules such as motion control, machine vision, and AI interaction in a standardized way.

    Mechanically, the robot uses a 22-DOF humanoid structure including head, arms, legs, and feet joints. High-torque serial bus servos (≥30kgf.cm) are used to drive the joints, enabling complex humanoid movements such as walking, waving, dancing, and kicking a ball.

    Currently we are testing several locomotion algorithms based on inverse kinematics and inverted pendulum control. A built-in MPU6050 IMU helps the robot maintain stability during walking.
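For the leg IK, each leg can be treated in the sagittal plane as a two-link (hip-knee) chain solved with the law of cosines. Here is a minimal sketch; the link lengths and sign conventions are illustrative assumptions, not SamuRoid's actual joint frames.

```python
import math

def leg_ik(x, z, l1, l2):
    """Two-link planar IK for hip and knee in the sagittal plane.
    (x, z): foot target relative to the hip, x forward, z downward (meters).
    l1, l2: thigh and shank lengths. Returns (hip, knee) in radians."""
    d2 = x * x + z * z
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("foot target out of reach")
    # Interior knee angle via the law of cosines (clamp guards rounding at full extension)
    c = (l1 * l1 + l2 * l2 - d2) / (2 * l1 * l2)
    knee = math.pi - math.acos(max(-1.0, min(1.0, c)))
    hip = math.atan2(x, z) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

def leg_fk(hip, knee, l1, l2):
    """Forward kinematics, useful for checking the IK solution."""
    x = l1 * math.sin(hip) + l2 * math.sin(hip + knee)
    z = l1 * math.cos(hip) + l2 * math.cos(hip + knee)
    return x, z
```

A gait generator then only has to plan foot trajectories in (x, z) and let the IK supply the joint angles, with the IMU correcting the resulting posture.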

    For perception, SamuRoid integrates a 1080P wide-angle camera combined with OpenCV-based computer vision algorithms. We are experimenting with several AI vision capabilities including:

    - face recognition  
    - color recognition  
    - QR code detection  
    - object tracking  
    - autonomous ball tracking and kicking

    Another interesting direction we are exploring is multimodal AI interaction. By connecting the robot to large language model APIs such as DeepSeek and Doubao, the robot can understand voice commands and perform actions through natural language interaction.

    The robot also includes a voice input system with a microphone and speaker for real-time audio interaction.

    Because the platform runs ROS and supports Python and C++ development, it is also suitable for robotics education, AI experimentation, and developer research projects.

    We are still refining the motion control and expanding the AI interaction capabilities.

    If anyone in the Hackaday community is interested in humanoid robotics, embodied AI, or ROS-based robot platforms, we would love to hear your ideas and suggestions.

    More technical information and development resources about the SamuRoid robot platform can be found here:

    https://www.xiaorgeek.net/products/samuroid-ai-humanoid-robot-with-raspberry-pi-integrated-multimodal-ai-model-large-language-models-vision-interactive-voice-based-ros-xiaorgeek

View all 3 project logs

  • 1
    Power-On Procedure

    a. Locate the circular main power switch on the chest of the SamuRoid humanoid robot.

    b. Press the switch lightly to turn it on. Once powered on, the switch will illuminate with a red indicator light, signifying that the power is connected.

    c. Please wait patiently for approximately 40 seconds while the robot's built-in Ubuntu operating system starts up.

    d. When you hear the voice prompt "Mech Warrior ready!", it means the robot has entered standby mode and is ready for operation.

  • 2
    Shutdown Procedure

    When powered off, the servos lose their holding torque and the robot will lose its balance. Hold the robot firmly during shutdown to prevent it from falling and being damaged.

  • 3
    Camera calibration

    If the lighting in the robot's environment is too bright or too dark, you can adjust the camera parameters to optimize image quality. The football color-recognition parameters used in automatic kicking mode can also be modified in the camera settings. For detailed instructions, see section 4.19, "Use the web-based debugging tool."

