Genesis AMR

Open-Source · ROS 2 Jazzy · Mecanum Base · 5-DOF Manipulator

Genesis AMR: full system view. Mecanum mobile base with SO-101 5-DOF arm.


A low-cost, open-source autonomous mobile manipulator built from first principles, spanning the full robotics stack from bare-metal servo register programming to Vision-Language-Action (VLA) training.

Every subsystem is documented with the theory, the math, and the implementation. The goal: a single platform where every major robotics concept can be studied, built, and tested, end to end.

First time here?

Good luck!


What’s Inside

| Layer | Concepts Covered |
| --- | --- |
| Embedded / Firmware | Register-level servo control, ESP32 bare-metal drivers, BMS & power architecture |
| Kinematics | FK, IK (analytical + Newton-Raphson), servo ↔ joint space mapping, workspace visualization |
| Mobile Base | Mecanum wheel kinematics, wheel PID, omnidirectional teleoperation |
| Perception | Pinhole model, back-projection, depth recovery, hand-eye calibration, 3D pose estimation |
| Autonomy | SLAM, frontier exploration, A* global planning, RRT* dynamic replanning |
| Visual Servoing | IBVS and PBVS closed-loop manipulation |
| AI / Learning | VLA data collection pipeline, imitation learning, future RL integration |

Philosophy: every abstraction is earned. If a concept is used, the documentation shows where it comes from.


System Modules

🛞 Mobile Base

Omnidirectional 4-wheel mecanum drive with full teleoperation and autonomous modes.

  • Xbox controller + MediaMTX live video stream
  • Pan-tilt first-person view
  • Good Boy Mode: autonomous search and retrieval of a green ball
  • Frontier-based autonomous mapping (in progress)
  • A* global planning + RRT* dynamic replanning (in progress)
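The mecanum kinematics behind omnidirectional driving reduce to a single matrix mapping the body twist (vx, vy, ωz) to the four wheel speeds. A minimal sketch of that inverse kinematics, using placeholder geometry rather than the actual base dimensions:

```python
import numpy as np

# Placeholder geometry -- not the real Genesis AMR dimensions.
R = 0.04   # wheel radius [m]
L = 0.10   # half wheelbase, front-rear [m]
W = 0.12   # half track width, left-right [m]

# Inverse kinematics: body twist (vx, vy, wz) -> wheel angular velocities.
# Row order: front-left, front-right, rear-left, rear-right
# (standard X roller configuration).
IK = (1.0 / R) * np.array([
    [1.0, -1.0, -(L + W)],
    [1.0,  1.0,  (L + W)],
    [1.0,  1.0, -(L + W)],
    [1.0, -1.0,  (L + W)],
])

def wheel_speeds(vx, vy, wz):
    """Map a body twist to the four wheel angular velocities [rad/s]."""
    return IK @ np.array([vx, vy, wz])

# Pure forward motion spins all wheels equally; pure strafe spins
# diagonal pairs in opposite directions.
forward = wheel_speeds(0.2, 0.0, 0.0)
strafe = wheel_speeds(0.0, 0.2, 0.0)
```

Forward kinematics (odometry) is the pseudoinverse of the same matrix, which is why the base's FK and IK share one set of geometry constants.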

🦾 Manipulator

5-DOF SO-101 arm with full kinematic chain and vision-guided control.

  • Servo-to-joint space mapping with range limiting and register communication
  • Forward kinematics: full chain implementation
  • Inverse kinematics: analytical and Newton-Raphson numerical solver
  • Blind pick-and-place with path interpolation
  • GUI control with real-time workspace visualization
  • 3D ball pose estimation
  • IBVS: Image-Based Visual Servoing
  • PBVS: Position-Based Visual Servoing
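The Newton-Raphson IK named above iterates q ← q + J⁺(x* − f(q)) until the task-space error vanishes. A sketch of that loop on a planar 2-link stand-in (illustrative link lengths, not the SO-101 chain — the same structure generalizes to the full 5-DOF arm):

```python
import numpy as np

L1, L2 = 0.12, 0.10  # illustrative link lengths [m]

def fk(q):
    """Forward kinematics: joint angles -> end-effector (x, y)."""
    return np.array([
        L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
        L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1]),
    ])

def jacobian(q):
    """Analytical 2x2 position Jacobian of fk."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([
        [-L1 * s1 - L2 * s12, -L2 * s12],
        [ L1 * c1 + L2 * c12,  L2 * c12],
    ])

def ik_newton(target, q0, tol=1e-6, max_iter=100):
    """Newton-Raphson IK: iterate q <- q + pinv(J) @ (target - fk(q))."""
    q = np.array(q0, dtype=float)
    for _ in range(max_iter):
        err = target - fk(q)
        if np.linalg.norm(err) < tol:
            break
        # Pseudoinverse keeps the update defined near singularities.
        q += np.linalg.pinv(jacobian(q)) @ err
    return q

q = ik_newton(np.array([0.15, 0.10]), q0=[0.0, 1.0])
```

Convergence is local, which is why the analytical solution (when it exists) makes a good initial guess for the numerical solver.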

👁️ Perception

Camera model foundations through 3D object pose estimation.

  • Pinhole camera model and projection math
  • Back-projection and depth recovery
  • Hand-eye calibration
  • Object detection and 3D pose estimation
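The pinhole model and back-projection items above are a pair of inverse maps: the intrinsic matrix K projects a camera-frame point to pixels, and K⁻¹ applied to a pixel ray, scaled by a known depth, recovers the 3D point. A sketch with illustrative intrinsics (not the robot's calibrated values):

```python
import numpy as np

# Illustrative intrinsics (fx, fy, cx, cy) -- placeholder values,
# not the calibration of the robot's cameras.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(p_cam):
    """Pinhole projection: 3D camera-frame point -> pixel (u, v)."""
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]          # perspective divide by depth

def back_project(u, v, depth):
    """Back-projection: pixel + known depth Z -> 3D camera-frame point."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # ray with z = 1
    return ray * depth

p = np.array([0.1, -0.05, 0.5])      # point 0.5 m in front of the camera
uv = project(p)
p_recovered = back_project(uv[0], uv[1], 0.5)
```

Projection discards depth, so back-projection needs it supplied externally — from a known ball diameter, a depth sensor, or stereo — which is exactly the depth-recovery step in the list above.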

🗺️ Navigation & SLAM

Autonomous mapping and obstacle-aware path planning.

  • LiDAR-based SLAM
  • Frontier exploration
  • A* for global path planning
  • RRT* for dynamic replanning around obstacles
  • (Simulation complete; hardware implementation in progress)
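A* global planning over an occupancy grid fits in a few dozen lines. This toy version uses a 4-connected grid and the Manhattan heuristic; the real planner's costmap resolution and connectivity may differ:

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = occupied).

    Manhattan distance is admissible for 4-connected motion, so the
    first time the goal is popped, the path is cost-optimal.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    tie = itertools.count()          # breaks f-ties without comparing nodes
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from = {}
    g_best = {start: 0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:         # already expanded via a cheaper path
            continue
        came_from[cur] = parent
        if cur == goal:              # walk parents back to reconstruct
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < g_best.get(nxt, float("inf"))):
                g_best[nxt] = g + 1
                heapq.heappush(
                    open_set, (g + 1 + h(nxt), next(tie), g + 1, nxt, cur))
    return None                      # goal unreachable

# A wall across the middle row forces a detour around the right side.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

RRT* complements this: A* gives the optimal global route on the static map, while RRT* replans locally when a new obstacle invalidates the current segment.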

⚡ Hardware & Power

Electrical architecture, mechanical build, and BOM.

  • 2× 4S 18650 packs: one dedicated to motors and actuators, one to logic (RPi5 + ESP32)
  • Per-rail buck converters
  • Wiring diagrams and schematics
  • (Power system restructure in progress for improved vibration resistance)

🧠 Learning & VLA

Data collection through autonomous skill acquisition.

  • VLA data collection pipeline
  • Imitation learning setup
  • YOLO-based object classification and segmentation (planned)
  • RL-based dexterous manipulation (planned)

Implementation Status

==Last updated: March 18, 2026==

| Subsystem | Feature | Status |
| --- | --- | --- |
| Firmware | Register-level PWM + motor driver interface | ✅ |
| Firmware | Hardware quadrature encoder reading (PCNT) | ✅ |
| Firmware | Wheel PID (anti-windup, derivative filter …) | ✅ |
| Firmware | ESP32 ↔ RPi5 serial communication protocol | ✅ |
| Mobile Base | Mecanum kinematics (FK + IK) | ✅ |
| Mobile Base | Wheel odometry + TF2 broadcast | 🔧 |
| Mobile Base | IMU integration + EKF sensor fusion | 🔧 |
| Mobile Base | Good Boy autonomous ball retrieval | ✅ |
| Mobile Base | Frontier exploration + SLAM | 🔧 |
| Mobile Base | A* / RRT* autonomous navigation | 🗓️ |
| Mobile Base | Obstacle detection | 🗓️ |
| Mobile Base | Power system restructure | 🧪 |
| Manipulator | Servo ↔ joint space mapping | ✅ |
| Manipulator | Forward kinematics (PoE) | ✅ |
| Manipulator | Analytical + Newton-Raphson IK | ✅ |
| Manipulator | GUI + workspace visualization | ✅ |
| Manipulator | Open-loop pick-and-place | ✅ |
| Manipulator | IBVS + PBVS visual servoing | ✅ |
| Manipulator | 3D ball pose estimation | ✅ |
| Manipulator | MoveIt2 integration + ros2_control bridge | 🗓️ |
| Manipulator | Collision + self-collision avoidance | 🗓️ |
| Manipulator | Grasp planning | 🗓️ |
| Manipulator | Whole-body coordination (base + arm) | 🗓️ |
| Manipulator | YOLO object classification + segmentation | 🗓️ |
| Perception | Camera intrinsic calibration | ✅ |
| Perception | Hand-eye calibration | ✅ |
| Perception | LiDAR integration + scan filtering | ✅ |
| Perception | Depth camera pipeline | 🗓️ |
| Perception | 6-DOF object pose estimation | 🔧 |
| Teleoperation | Gamepad teleop (base + arm) | ✅ |
| Teleoperation | MediaMTX live video stream | ✅ |
| Teleoperation | Custom handheld controller (6-DOF + screen) | 🗓️ |
| Teleoperation | Leader-follower arm teleoperation | 🗓️ |
| Teleoperation | Phone / web teleoperation | 🗓️ |
| Teleoperation | VR teleoperation (Quest 3) | 🗓️ |
| System | Zenoh distributed compute (RPi5 ↔ dev machine) | 🧪 |
| System | rosbag2 logging + telemetry pipeline | 🗓️ |
| Learning | LeRobot data recording pipeline | 🗓️ |
| Learning | VLA data collection pipeline | 🗓️ |
| Learning | Imitation learning | 🗓️ |
| Learning | Dataset management + versioning | 🗓️ |
| Learning | Sim-to-real transfer | 🗓️ |
| Learning | RL dexterous manipulation | 🗓️ |
| Simulation | Gazebo + ROS2 integration (URDF, sensors) | 🧪 |
| Simulation | MuJoCo digital twin + system identification | 💭 |
| Simulation | Isaac Lab parallel env + domain randomization | 💭 |
| Simulation | RL policy training (PPO / SAC) | 💭 |

✅ Complete · 🧪 Testing · 🔧 In Progress · 🗓️ Planned · 💭 Under Consideration
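The wheel PID listed under Firmware calls out anti-windup and a derivative filter. A minimal sketch of that controller structure, with placeholder gains and limits rather than the firmware's tuned values:

```python
class WheelPID:
    """PID velocity controller with conditional-integration anti-windup
    and a first-order low-pass filter on the derivative term."""

    def __init__(self, kp, ki, kd, out_limit, d_alpha=0.9):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit   # symmetric actuator saturation
        self.d_alpha = d_alpha       # derivative low-pass coefficient
        self.integral = 0.0
        self.prev_err = 0.0
        self.d_filt = 0.0

    def update(self, setpoint, measured, dt):
        err = setpoint - measured
        # Low-pass the raw derivative to suppress encoder quantization noise.
        d_raw = (err - self.prev_err) / dt
        self.d_filt = self.d_alpha * self.d_filt + (1 - self.d_alpha) * d_raw
        self.prev_err = err

        # Tentative integral: committed only if the output stays inside
        # the actuator limits (anti-windup by conditional integration).
        integral_new = self.integral + err * dt
        u = self.kp * err + self.ki * integral_new + self.kd * self.d_filt
        if -self.out_limit <= u <= self.out_limit:
            self.integral = integral_new
        else:
            u = max(-self.out_limit, min(self.out_limit, u))
        return u

pid = WheelPID(kp=2.0, ki=1.0, kd=0.0, out_limit=1.0)
u = pid.update(100.0, 0.0, dt=0.01)  # far from setpoint: output saturates
```

Freezing the integrator while the output is saturated prevents the overshoot that a naive PID exhibits after a large step in the wheel-speed setpoint.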


Hardware Overview

| Component | Details |
| --- | --- |
| Compute | Raspberry Pi 5 + 2× ESP32 |
| Arm | SO-101 · 5-DOF · serial servo bus |
| Base | Mecanum wheels · 4-motor drive |
| Sensing | Dual RGB cameras · LiDAR · IMU |
| Middleware | ROS 2 Jazzy · Zenoh DDS bridge |
| Power | 8× 18650 · BMS · per-rail buck converters |

Built by Claude Daniel Jacquet · MS Robotics, Purdue University