Perceptor - ROS2 Perception, Planning, Control

Modified Roomba with ROS2, LIDAR, and odometry running SLAM Toolbox for real-time mapping and navigation

This project transforms an off-the-shelf Roomba 650 into a ROS 2-powered autonomous mapping and navigation robot by integrating a 360° RPLiDAR A1M8, wheel encoders, the Roomba's onboard sensors, and an optional IMU (e.g., MPU6050).
Key Features include:
✅ Odometry publishing via custom create_robot ROS 2 driver
✅ LiDAR-based mapping and obstacle detection
✅ Visualization and monitoring in RViz2
✅ IMU support for enhanced pose estimation
✅ Real-time mapping with SLAM Toolbox
✅ Autonomous path planning using the ROS 2 Navigation Stack
If you’re looking to repurpose your old Roomba into a smart robot with perception and navigation skills, this is a fun and practical way to do it!

Refer to the README for the details.

  • Waypoint Navigation

    Vipin M • 08/01/2025 at 15:36 • 0 comments

    In robotics development, valuable capabilities often exist in the framework long before custom implementations become necessary.

    Discovery Through Configuration Audit

    While preparing to implement waypoint navigation on the Perceptor robot, I found that the functionality was already pre-configured in the existing nav2_params.yaml file. The waypoint_follower node was enabled and correctly parameterized from the initial Nav2 integration. No additional code or infrastructure changes were required.

    This highlights the completeness of the Nav2 stack—key features are often functional by default, requiring only proper invocation through existing interfaces.

    RViz as a Control Surface for Waypoints

    RViz's Nav2 plugin includes a visual interface for managing waypoint navigation without writing ROS action clients or manually specifying coordinates. The waypoint execution process is fully operable through GUI interaction:

    1. Launch the standard Nav2 stack

    2. Open RViz with appropriate Nav2 config

    3. Set the initial pose using the "2D Pose Estimate" tool

    4. Enable "Waypoint Mode" in the Nav2 panel

    5. Click target waypoints on the map (clicks are sequentially ordered)

    6. Click "Start Waypoint Following" to execute the route

    RViz publishes waypoint goals via the FollowWaypoints action interface. Each waypoint is visualized on the map in real time.

    No code, YAML editing, or coordinate calculation is required.

    Compatibility with Existing Navigation Features

    Waypoint navigation inherits all runtime behavior defined by the motion planning and costmap layers:

    • Keepout Zones: The global planner automatically avoids keepout areas during path generation between waypoints. If a waypoint lies within a restricted zone, the planner reroutes to the nearest valid pose.

    • Speed Filter Zones: The robot dynamically adjusts velocity based on configured speed masks while traversing waypoints.

    • Dynamic Obstacle Avoidance: The local planner remains active, enabling real-time rerouting in response to moving obstacles.

    This decoupling is possible because waypoint navigation operates at the task layer, while filters and collision avoidance operate at the path and motion planning layers.

    Use Cases and Practical Deployment

    Visual waypoint management enables multiple real-world applications without requiring custom software:

    • Security Patrols: Define patrol loops by placing sequential perimeter waypoints.

    • Inspection Routes: Navigate to sensors or hardware locations for regular inspection.

    • Cleaning Passes: Emulate room-based or grid-style cleaning patterns with precise coverage.

    • Demonstration Paths: Quickly generate visible motion plans for field demos and validation.

    All of these can be configured ad hoc through the RViz interface in seconds.

    Advanced Capabilities in Waypoint Follower

    The waypoint_follower node supports advanced configurations beyond RViz usage:

    • Looped Missions: Enable automatic looping of waypoint sequences.

    • Failure Handling: Skip failed waypoints, retry with alternate paths, or abort on failure.

    • Waypoint Behaviors: Trigger plugin-based behaviors at specific waypoints (e.g., sensor polling, actuator control).

    • API Integration: Expose the ROS 2 FollowWaypoints action interface for programmatic mission control and external system integration.

    RViz usage is optional—programmatic access remains available via standard ROS 2 action clients.
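
    For anyone who prefers scripting a mission, here is a minimal rclpy sketch of that programmatic route, assuming the default follow_waypoints action name and map-frame goals; the coordinates are placeholders:

    # Minimal sketch: send a waypoint route to Nav2's waypoint_follower via the
    # FollowWaypoints action. Frame name and coordinates are placeholders.
    import rclpy
    from rclpy.action import ActionClient
    from rclpy.node import Node
    from geometry_msgs.msg import PoseStamped
    from nav2_msgs.action import FollowWaypoints


    def make_pose(node, x, y):
        pose = PoseStamped()
        pose.header.frame_id = 'map'
        pose.header.stamp = node.get_clock().now().to_msg()
        pose.pose.position.x = x
        pose.pose.position.y = y
        pose.pose.orientation.w = 1.0  # facing along +x; yaw omitted for brevity
        return pose


    def main():
        rclpy.init()
        node = Node('waypoint_mission')
        client = ActionClient(node, FollowWaypoints, 'follow_waypoints')
        client.wait_for_server()

        goal = FollowWaypoints.Goal()
        goal.poses = [make_pose(node, 1.0, 0.0),
                      make_pose(node, 1.0, 1.0),
                      make_pose(node, 0.0, 0.0)]

        # Send the whole route and block until waypoint_follower reports back.
        send_future = client.send_goal_async(goal)
        rclpy.spin_until_future_complete(node, send_future)
        result_future = send_future.result().get_result_async()
        rclpy.spin_until_future_complete(node, result_future)
        node.get_logger().info(
            f'Missed waypoints: {result_future.result().result.missed_waypoints}')
        node.destroy_node()
        rclpy.shutdown()


    if __name__ == '__main__':
        main()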

    Observations on Framework Design

    The fact that waypoint_follower was pre-configured and operational illustrates a design strength of Nav2: core features ship enabled with sensible defaults and integrate cleanly with the rest of the stack.

    This architecture lowers the barrier to implementation by:

    • Minimizing manual configuration

    • Enabling discovery through GUI interfaces

    • Promoting modular task-level behaviors layered on top of reusable motion infrastructure

    Future Work

    Having verified waypoint navigation and integrated it into Perceptor's operational...


  • From SLAM to Smart Navigation

    Vipin M • 08/01/2025 at 06:44 • 0 comments

    Following the previous success in fixing laser scan drift, I upgraded the Perceptor robot from a basic SLAM platform into a modular autonomous navigation system with advanced features such as keepout zones and variable speed control.

    Overview

    In the previous entry, I addressed laser scan drift on an RPLiDAR-equipped iRobot Create 2. After frame correction and sensor calibration, SLAM stability was achieved. This post focuses on the transition from mapping to fully autonomous navigation using ROS 2 and Nav2.

    SLAM to Navigation with Nav2

    Autonomous navigation requires multiple components beyond SLAM:

    • Localization: Determining robot pose on a known map

    • Path Planning: Computing viable paths to target poses

    • Obstacle Avoidance: Adapting to static and dynamic objects

    • Motion Control: Executing velocity commands safely

    ROS 2’s Navigation Stack (Nav2) enables this functionality but demands hardware-aware configuration.

    Replacing SLAM with AMCL

    I replaced SLAM with AMCL (Adaptive Monte Carlo Localization) to localize using a static map. This allowed me to retain the same sensor suite while shifting to a particle filter approach:

    amcl:
      ros__parameters:
        laser_model_type: "likelihood_field"
        max_particles: 2000
        min_particles: 500
        initial_pose:
          x: 0.0
          y: 0.0
          yaw: 0.0
    

    Despite a minimal sensor stack, AMCL yielded reliable localization across varied test environments.
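
    As a side note, the initial pose doesn't have to come from RViz's "2D Pose Estimate" tool; the same thing can be done by publishing once on /initialpose, which AMCL subscribes to by default. A rough sketch (the pose and covariance values are placeholders):

    # Sketch: seed AMCL programmatically by publishing one PoseWithCovarianceStamped
    # on /initialpose. Pose values and covariances below are placeholders.
    import time
    import rclpy
    from rclpy.node import Node
    from geometry_msgs.msg import PoseWithCovarianceStamped


    def main():
        rclpy.init()
        node = Node('set_initial_pose')
        pub = node.create_publisher(PoseWithCovarianceStamped, 'initialpose', 10)

        msg = PoseWithCovarianceStamped()
        msg.header.frame_id = 'map'
        msg.header.stamp = node.get_clock().now().to_msg()
        msg.pose.pose.position.x = 0.0
        msg.pose.pose.position.y = 0.0
        msg.pose.pose.orientation.w = 1.0  # yaw = 0
        msg.pose.covariance[0] = 0.25      # x variance
        msg.pose.covariance[7] = 0.25      # y variance
        msg.pose.covariance[35] = 0.07     # yaw variance

        time.sleep(1.0)   # give discovery a moment before the one-shot publish
        pub.publish(msg)
        node.destroy_node()
        rclpy.shutdown()


    if __name__ == '__main__':
        main()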

    Modular Launch Design

    To streamline development and debugging, I adopted a modular launch approach across four terminals:

    # Terminal 1: Base and sensor drivers
    ros2 launch perceptor launch_robot.launch.py
    
    # Terminal 2: Localization
    ros2 launch perceptor localization_launch.py map:=home.yaml
    
    # Terminal 3: Nav2 stack
    ros2 launch perceptor navigation_launch.py
    
    # Terminal 4: Optional filters and extensions
    ros2 launch perceptor keepout_extension.launch.py
    

    Benefits:

    • Isolated debugging per component

    • Independent lifecycle control

    • Scalable architecture for feature integration

    • Faster iteration during development
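
    For reference, each of those terminals corresponds to a small Python launch file. A stripped-down sketch of what the localization one might look like; the package name, map path, and parameter file mirror this project's layout and are otherwise placeholders:

    # Sketch of a minimal modular localization launch file (localization_launch.py).
    # Package and file names are illustrative; check your own workspace layout.
    import os
    from ament_index_python.packages import get_package_share_directory
    from launch import LaunchDescription
    from launch_ros.actions import Node


    def generate_launch_description():
        pkg_share = get_package_share_directory('perceptor')
        map_file = os.path.join(pkg_share, 'maps', 'home.yaml')
        amcl_params = os.path.join(pkg_share, 'config', 'nav2_params.yaml')

        return LaunchDescription([
            # Static map served to AMCL and the planners
            Node(package='nav2_map_server', executable='map_server',
                 name='map_server', output='screen',
                 parameters=[{'yaml_filename': map_file}]),
            # Particle-filter localization against that map
            Node(package='nav2_amcl', executable='amcl',
                 name='amcl', output='screen',
                 parameters=[amcl_params]),
            # Lifecycle manager brings both nodes to the active state
            Node(package='nav2_lifecycle_manager', executable='lifecycle_manager',
                 name='lifecycle_manager_localization', output='screen',
                 parameters=[{'autostart': True,
                              'node_names': ['map_server', 'amcl']}]),
        ])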

    Keepout Zones via Costmap Filters

    To restrict the robot from entering specific regions, I implemented keepout zones using Nav2's costmap filters. Required components:

    1. Binary mask image defining exclusion zones

    2. Filter info server to define spatial metadata

    3. Integration with global costmap

    Correct alignment between map and mask was critical. Misalignment led to unintended path planning behavior.

    keepout_filter:
      plugin: "nav2_costmap_2d::KeepoutFilter"
      enabled: True
      filter_info_topic: "/costmap_filter_info"
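
    The mask and its metadata are served by separate nodes from the Nav2 map_server package. A sketch of that side of the setup, matching the filter_info_topic above; the mask filename and launch structure are placeholders:

    # Sketch of the keepout "filter info server" side: one map_server serving the
    # binary mask and one costmap_filter_info_server describing it.
    import os
    from ament_index_python.packages import get_package_share_directory
    from launch import LaunchDescription
    from launch_ros.actions import Node


    def generate_launch_description():
        mask_file = os.path.join(
            get_package_share_directory('perceptor'), 'maps', 'keepout_mask.yaml')

        return LaunchDescription([
            Node(package='nav2_map_server', executable='map_server',
                 name='filter_mask_server', output='screen',
                 parameters=[{'yaml_filename': mask_file,
                              'topic_name': '/keepout_filter_mask'}]),
            Node(package='nav2_map_server', executable='costmap_filter_info_server',
                 name='costmap_filter_info_server', output='screen',
                 parameters=[{'filter_info_topic': '/costmap_filter_info',
                              'type': 0,                     # 0 = keepout filter
                              'mask_topic': '/keepout_filter_mask',
                              'base': 0.0, 'multiplier': 1.0}]),
            Node(package='nav2_lifecycle_manager', executable='lifecycle_manager',
                 name='lifecycle_manager_costmap_filters', output='screen',
                 parameters=[{'autostart': True,
                              'node_names': ['filter_mask_server',
                                             'costmap_filter_info_server']}]),
        ])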

    Speed Filter Configuration Issue: Missing Publisher

    The speed filter initially had no effect because no node was publishing the speed limit:

    ros2 topic info /speed_limit --verbose
    # Publisher count: 0  <- No effect on speed
    

    Adding the following parameter resolved the issue:

    speed_filter:
      plugin: "nav2_costmap_2d::SpeedFilter"
      enabled: True
      filter_info_topic: "/speed_filter_info"
      speed_limit_topic: "/speed_limit"
    

    Without this, the filter parses the mask but never communicates velocity limits to the controller.
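
    Beyond ros2 topic info, a tiny rclpy subscriber makes it obvious whether speed limits are actually flowing once the parameter is in place. A sketch; the field names follow the nav2_msgs SpeedLimit definition:

    # Quick check that the speed filter is publishing: print every /speed_limit message.
    import rclpy
    from rclpy.node import Node
    from nav2_msgs.msg import SpeedLimit


    class SpeedLimitWatcher(Node):
        def __init__(self):
            super().__init__('speed_limit_watcher')
            self.create_subscription(SpeedLimit, '/speed_limit', self.on_limit, 10)

        def on_limit(self, msg: SpeedLimit):
            unit = '%' if msg.percentage else 'm/s'
            self.get_logger().info(f'Speed limit in effect: {msg.speed_limit} {unit}')


    def main():
        rclpy.init()
        rclpy.spin(SpeedLimitWatcher())
        rclpy.shutdown()


    if __name__ == '__main__':
        main()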

    Integration Challenges

    Coordinate Frame Alignment

    All masks and maps required:

    • Identical origins

    • Same resolution (0.05 m/pixel)

    • Cross-verification tools to ensure alignment
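
    One such cross-check can be as simple as comparing the metadata recorded in the two YAML files. A small sketch; the file names are placeholders:

    # Compare resolution and origin between the map YAML and the keepout mask YAML.
    import yaml

    def load_meta(path):
        with open(path) as f:
            meta = yaml.safe_load(f)
        return meta['resolution'], meta['origin']

    map_res, map_origin = load_meta('home.yaml')
    mask_res, mask_origin = load_meta('keepout_mask.yaml')

    if map_res != mask_res or map_origin != mask_origin:
        print(f'MISMATCH: map {map_res}/{map_origin} vs mask {mask_res}/{mask_origin}')
    else:
        print('Map and mask metadata agree.')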

    Simultaneous Filter Operation

    Running multiple filters required:

    • Unique topic names

    • Independent node lifecycles

    • Distinct map servers per filter

    Performance Optimization

    On Raspberry Pi 5:

    • CPU usage was monitored during planning

    • Memory overhead increased with additional costmap layers

    • Modular design enabled disabling unused filters to conserve resources

    Current Capabilities

    Perceptor now supports:

    • Stable indoor localization (±5 cm)

    • Obstacle avoidance and path planning (planning time < 200 ms)

    • Keepout enforcement with high spatial precision

    • Speed-controlled navigation based on map zones

    • Modular launch for component-based deployment

    Best Practices and Takeaways

    • Configuration Depth:...


  • SLAM Debugging: When Perceptor's Laser Scans Drift from the Map

    Vipin M • 07/28/2025 at 15:35 • 0 comments

    The Problem: Laser Scan Drift

    While running SLAM on my Create 2-based robot with an RPLiDAR, I noticed an odd issue. Initially, the map looked fine and the robot localized correctly, but over time, the laser scan visualization in RViz started drifting away from the map. The pink laser scan lines—meant to reflect real-world walls and obstacles—became increasingly misaligned. The map itself didn't distort, but the scan data no longer lined up, making it hard to trust the real-time sensor output.

    The Cause: Coordinate Frame Misalignment

    After some investigation, I found the issue was a coordinate frame mismatch. Specifically:

    • The LiDAR’s internal 0° direction was pointing backward, opposite to the robot’s forward direction.

    • ROS and SLAM algorithms expect the laser’s +X axis to face forward.

    • This mismatch introduced a 180° angular offset, which affected how odometry and scan data fused over time.

    The SLAM system was trying to make sense of data coming in at the wrong orientation, which led to scan drift, even though mapping and localization initially appeared correct.

    The Fix: Correcting the Laser Frame Orientation

    To fix this, I modified the URDF to rotate the laser frame by 180° around the Z-axis. I added this transform to align the LiDAR’s forward direction with the robot’s:

    <origin xyz="0 0 0.1" rpy="0 0 3.14159" />
    

    This single change ensured that:

    • The laser’s 0° angle points in the robot’s forward direction.

    • SLAM receives properly oriented scan data.

    • TF transformations (map → odom → base_link → laser) stay consistent.

    The Result: Solid, Aligned SLAM

    After making the adjustment:

    • Laser scans in RViz stay aligned with the map.

    • Real-time obstacle data matches physical surroundings.

    • The robot can map reliably with minimal drift.

    Current Status

    Using SLAM Toolbox, I’m now able to generate accurate maps. I briefly tested AMCL by running Nav2 with the saved map published via the map server. Localization appears accurate, and the robot’s estimated position matches the environment. The next step is to test autonomous navigation using this setup.

  • Initial Challenges: When SLAM Goes Wrong

    Vipin M • 07/28/2025 at 02:36 • 0 comments

    The Symptoms

    My SLAM system showed classic drift symptoms that pointed to deeper issues:

    • Laser Scans Moving with Robot: In RViz, the pink laser scan dots moved with the robot instead of remaining fixed in the map frame.

    • Frame Separation: I noticed three distinct coordinate origins — the robot, odometry frame, and map frame — all offset from each other.

    • Transform Extrapolation Errors: RViz logs were filled with timing-related errors.

    • Map Distortion: The map looked okay when stationary but warped during motion.

    The Confusion

    What confused me most: laser scans drifted during straight-line motion but partially realigned during rotation. Initially, I mistook this for working loop closure. But after deeper analysis, I realized loop closure is not functioning properly yet. The system is trying — but it isn’t successfully closing loops or correcting long-term drift.

    🔍 Diagnostic Process: Systematic Debugging

    Step 1: Transform Tree Analysis

    I inspected the transform tree:

    • Key Finding: The transform chain (map → odom → base_footprint → base_link → laser) was structurally correct, but timestamps were often misaligned, causing extrapolation errors.

      # Check TF tree structure
      ros2 run tf2_tools view_frames
      # Monitor specific frame relationships
      ros2 run tf2_ros tf2_echo map odom
      ros2 run tf2_ros tf2_echo base_footprint base_link
      ros2 run tf2_ros tf2_echo base_link laser
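
    To put a number on the timestamp misalignment itself, a small rclpy node can compare each scan's header stamp against the node clock. This is a rough diagnostic sketch, not part of the SLAM pipeline; consistently large or erratic offsets point at startup/clock problems rather than geometry:

    # Rough diagnostic: measure how stale each /scan message is relative to this
    # node's clock. Large offsets explain tf extrapolation errors.
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import LaserScan


    class ScanLagCheck(Node):
        def __init__(self):
            super().__init__('scan_lag_check')
            self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

        def on_scan(self, msg: LaserScan):
            now = self.get_clock().now().nanoseconds
            stamp = msg.header.stamp.sec * 10**9 + msg.header.stamp.nanosec
            self.get_logger().info(f'scan lag: {(now - stamp) / 1e6:.1f} ms')


    def main():
        rclpy.init()
        rclpy.spin(ScanLagCheck())
        rclpy.shutdown()


    if __name__ == '__main__':
        main()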

    Step 2: Topic Rate and Timestamp Validation

    I measured the topic rates and confirmed mismatches between TF publishing and sensor data timestamps.

    # Check component frequencies
    ros2 topic hz /scan   # Result: 7.5Hz (good)
    ros2 topic hz /tf     # Result: 70Hz (excellent)
    ros2 topic hz /map    # Result: 0.2Hz (every 5 seconds)
    ros2 topic hz /odom   # Result: 10Hz (adequate)

    Step 3: Odometry Accuracy Evaluation

    To understand how much odometry might be contributing to drift, I tested it in isolation:

    • Commanded: 1.0m forward

    • Actual: 1.073m forward

    • Error: ~7.3% — surprisingly decent for a Create 2

    # Linear accuracy test
    echo "Starting position:"
    ros2 topic echo /odom --once | grep -A3 "position:"
    
    # Move exactly 1 meter forward
    timeout 20s ros2 topic pub /cmd_vel_nav geometry_msgs/msg/Twist \
      "{linear: {x: 0.05, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}" --rate 10
    
    echo "Final position:"
    ros2 topic echo /odom --once | grep -A3 "position:" 

    Step 4: SLAM Correction Attempt Monitoring

    I tried to monitor how well SLAM was correcting drift over short tests:

    • Before Movement: [0.000, 0.000, -0.017]

    • After Movement: [-0.020, 0.023, -0.017]

    • The small correction suggests some adjustment, but not the kind of large-scale loop closure I expected.

    # Monitor map-odom transform during movement
    ros2 run tf2_ros tf2_echo map odom 

    🛠️ Partial Fixes and Improvements So Far

    ✅ Transform Timing Stabilized

    I improved the startup order and transform publication frequency, which eliminated most extrapolation errors.

    # Correct startup order
    # Terminal 1: Start robot first, wait for full initialization
    ros2 launch perceptor launch_robot.launch.py enable_camera:=false
    
    # Terminal 2: Start SLAM after robot is ready  
    ros2 launch slam_toolbox online_async_launch.py \
      params_file:=/home/pi/Roomba/slam_dev_ws/src/perceptor/config/mapper_params_online_async.yaml \
      use_sim_time:=false
    
    # Terminal 3: Start RViz last
    ros2 run rviz2 rviz2 -d src/perceptor/config/map.rviz

    ✅ Removed Feedback Loop in Twist_Mux

    I had incorrectly set both input and output of twist_mux to /cmd_vel, creating an infinite feedback loop. Separating them fixed the phantom motion issue.

    # config/twist_mux.yaml - BEFORE (broken)
    navigation:
      topic: cmd_vel      # ❌ Same as output - creates feedback loop!
      timeout: 0.5
      priority: 10
    
    # AFTER (fixed)  
    navigation:
      topic: cmd_vel_nav  # ✅ Separate input topic
      timeout: 0.5
      priority: 10 

    ✅ Simplified Configuration Files

    Merged redundant joystick configuration files for better maintainability....


  • Upgrading to Raspberry Pi 5 and ROS 2 Jazzy

    Vipin M • 07/20/2025 at 17:56 • 0 comments

    The journey to transform a Roomba 650 into a fully autonomous SLAM robot continues with some significant hardware and software upgrades. This update covers the migration from Raspberry Pi 3 to Pi 5, the transition to ROS 2 Jazzy, power management challenges, and getting gamepad control working.

    The Great Migration: Pi 3 to Pi 5

    Initial Challenges with Pi 3

    The original plan involved using a Raspberry Pi 3 with Ubuntu 22.04 Core and ROS 2 Humble. However, this setup proved problematic due to missing ROS components and the limitations of Ubuntu Core's snap-based package management. While Ubuntu Desktop isn't officially supported on Pi 3 due to its 1GB RAM limitation, the constant troubleshooting was becoming a distraction from the core SLAM development.

    Future-Proofing with Pi 5

    Given the project's eventual goals of heavy-duty image processing and AI integration, I decided to future-proof the system by upgrading to a Raspberry Pi 5. This decision came with its own cascade of changes:

    • Pi 5 requires Ubuntu 24.04 (no 22.04 support)
    • Ubuntu 24.04 ships with ROS 2 Jazzy (not Humble)
    • Development machine also needed upgrading to maintain version consistency

    The migration to Ubuntu 24.04 and ROS 2 Jazzy on both the Pi and development machine was necessary to ensure compatibility across the entire development pipeline.

    Power Management: The Pi 5 Challenge

    Current Limitations

    The Pi 5's higher power requirements (up to 5A at 5V) created an immediate problem. The Roomba's serial port can only source 200mA before triggering its PTC resettable fuse. This meant the original power-over-serial approach was no longer viable.

    Hardware Modifications

    The solution involved modifying the custom cable by removing the voltage regulator circuit. Now the Pi 5 runs on external power while maintaining serial communication with the Roomba. When externally powered, the Pi 5 actually sources 5V to the level shifter rather than sinking it.

    Future Power Solutions

    The current "tethered" approach is temporary. Future plans include:

    • Tapping directly into the Roomba's battery pack
    • Using a USB-C PD 3.0 compliant charging module
    • Maintaining the 200mA limit on the serial interface

    Until then, the Roomba sports a power "tail" for development work.

    UART Configuration on Pi 5

    Unlike previous Pi models, the Pi 5 doesn't have UART pins enabled by default. This required:

    1. Enable UART via raspi-config
    2. Add dtparam=uart0 to /boot/firmware/config.txt

    These changes ensure the GPIO pins 14 (TX) and 15 (RX) function properly for serial communication.
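
    Before involving the Roomba at all, a quick loopback test (GPIO 14 jumpered to GPIO 15) confirms the UART is alive. A sketch using pyserial; depending on configuration the device may show up as /dev/ttyAMA0 or /dev/serial0:

    # Loopback sanity check for the Pi 5 UART: with TX (GPIO 14) jumpered to RX
    # (GPIO 15), whatever we write should come straight back. Requires pyserial.
    import serial

    with serial.Serial('/dev/ttyAMA0', baudrate=115200, timeout=1) as port:
        port.write(b'perceptor-uart-test\n')
        echoed = port.readline()
        print('UART OK' if echoed == b'perceptor-uart-test\n' else f'Unexpected: {echoed!r}')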

    ROS 2 Jazzy Integration

    Installation and Verification

    ROS 2 Jazzy was successfully installed on the Pi 5 and verified using the standard talker/listener demo. The create_robot driver was then brought up, establishing bidirectional communication:

    • Robot control via /cmd_vel topic
    • Sensor data publishing on various topics (/odom, /scan, /battery/voltage, etc.)

    Teleoperation Success

    The system now responds perfectly to teleop_twist_keyboard commands, with real-time visualization in RViz2 showing the robot model moving in sync with the physical Roomba.

    Gamepad Control Integration

    Nintendo Pro Controller Setup

    A Nintendo Switch Pro Controller was successfully paired via Bluetooth. The system logs show proper recognition:

    nintendo 0005:057E:2009.0008: hidraw0: BLUETOOTH HID v80.01 Gamepad [Pro Controller]
    input: Pro Controller as /devices/platform/soc/.../input/input23

    ROS 2 Joy Integration

    The ROS 2 joy_node successfully interfaces with the Pro Controller:

    ros2 run joy joy_node
    [INFO] [joy_node]: Opened joystick: Pro Controller. deadzone: 0.050000

    Gamepad Control Success

    The existing joy_teleop.launch file from the create_robot package worked perfectly with the Nintendo Pro Controller. Using the pro_controller.yaml configuration file, the system launched successfully:...


  • Of Bots, Beeps, and Bumps: Prepping Roomba for SLAM

    Vipin M • 07/18/2025 at 14:26 • 0 comments

    The journey to turning a Roomba 650 into a ROS 2-enabled SLAM and navigation robot has been eventful. Although the SLAM stack isn't powered up yet, much groundwork has been completed to reach that point. This update documents the process, decisions made along the way, and a glimpse of what’s next.

    Hardware Interface and Serial Communication

    The project began with a deep dive into the iRobot Create Open Interface (OI) specification to understand the Roomba’s serial communication requirements. From there, the Roomba's 7-pin mini-DIN port was mapped to identify the relevant pins for TX, RX, and power.

    It became clear that the Roomba's serial interface uses 5V logic, while the Raspberry Pi's UART operates at 3.3V and is not 5V tolerant. To bridge this, I built a custom cable incorporating a level shifter. A DC-DC voltage regulator was also added to power the Raspberry Pi directly from the Roomba's onboard battery.

    The initial attempt to communicate between the Pi and the Roomba was unsuccessful. To debug, I replaced the Roomba with an Arduino Uno and attempted a loopback test through the Arduino, but that too failed. I then created a local loopback on the Pi (TX jumpered to RX), which succeeded and reassured me that the Pi's GPIOs were not fried. Along the way, the correct serial port abstraction was identified (/dev/ttyAMA0 or /dev/serial0), which had been part of the earlier confusion.

    This progress made it worthwhile to improve the hardware connections, so a custom header was fabricated for the Raspberry Pi's 40-pin connector to secure the wiring and allow swapping Pis without rewiring. With renewed confidence, I returned to the Roomba and successfully established communication.

    Validating Control and Sensor Feedback

    With the hardware connection sorted, the next step was to develop basic Python scripts to:

    • Start the Roomba’s OI

    • Drive forward, reverse, and rotate

    • Query sensor data including bump sensors, battery status, and distance traveled

    This formed the first real confirmation that the Roomba was responding properly to commands issued from the Raspberry Pi.
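
    Those scripts boiled down to writing raw Open Interface opcodes over the serial link. A condensed sketch of the idea is below; the opcodes and packet IDs are taken from the OI spec (128 Start, 131 Safe, 137 Drive, 142 Sensors) and should be double-checked against it before reuse:

    # Condensed sketch of the early test scripts: raw Open Interface commands over serial.
    import struct
    import time
    import serial

    port = serial.Serial('/dev/serial0', baudrate=115200, timeout=1)

    port.write(bytes([128]))          # Start: enter the Open Interface
    port.write(bytes([131]))          # Safe mode: driving allowed, cliff/bump safety on
    time.sleep(0.1)

    # Drive straight at 100 mm/s (radius 0x8000 means "straight" per the OI spec)
    port.write(bytes([137]) + struct.pack('>hH', 100, 0x8000))
    time.sleep(2.0)
    port.write(bytes([137]) + struct.pack('>hH', 0, 0x8000))   # stop

    # Query the bump/wheel-drop sensor packet (ID 7, one byte)
    port.write(bytes([142, 7]))
    bumps = port.read(1)
    print('bump/wheel-drop byte:', bumps.hex() if bumps else 'no reply')
    port.close()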

    Moving to ROS 2: create_robot Package

    Next, I turned to the create_robot ROS 2 package from AutonomyLab. After cloning and building the package on the Raspberry Pi, the create_driver was launched:

    ros2 launch create_bringup create_driver.launch.py serial:=/dev/serial0 baud:=115200
    

    Once running, several topics became available:

    • /odom for odometry

    • /cmd_vel for velocity commands

    • /battery/voltage

    • /bumper

    • /wheel_drop

    • /cliff

    • /light_bumper

    • /ir_opcode

    • /dock

    • /buttons
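
    With those topics up, a few lines of rclpy are enough to command motion directly, which is a handy sanity check of the driver before setting up any teleoperation. A sketch; the velocity is deliberately tiny:

    # Minimal sanity check of the create_driver interface: publish a slow forward
    # Twist on /cmd_vel for two seconds, then stop.
    import time
    import rclpy
    from rclpy.node import Node
    from geometry_msgs.msg import Twist

    rclpy.init()
    node = Node('cmd_vel_smoke_test')
    pub = node.create_publisher(Twist, 'cmd_vel', 10)

    msg = Twist()
    msg.linear.x = 0.05            # m/s, slow forward creep
    end = time.time() + 2.0
    while time.time() < end:
        pub.publish(msg)
        time.sleep(0.1)

    pub.publish(Twist())           # zero Twist = stop
    node.destroy_node()
    rclpy.shutdown()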

    Exploring Teleoperation

    To control the robot, I started with teleop_twist_keyboard, a ROS 2 package that allows driving via the keyboard. While exploring options for more intuitive control, I discovered two existing controllers at home: a Flysky RC transmitter used for drones and a Nintendo Switch controller. The latter was stolen from my son's stash, and the former is something we both use when we go out to fly drones and foamies.

    The Switch controller was an easy pick because it could pair directly to the Pi via Bluetooth. The Flysky controller would have required an additional receiver, although I learned that with the inputattach tool, it's now possible to integrate such RC transmitters with Linux as joystick devices. This integration isn't complete yet but is on the list for a future blog post.

    Setting Up RPLiDAR with ROS 2

    With Roomba control and telemetry operational, the next focus was on integrating the RPLiDAR A1M8. This was first brought up on a desktop Linux machine running Ubuntu 22.04 and ROS 2 Humble, to simplify testing.

    Once verified, the setup was migrated to the Raspberry Pi. The rplidar_node published data on the /scan topic, confirming that the LIDAR was functioning.

    A split architecture was then tested:

    • The Pi handled the LIDAR and published /scan

    • A host computer ran RViz2 for visualization via...



Discussions

Simon Jansen wrote 08/07/2025 at 13:33 point

Cool project! I'm trying to do sensor fusion with the Roomba encoders & an MPU6050 as well.

Check it out: https://hackaday.io/project/183524-old-roomba-new-tricks

Did you accomplish this or is this something you're still working on? I'm looking for the EKF input parameters (covariance matrix).

