SLAM Debugging: When Perceptor's Laser Scans Drift from the Map

A project log for Perceptor - ROS2 Perception, Planning, Control

Modified Roomba with ROS2, LIDAR, and odometry running SLAM Toolbox for real-time mapping and navigation

Vipin M, 07/28/2025 at 15:35

The Problem: Laser Scan Drift

While running SLAM on my Create 2-based robot with an RPLiDAR, I noticed an odd issue. Initially, the map looked fine and the robot localized correctly, but over time, the laser scan visualization in RViz started drifting away from the map. The pink laser scan lines—meant to reflect real-world walls and obstacles—became increasingly misaligned. The map itself didn't distort, but the scan data no longer lined up, making it hard to trust the real-time sensor output.

The Cause: Coordinate Frame Misalignment

After some investigation, I found the issue was a coordinate frame mismatch. The laser frame defined in the URDF did not match the LiDAR's physical mounting, which sits rotated 180° about the Z-axis relative to the robot's base frame. As a result, the SLAM system was trying to make sense of scan data arriving at the wrong orientation, which led to scan drift even though mapping and localization initially appeared correct.
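Concretely, the laser joint's origin in my URDF had no yaw applied, roughly like the line below, while the physical LiDAR was mounted facing the opposite direction. (This "before" line is reconstructed from the fix that follows, not copied from the original file.)

<origin xyz="0 0 0.1" rpy="0 0 0" />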

The Fix: Correcting the Laser Frame Orientation

To fix this, I modified the URDF to rotate the laser frame by 180° around the Z-axis, so the LiDAR's forward direction lines up with the robot's:

<origin xyz="0 0 0.1" rpy="0 0 3.14159" />
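For context, that origin lives inside the fixed joint that attaches the LiDAR frame to the robot body. Below is a minimal sketch of how that joint can look in a URDF; the joint and link names (laser_joint, laser_link, base_link) are typical placeholders, not necessarily the exact names in my robot description.

<!-- Fixed joint attaching the LiDAR to the robot body.
     rpy="0 0 3.14159" yaws the laser frame by pi so its forward axis
     matches the robot's forward direction. -->
<joint name="laser_joint" type="fixed">
  <parent link="base_link" />
  <child link="laser_link" />
  <origin xyz="0 0 0.1" rpy="0 0 3.14159" />
</joint>
<link name="laser_link" />

The frame_id published by the LiDAR driver has to match the child link name (laser_link in this sketch); otherwise the corrected transform never gets applied to the scan data.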

This single change aligned the LiDAR's forward direction with the robot's, so incoming scans were interpreted in the correct orientation.

The Result: Solid, Aligned SLAM

After making the adjustment, the laser scans in RViz stayed aligned with the map instead of drifting away over time.

Current Status

Using SLAM Toolbox, I’m now able to generate accurate maps. I briefly tested AMCL by running Nav2 with the saved map published via the map server. Localization appears accurate, and the robot’s estimated position matches the environment. The next step is to test autonomous navigation using this setup.
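For reference, a localization test like that can be expressed as a small ROS 2 XML launch file that pulls in Nav2's standard localization bringup (map server plus AMCL). This is only a sketch, assuming the stock nav2_bringup package and its localization_launch.py; the map path is a placeholder, not my actual file.

<!-- Sketch: serve a saved map and run AMCL against it via nav2_bringup. -->
<launch>
  <include file="$(find-pkg-share nav2_bringup)/launch/localization_launch.py">
    <!-- Placeholder path to the map saved from SLAM Toolbox -->
    <arg name="map" value="/path/to/perceptor_map.yaml" />
    <arg name="use_sim_time" value="false" />
  </include>
</launch>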
