Intro

A guide to Simultaneous Localization and Mapping (SLAM) using ROS and the slam_toolbox package. In particular, the video focuses on grid SLAM with a 2D lidar.

SLAM Overview

There are 2 main types of SLAM: feature/landmark SLAM (tracking distinct objects, like houses) and grid SLAM (dividing the environment into cells). slam_toolbox uses a grid-map-based approach.

ROS and SLAM

ROS coordinate frames


  • map – fixed global frame; origin arbitrarily chosen.
  • odom – fixed at the robot's start pose; drifts slowly over time.
  • base_footprint – 2D robot pose; origin under the robot's center; moves with the robot.
  • base_link – 3D robot frame; origin at the robot's center/pivot; moves with the robot.
  • laser_link – lidar frame; fixed relative to base_link.
  • The frame attached to the robot is called base_link. The odom frame represents the world origin at startup, and the odom → base_link transform is calculated (e.g. by the differential drive controller) from wheel odometry.

Wheel Odometry

  • Any small section of the odometry is smooth, with minimal error, but error compounds over time
  • Odometry effectively measures robot velocity, which is integrated to estimate position (dead reckoning)
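The dead-reckoning step above can be sketched for a differential-drive robot; the wheel velocities, track width, and time step below are hypothetical, not taken from the tutorial robot:

```python
import math

def integrate_odometry(pose, v_left, v_right, wheel_separation, dt):
    """Dead reckoning: integrate wheel velocities into a 2D pose (x, y, theta)."""
    x, y, theta = pose
    v = (v_right + v_left) / 2.0               # forward velocity
    w = (v_right - v_left) / wheel_separation  # angular velocity
    # Euler integration: each small step has minimal error,
    # but the error compounds over many steps
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return (x, y, theta)

# Drive straight at 0.5 m/s for 2 s (100 steps of 0.02 s)
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_odometry(pose, 0.5, 0.5, 0.35, 0.02)
print(pose)  # x ≈ 1.0, y ≈ 0.0, theta ≈ 0.0
```

Any noise in the measured wheel velocities would be integrated along with the signal, which is why the estimate drifts over time.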

Correcting for drift

  • To reduce compounded error, robots fuse odometry with other sensors:


| Sensor | Helps Correct           | Common Framework     |
| ------ | ----------------------- | -------------------- |
| IMU    | Short-term motion drift | EKF or sensor fusion |
| LiDAR  | Position drift          | LiDAR SLAM           |
| Camera | Visual drift            | Visual SLAM          |
| GPS    | Global drift            | Outdoor navigation   |
  • These corrections update the odom → base_link transform, however, which can cause the robot's pose to appear to jump around discontinuously

Map frame

  • The map frame shows the robot’s position relative to the global origin of the environment.
  • A frame in ROS can only have one parent, so the localization system publishes its correction as the map → odom transform rather than giving base_link a second parent
  • Why it’s separate: The odom → base_link transform provides smooth, high-frequency updates of the robot’s motion, while the map → odom transform applies slower corrections from SLAM or localization
  • Composing them (map → odom → base_link) gives a globally accurate and smooth pose without sudden jumps
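The composition map → odom → base_link can be illustrated with 2D homogeneous transforms; all numbers below are hypothetical:

```python
import math

def make_tf(x, y, theta):
    """2D homogeneous transform (3x3) as nested lists."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def compose(a, b):
    """Matrix product a @ b: chain transform b onto transform a."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# odom -> base_link: smooth, high-frequency wheel-odometry estimate
odom_to_base = make_tf(2.0, 0.0, math.pi / 2)
# map -> odom: slow drift correction published by SLAM/localization
map_to_odom = make_tf(0.1, -0.05, 0.02)
# map -> base_link: globally accurate robot pose in the map frame
map_to_base = compose(map_to_odom, odom_to_base)
print(map_to_base[0][2], map_to_base[1][2])  # robot x, y in the map frame
```

When SLAM refines its drift estimate, only map_to_odom changes; the smooth odom_to_base stream is untouched, so consumers of odom-frame data see no jumps.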

Topics

  • /odom
    • Type: nav_msgs/msg/Odometry
    • Same pose information as the odom → base_link TF
    • Also includes velocity and covariances
  • /map
    • Type: nav_msgs/msg/OccupancyGrid
    • occupancy data for grid map
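The OccupancyGrid data field is a flat row-major array: each cell is -1 (unknown), 0 (free), or up to 100 (occupied). A minimal indexing sketch, with made-up grid values:

```python
# A hypothetical 4x3 grid, row-major, as in nav_msgs/msg/OccupancyGrid
width, height = 4, 3
data = [
    -1,  -1,   0,   0,    # row 0 (y = 0)
     0,   0,   0, 100,    # row 1
     0, 100, 100, 100,    # row 2
]

def cell(x, y):
    """Occupancy value at grid cell (x, y); flat index = y * width + x."""
    return data[y * width + x]

print(cell(3, 1))  # 100: occupied
print(cell(0, 0))  # -1: unknown (never observed)
```

The grid's resolution and origin (from the message's metadata) map cell indices to metric coordinates in the map frame.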

base_footprint frame

  • base_footprint AKA “shadow” of the robot representing its contact with the ground (z = 0)

    • Orientation: Typically aligned with the robot’s yaw, but ignores pitch and roll
  • Note: The robot’s pose (position + orientation) in a given frame is equivalent to its transform from that frame.

    • In other words, the pose tells you where the robot’s frame (e.g., base_link) is relative to its parent frame (e.g., odom or map).
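The yaw-only orientation of base_footprint can be sketched by extracting yaw from an orientation quaternion (the example value is hypothetical):

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Yaw (rotation about Z) from a quaternion, discarding pitch and roll."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# Quaternion for a pure yaw of pi/2 about Z: (0, 0, sin(pi/4), cos(pi/4))
yaw = quaternion_to_yaw(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(yaw)  # ~1.5708; base_footprint keeps only this yaw, with z = 0
```

A base_footprint pose is then just (x, y, yaw) of base_link projected onto the ground plane.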

Setting up for slam_toolbox

  • In robot_core.xacro, add a base_footprint link, attached rigidly to base_link with no offset. For this robot, z = 0 is treated as the wheel axis.
    <!-- BASE_FOOTPRINT LINK -->
 
    <joint name="base_footprint_joint" type="fixed">
        <parent link="base_link"/>
        <child link="base_footprint"/>
        <origin xyz="0 0 0" rpy="0 0 0"/>
    </joint>
 
    <link name="base_footprint">
    </link>

Install the package

  • Install slam_toolbox
sudo apt install ros-${ROS_DISTRO}-slam-toolbox
  • This setup will be for online asynchronous mapping
    • Works on the live data stream rather than on recorded logs
    • Always processes the most recent scan (skipping scans if necessary) so it never lags behind when the processing rate is slower than the scan rate

Parameter file

  • Copy slam_toolbox params file into the workspace config directory
cp /opt/ros/${ROS_DISTRO}/share/slam_toolbox/config/mapper_params_online_async.yaml dev_ws/src/articubot_one/config/
  • Build and source the workspace
colcon build --symlink-install
source install/setup.bash

  • Launch simulation

ros2 launch articubot_one launch_sim.launch.py world:=./src/articubot_one/worlds/obstacles.world

Visualize in RViz

  • Launch rviz
rviz2 -d src/articubot_one/config/main.rviz
  • Launch slam_toolbox
ros2 launch slam_toolbox online_async_launch.py slam_params_file:=./src/articubot_one/config/mapper_params_online_async.yaml use_sim_time:=true
  • In rviz, add Map with /map topic, then change Fixed Frame to map

    • The robot may jump around, but the map will stay steady
    • If the fixed frame were left at odom, the robot would move smoothly but the map would sometimes jump around
  • Drive the robot around, and you should see a map generated in rviz

    • Change rviz view to TopDownOrthographic

Save the map

 
  • Show services from slam_toolbox
ros2 service list
  • Save map
ros2 service call /slam_toolbox/save_map slam_toolbox/srv/SaveMap "{name: 'my_map'}"
  • This saves:
    • my_map.yaml
      • Contains metadata such as the grid resolution and origin location
    • my_map.pgm
      • The actual cell occupancy data
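For reference, the saved my_map.yaml looks roughly like the following; the exact numbers depend on the mapping session, so these values are only illustrative:

```yaml
image: my_map.pgm            # file holding the cell occupancy data
resolution: 0.05             # meters per grid cell
origin: [-10.0, -10.0, 0.0]  # (x, y, yaw) of the lower-left cell in the map frame
negate: 0
occupied_thresh: 0.65        # cells above this probability are treated as occupied
free_thresh: 0.25            # cells below this are treated as free
```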
  • Use RViz plugin: Add SlamToolBoxPlugin panel 1 and use either format
    • Save Map (old format): “my_map_save”
    • Serialize Map: “my_map_serial” (.data and .posegraph)

Use map for localization

  • In the params file, change the mode, map_file_name, and map_start_at_dock parameters
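In mapper_params_online_async.yaml, the relevant changes look roughly like this; the map path is an example placeholder, and map_file_name should point at the serialized map (the .posegraph/.data pair) without its extension:

```yaml
mode: localization                        # switch from mapping to localization
map_file_name: /path/to/my_map_serial     # serialized map, path without extension
map_start_at_dock: true                   # start localized at the map's starting pose
```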
  • Rerun slam_toolbox
ros2 launch slam_toolbox online_async_launch.py slam_params_file:=./src/articubot_one/config/mapper_params_online_async.yaml use_sim_time:=true
  • The map should be loaded up, allowing the robot to localize against it

Localization with amcl

amcl (Adaptive Monte Carlo Localization), from the Nav2 stack, is an alternative way to localize against a saved map

Footnotes

  1. [Penn] T06 Running SLAM ↩