LidarView odometry-SLAM integration

I’m working on extending LidarView to integrate live external odometry data from my robot into the SLAM (Online) pipeline.
The goal is to let the SLAM algorithm fuse both LiDAR point cloud and odometry information for improved mapping accuracy and trajectory stability.

What I’m trying to achieve
I have a custom LiDAR plugin that publishes PointCloud2 data in real time.
Separately, I also have odometry data (from wheel encoders) available as ROS 2 messages.

I would like the SLAM (Online) module in LidarView to use this odometry as a motion prior / initial pose estimate during scan matching — effectively improving SLAM convergence and reducing drift.

What I have implemented
I’ve created a custom class derived from vtkPacketInterpreter that handles LiDAR data and also initializes a ROS 2 node.
Inside this interpreter, I’ve implemented a function:

void vtkCustomPacketInterpreter::SetManualSlamConnect(bool enable);

This function sets up the communication between my odometry publisher and the SLAM pipeline.

Now, I would like this function to be called automatically when the user selects “SLAM (Online)” from the LidarView GUI.

I found the corresponding XML proxy definition, and I’m considering either:

  1. Calling SetManualSlamConnect() inside the vtkSlam constructor, or

  2. Adding this to the XML so that the method is triggered when the SLAM filter is instantiated.
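For reference, option 2 would typically look like a boolean property bound to the setter in the filter’s server-manager XML. Since ParaView pushes default property values to the VTK object when the proxy is initialized, a default of 1 would invoke SetManualSlamConnect(true) at instantiation. The snippet below is illustrative, not the actual LidarView XML:

```xml
<IntVectorProperty name="ManualSlamConnect"
                   command="SetManualSlamConnect"
                   number_of_elements="1"
                   default_values="1">
  <BooleanDomain name="bool"/>
  <Documentation>
    When enabled, connects the external odometry stream to the SLAM filter.
  </Documentation>
</IntVectorProperty>
```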

My questions

    1. What is the recommended way to trigger a function (like SetManualSlamConnect()) when a filter such as SLAM (Online) is created or selected from the GUI?
      Should this be done via XML hints, directly in the vtkSlam constructor, or through a manager class like pqLidarSlamManager?

    2. If my external ROS 2 odometry data is successfully connected,
      will the SLAM (Online) module actually consume and use odometry as an additional input (e.g., as an initial guess for registration)?
      Or is the current implementation designed to rely solely on LiDAR-based pose estimation?

    3. Are there any additional parameters or configuration options (in the SLAM .xml or .json files) that must be enabled for SLAM to accept external odometry?

Expected outcome

Once integrated, I expect the SLAM trajectory in LidarView to become smoother and more stable, particularly during slow rotations or when re-localizing after loop closure, since odometry will provide a reliable motion prior alongside the point cloud data.
Environment

  • LidarView built from source (Superbuild)

  • Custom plugin derived from vtkPacketInterpreter

  • ROS 2 Humble

  • External odometry published as nav_msgs/Odometry

Hello,

Thank you for using LidarView and SLAM, and apologies for the late reply.

  1. What is the recommended way to trigger a function (like SetManualSlamConnect()) when a filter such as SLAM (Online) is created or selected from the GUI?
    Should this be done via XML hints, directly in the vtkSlam constructor, or through a manager class like pqLidarSlamManager?

We recently added an optional external sensor input to the SLAM filter in LidarView, which provides functionality very similar to your use case. Please refer to the related merge request for implementation details.

Furthermore, LidarView supports loading MCAP data, and ROS 2 subscribers have recently been added, allowing direct reception of ROS messages such as point clouds, IMU, odometry, and images.
You can find more detail in this repo: https://gitlab.kitware.com/LidarView/plugins/ros2-io

  2. If my external ROS 2 odometry data is successfully connected,
    will the SLAM (Online) module actually consume and use odometry as an additional input (e.g., as an initial guess for registration)?
    Or is the current implementation designed to rely solely on LiDAR-based pose estimation?
  3. Are there any additional parameters or configuration options (in the SLAM .xml or .json files) that must be enabled for SLAM to accept external odometry?

Yes, external odometry can be used as an initial motion estimate in SLAM.
To enable this, set the Ego-Motion Mode parameter to External or to External OR Motion Extrapolation. This allows SLAM to use your odometry as a motion prior for registration.

Our SLAM supports external sensors in multiple roles, including initial pose estimation, undistortion, local constraints, and pose graph optimization.

You can refer to the External Sensors section of the SLAM documentation for more details about how to use them and how to set parameters.

In theory, with our recent updates it is now possible to subscribe directly to an odometry topic inside LidarView and use it as an external SLAM input. The remaining step is aligning the ROS odometry message format with the SLAM internal format; this is currently under development.
Please stay tuned for upcoming updates.

I hope this helps, and please feel free to contact us if you need further support.