Use of external trajectory, LAS export, batch mode

Hi,

I’ve been doing some testing with SLAM inside LidarView, using data acquired with a Hesai Pandar XT32 and a GNSS/INS positioning unit (IMU) from which we can obtain a post-processed tightly-coupled trajectory. First of all, I want to say that I’m extremely impressed with the results in a lidar-only test case: the registration of overlapping drivelines is practically seamless, and even longer stretches of straight road with little geometry cause little XY deviation.

I have however run into some issues that I would like to ask help with:

  1. I have so far been unable to integrate the post-processed GNSS/INS trajectory. I have converted the trajectory to a CSV that looks like this:
Time,X,Y,Z,Rz,Ry,Rx
1744611187.003185,180220.72746252423,448140.8107820237,23.98793640492724,3.092427497477753,-0.0251676478137583,-0.0131597825600371
1744611187.008185,180220.7273933219,448140.8108929326,23.987936405969265,3.09244495075656,-0.0251676478137583,-0.0131772358525572
1744611187.013185,180220.72746196107,448140.8108932798,23.98793640573993,3.092444950770452,-0.0251676478137583,-0.0131772358525572

I’ve also created the calibration matrix file. Adding the trajectory as described on the LidarView SLAM manual page does appear to work and only generates a warning stating that the calibration matrix has been read. But when I then run the SLAM processing, it does not seem like the trajectory gets used, and all output remains close to the origin. I also tried just setting an initial pose XYZ that is close to the starting position in our national grid, but once again the resulting point cloud stays close to 0/0/0.

Would it be possible to get a slightly more detailed description of how to integrate an external trajectory? What is the exact file format required? Should the calibration file contain the position/orientation of the lidar scanner in the IMU body frame, or the IMU position/orientation in the scanner frame? What are then the required settings and processing steps to get an improved trajectory and point cloud in the IMU trajectory’s coordinate system?

  2. There are some issues with the LAS export. The timestamp is not written to a point’s timestamp field, but added as an extra custom field. There also seems to be a precision issue (the Hesai uses Unix timestamps) that causes the timestamps to come out in large discrete steps instead of with the desired millisecond precision (see the small check after this list). The “write timestamps as file-series” option also generates multiple files with mostly identical content, possibly because of this precision issue. Inside LidarView, the timestamps seem to be OK.

  3. Is it possible to do SLAM processing in batch? The bin folder contains an lvbatch.exe and an lvpython.exe, but no information on how to use them.
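
As a quick illustration of the precision point in item 2 (just my guess about where the rounding could come from, not a claim about the writer code): if the Unix timestamps pass through a single-precision float anywhere in the export, millisecond resolution cannot survive.

import numpy as np

t = 1744611187.003185  # Unix timestamp from the Pandar XT32, in seconds

# Double precision keeps sub-millisecond resolution at this magnitude...
print(np.float64(t))              # 1744611187.003185

# ...but single precision cannot: adjacent float32 values near 1.7e9 are
# 2**(30 - 23) = 128 seconds apart, which would explain the large discrete steps.
print(np.float32(t))              # rounds to 1744611200
print(np.spacing(np.float32(t)))  # 128.0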

I’ve been using a recent nightly build (5.1.0-33-gb21aecfa), as this appeared to resolve the issue with importing Hesai calibration files described in another topic.

Any help would be greatly appreciated!

Toby

Hi @Toby,

Please note that I’m not the SLAM expert (she is currently on vacation), so I will share my understanding, but I could be mistaken on some points.

It seems that the header for the XYZRPY format is Time, X, Y, Z, Rx(Roll), Ry(Pitch), Rz(Yaw) (from here). You can also add the IMU data in the same file with the columns acc_x, acc_y, acc_z, w_x, w_y, w_z.
I think the calibration file should contain the position/orientation of the lidar scanner in the IMU body frame, and then the calibration_external_sensor.mat next to it would convert it back to the lidar frame.
To convert the local trajectory to GNSS coordinates you can use the Align to external sensor reference button (I think it should be clicked after the SLAM process is complete, as doing so earlier could introduce numerical inaccuracies).
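
If it helps, here is a minimal sketch of that column reordering in Python, assuming the CSV layout from your first post; please take the exact header names from the documentation linked above, as I may not have them letter-perfect here:

import pandas as pd

# Reorder Time,X,Y,Z,Rz,Ry,Rx into the documented XYZRPY order
# Time, X, Y, Z, Rx (roll), Ry (pitch), Rz (yaw); angles stay in radians.
# Optional IMU columns (acc_x, acc_y, acc_z, w_x, w_y, w_z) can be appended.
df = pd.read_csv("trajectory_original.csv")        # hypothetical input file
df = df[["Time", "X", "Y", "Z", "Rx", "Ry", "Rz"]]
df.to_csv("trajectory_xyzrpy.csv", index=False)    # hypothetical output file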

Yes, looking at the LAS specs / writer code, I see what you mean. I will try to fix that this week. (Note that I don’t think you need the “write timestamps as file-series” option, as it is used to write multiple LAS files, one for each frame of the lidar.)

What do you mean exactly? Could you provide more detail on what you are trying to achieve?

Note that we are currently working on improving the integration of external sensors in LidarView and on providing a better user experience for manipulating them and loading them into the SLAM. So if you have any suggestions, they are welcome!

Hope this helps!
Timothée

Hi Timothée,

Thanks for your reply! I’ve modified the trajectory file format and I see that the point cloud is indeed relocated after clicking Align to external sensor reference and then re-running through all frames. I’ll try out a few options for the calibration matrix.

Regarding the batch processing, ideally it would be possible to run the processing from the command line in a non-interactive way to allow for automation. It also looks like (at least in the test that I did) processing becomes slow with large input files.

Toby

I’ve done some further testing. I’ve found one way to generate results that appear to use the GNSS/INS trajectory and provide output in the trajectory’s coordinate system: I applied the LiDAR lever arm and orientation directly during PCAP import. It is a bit unclear here whether the transformation from the base to the lidar frame or from the lidar to the base frame is required, but it seems to be lidar to base (see the sketch after this paragraph). The coordinate system definition is also not entirely clear. The trajectory appears to be left-handed with x=forward (roll axis), y=right (pitch axis) and z=up (yaw axis). But both the display in LidarView and the way the rotation angles affect the lidar frames imply a right-handed system with x=left (pitch axis), y=forward (roll axis), and z=up (yaw axis).
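
For reference, this is how I am thinking about the two directions; it is just generic rigid-transform math with placeholder lever-arm and boresight values, not anything specific to LidarView:

import numpy as np
from scipy.spatial.transform import Rotation as R

# Placeholder lever arm (m) and boresight angles (deg) of the lidar in the
# IMU/body frame -- the values here are hypothetical.
lever_arm = np.array([0.10, 0.00, 0.25])
boresight_rpy_deg = [0.0, 0.0, 90.0]

# T_base_lidar maps points from the lidar frame into the body (IMU) frame:
#   p_base = R @ p_lidar + t
T_base_lidar = np.eye(4)
T_base_lidar[:3, :3] = R.from_euler("xyz", boresight_rpy_deg, degrees=True).as_matrix()
T_base_lidar[:3, 3] = lever_arm

# The opposite convention (body -> lidar) is simply the inverse.
T_lidar_base = np.linalg.inv(T_base_lidar)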

After instantiating a SLAM filter, I added the trajectory through the external sensor dialog, using the identity matrix as the calibration matrix (as the lidar-to-base transformation has already been dealt with) and External OR motion extrapolation as the Ego-Motion mode.

If I then click Align to external sensor reference and run through all frames, the external trajectory appears to be taken into account. I’m still seeing differences in height.

Some things that I noticed:

  1. I tried to run SLAM first without the external trajectory, then loaded it and clicked Align to external sensor reference. This does not appear to do anything.
  2. Likewise for the Calibrate button under External sensors. It is suggested that this should be able to compute the calibration if both a SLAM trajectory and an external trajectory are present, but nothing seems to happen.
  3. The documentation states that it is also possible to use an external trajectory (GNSS only or with orientations) for Pose graph optimization. I’ve tried, but clicking Optimize graph does not appear to do anything.

Hi,

Is the SLAM expert back from vacation? If yes, could she take a look at my questions? This would be much appreciated. Thanks!

Toby

Hi Toby,

Sorry for the delayed reply.

When you have an external trajectory — whether from an IMU, INS, or another LiDAR SLAM — you can use it to assist SLAM in several ways:

  1. Use external poses for ego-motion estimation.
    Based on the current timestamps, SLAM will find the synchronized pose by interpolating the external trajectory (a small interpolation sketch follows after this list). To enable this, set the Ego-Motion mode to “External” or “External OR motion extrapolation.”

  2. Use external poses in local optimization.
    SLAM estimates the pose by optimizing the distance between the features in the current frame and the map. If an external trajectory is available, it can also be included in this optimization.
    To enable it, set Pose weight to a non-zero value to indicate how much you trust the external poses.

  3. Use external poses for undistortion.
    External poses can be used to undistort frames, especially when the external sensor (e.g., an IMU) has a higher frequency than the LiDAR. To enable this, set the Undistortion mode to “External.”

  4. Use external poses in pose graph optimization.
    To do so, you can then click “Optimize graph” whenever needed.
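
To make point 1 a bit more concrete, the synchronization is conceptually an interpolation of the external trajectory at the LiDAR timestamp, roughly like the sketch below (illustrative only, not the actual SLAM code):

import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_pose(traj_times, traj_xyz, traj_rpy, query_time):
    """Interpolate an external trajectory at a LiDAR timestamp.

    traj_times: (N,) sorted timestamps in seconds
    traj_xyz:   (N, 3) positions
    traj_rpy:   (N, 3) roll/pitch/yaw in radians
    Returns (position, scipy Rotation) at query_time.
    """
    # Linear interpolation of the position, one axis at a time.
    xyz = np.array([np.interp(query_time, traj_times, traj_xyz[:, i])
                    for i in range(3)])

    # Spherical linear interpolation (slerp) of the orientation.
    rotations = Rotation.from_euler("xyz", traj_rpy)
    rot = Slerp(traj_times, rotations)(query_time)
    return xyz, rot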

SLAM uses the right-handed coordinate convention with x → roll, y → pitch, and z → yaw.
BaseToLidar is set to identity by default, meaning the SLAM odometry is represented in the LiDAR frame. The external poses used in SLAM should be aligned with the ENU (East–North–Up) coordinate system.

In my experience, the x, y, z coordinates from an INS are often in the ENU frame (east → x, north → y, up → z), while the roll, pitch, yaw from an IMU are aligned with the NED frame (north → x, east → y, down → z).
Additionally, the yaw value provided by the INS often represents a heading — an absolute angle (0° → north, 90° → east, etc.).
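
If your INS follows that common convention, the attitude conversion could look like the sketch below. It assumes intrinsic ZYX roll/pitch/yaw in NED with a forward-right-down (FRD) body frame, converted to ENU with a forward-left-up (FLU) body frame; please verify against your INS documentation, since conventions vary between manufacturers:

import numpy as np
from scipy.spatial.transform import Rotation as R

# Fixed change-of-basis matrices: NED world -> ENU world,
# and FLU body -> FRD body (a 180 degree rotation about x).
C_NED_TO_ENU = np.array([[0.0, 1.0, 0.0],
                         [1.0, 0.0, 0.0],
                         [0.0, 0.0, -1.0]])
C_FLU_TO_FRD = np.diag([1.0, -1.0, -1.0])

def ned_rpy_to_enu_rpy(roll, pitch, yaw):
    """Re-express an NED/FRD attitude (intrinsic ZYX, radians) in ENU/FLU.

    Here 'yaw' is the heading (0 = north, pi/2 = east)."""
    r_ned_frd = R.from_euler("ZYX", [yaw, pitch, roll]).as_matrix()
    r_enu_flu = C_NED_TO_ENU @ r_ned_frd @ C_FLU_TO_FRD
    yaw_e, pitch_e, roll_e = R.from_matrix(r_enu_flu).as_euler("ZYX")
    # For this particular pair of conventions the result reduces to
    # (roll, -pitch, pi/2 - yaw).
    return roll_e, pitch_e, yaw_e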

“I tried to run SLAM first without the external trajectory, then loading it and clicking Align to external sensor reference. This does not appear to do anything.”

That’s the recommended way to align the SLAM trajectory with the external poses. We prefer to let SLAM work locally first and then apply the GPS offset afterward.
I haven’t been able to reproduce this behavior — if you can share part of your pcap and INS trajectory, I can take a closer look.

“Likewise for the Calibrate button under External sensors. It is suggested that this should be able to compute the calibration if both a SLAM trajectory and an external trajectory are present, but nothing seems to happen.”

The Calibrate button computes the INS calibration. The calibration matrix is stored in memory and does not affect visualization directly.
However, accurate calibration is important — otherwise, using the INS can actually degrade SLAM results.
Calibration precision depends on the trajectory quality. You should select a portion of the SLAM trajectory with sufficient rotation and non-planar motion so that the optimizer has enough information to estimate the 6-DOF transformation.

Do you work in a Windows or Linux environment?
It would be helpful to check the SLAM log. If you’re on Linux, you can set the verbosity level to 3 for more detailed output. The log can provide more information about why the pose graph optimization didn’t work, show the estimated calibration, and include other useful details.

Best regards,

Tong