Creating Custom SLAM Trajectory

Hello,

Is it possible to create a custom trajectory file and then apply this trajectory to a .pcap streamfile?

I have a drone-mounted VLP-16 scanning vegetation, but the LidarView SLAM algorithm does not work very well under these conditions (see the attached SLAM trajectory).

Since my flight trajectories are relatively simple and can be recorded by the drone, how can I create and upload my own trajectory file to “guide” the .pcap frames?

Thank you,
Connor

In other words, how can I integrate my drone's GPS/IMU trajectory data into LidarView? I know there are other discussion topics related to this question, and I have already referred to this part of the documentation about SLAM with external sensors, but the documentation does not explicitly state how to apply the external sensor information to the .pcap streamfile.

For reference, here is the GPS/IMU data that I am trying to use in the "External sensors data file" field of the SLAM properties. I have formatted it to have only the timestamp, x, y, z, roll, pitch, and yaw fields. However, when I run SLAM, the resulting trajectory is still incorrect.
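The first rows of the file look roughly like this (values illustrative, not copied from the actual flight):

```
time,x,y,z,roll,pitch,yaw
3.000,0.00,0.00,10.21,0.010,-0.020,1.571
3.100,0.12,0.05,10.22,0.011,-0.019,1.570
```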

FLY070-formatted.csv (91.1 KB)

Hi @baldpixels,

In order to be able to load an external trajectory, it needs to satisfy the three following requirements:

  • It shall have the correct formatting: yours seems correct.
  • It shall come with an external calibration file, so that the SLAM knows the transform between the poses and the LiDAR data => have you provided this file in your test? And do you know this calibration?
  • It shall be synchronized with the LiDAR time: in your CSV file the time array starts at 3 seconds. It is hard to say for sure without your pcap data, but LiDAR data is usually in POSIX time (now would be 1712045779). A minimal re-basing sketch follows this list.
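If your trajectory is recorded in relative time, one way to synchronize it is to shift every timestamp so the first pose lines up with the POSIX time of the first LiDAR frame. Here is a minimal sketch, assuming the column is named "time" and that both recordings start at the same moment (the start-time value is illustrative, to be read from your pcap):

```python
# Hypothetical sketch: re-base a relative-time trajectory CSV onto the
# LiDAR's POSIX clock. The column name "time" is an assumption.
import pandas as pd

traj = pd.read_csv("FLY070-formatted.csv")

# Assumed: POSIX timestamp of the first LiDAR frame, read from the pcap.
lidar_start_posix = 1712045779.0

# First pose <-> first LiDAR frame (assumes both recordings start together).
traj["time"] = traj["time"] - traj["time"].iloc[0] + lidar_start_posix
traj.to_csv("FLY070-synchronized.csv", index=False)
```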

When those conditions are satisfied, you may do the following to use it in the SLAM:

  • Create a SLAM filter.
  • Use the “External sensors” field and load a CSV file containing synchronized poses. Change the following parameters: “Undistortion: External”, “Motion extrapolation: External”, “pose weight = 100000000” (this last parameter should appear in a new section at the bottom of the “Properties” window when the CSV file is entered).
  • Launch the SLAM.

For refinement at the end: check “Use pose graph”. Check “external pose” in the pose graph settings section and click on “Optimize graph”.

Would it be possible to send your pcap file (even privately), so we can have a look and see whether the SLAM can be improved in that scenario without external poses?

You may also consider using the latest LidarView/SLAM.

Artifacts of master are available here

Latest release available here

Hope this helps,

Gatien

Hello Gatien,

Thank you for another helpful reply! Now I understand how to use my external sensor data as the trajectory.

As far as the time field goes, I have been using relative time because my LiDAR sensor does not seem to be recording accurate POSIX time. Regarding calibration: I read that the calibration file is not necessary, so I have not yet tried to incorporate this feature.

I would be happy to share my .pcap files (I have many that I could share!)

I did have one additional question: since my IMU/GPS records latitude and longitude rather than x and y coordinates, what would be the process for converting from latitude/longitude to x and y coordinates? The z coordinate is easy enough, as the IMU/GPS records relative altitude, but I’m stuck on x and y!

Please excuse my novice questions… I am just a college student working on a research project!

Thank you,
Connor

UPDATE: Another thought I've had… since the IMU/GPS records compass heading, as well as x and y velocities relative to this heading, I believe I can iteratively calculate the x and y positions using simple trigonometry. A rough sketch of what I mean is below.
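Something along these lines, assuming the heading is in degrees clockwise from north and the velocities are in the body frame (the field names here are made up, not my actual log format):

```python
# Rough dead-reckoning sketch: integrate heading-relative velocities
# into east/north (x/y) positions with simple trigonometry.
import math

def integrate_positions(records):
    """records: dicts with 't' (s), 'heading' (deg, clockwise from north),
    'v_forward' and 'v_right' (m/s, relative to the heading)."""
    x, y = 0.0, 0.0
    positions = [(x, y)]
    for prev, cur in zip(records, records[1:]):
        dt = cur["t"] - prev["t"]
        h = math.radians(prev["heading"])
        vx = prev["v_forward"] * math.sin(h) + prev["v_right"] * math.cos(h)  # east
        vy = prev["v_forward"] * math.cos(h) - prev["v_right"] * math.sin(h)  # north
        x += vx * dt
        y += vy * dt
        positions.append((x, y))
    return positions

# Quick check: heading 90 deg = due east, so x should grow and y stay ~0.
recs = [{"t": t, "heading": 90.0, "v_forward": 2.0, "v_right": 0.0} for t in (0.0, 1.0, 2.0)]
print(integrate_positions(recs))  # ~[(0, 0), (2, 0), (4, 0)]
```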

Hi Connor,

I had forgotten to reply here previously, sorry about that. No need to create another parallel post for the same topic, though.

Getting back to my previous answer:

  • You need to give a calibration matrix to transform the poses into the LiDAR frame (otherwise you are feeding the algorithm motion information that points in the wrong direction). A sketch of building such a matrix follows this list.
  • You need to have the data synchronized, or the algorithm cannot guess which value to take for each LiDAR frame (and each point within the frame).
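As an illustration only (this assumes the calibration is the rigid GPS/IMU-to-LiDAR transform expressed as a 4x4 homogeneous matrix; the mounting values are hypothetical):

```python
# Sketch: assemble a 4x4 homogeneous GPS/IMU -> LiDAR transform from
# Euler angles and a lever-arm translation.
import numpy as np
from scipy.spatial.transform import Rotation

def calibration_matrix(roll, pitch, yaw, tx, ty, tz):
    """Angles in degrees, translation in meters."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler("xyz", [roll, pitch, yaw], degrees=True).as_matrix()
    T[:3, 3] = [tx, ty, tz]
    return T

# Example: LiDAR mounted 10 cm below the IMU, axes aligned.
print(calibration_matrix(0, 0, 0, 0, 0, -0.10))
```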

Responding to your other post:

I have confirmed that my external sensor .csv data is correct and properly formatted (see attached). What is strange to me is that each time I run SLAM with the same external sensor data, the trajectory turns out differently! This implies to me that my external sensor data is not determining the trajectory as it should.

Which playback speed setting did you use in LidarView? By default it is set to ×1, which emulates real time and can lead to a non-deterministic outcome (some frames are skipped if the processing takes longer than real time, and which ones are skipped depends on how your PC schedules the process).
=> You should use the "All Frames" setting to make sure all frames are played.

Everything considered, my professor and I have become rather frustrated with what seems like a straightforward task: using GPS/IMU data from a drone to aggregate LiDAR frames. We have the position and orientation data, so why is it so difficult to integrate this data into LidarView? It seems like a serious product flaw…

Without calibration and time synchronization, this data is not usable. We have some tools in LidarView to calibrate the GPS with the LiDAR, using both the LiDAR-estimated trajectory and the GPS data, but this needs to be done in conditions where the LiDAR SLAM alone is sufficient. This is available here.
Some explanation here

At this point, I’m not sure we would recommend Velodyne products to other researchers trying to capture aerial LiDAR data, but hopefully this problem can still be resolved.

Drawing conclusions about Velodyne products from the results you get with external software to which you have given incomplete data seems hasty to me…

Regarding the transform of latitude/longitude into x/y coordinates:
the reference library is libproj.
This is handled in LidarView in this part, when GPS data is provided along with LiDAR data in pcap files. See the sketch below.
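A minimal sketch with pyproj (the Python bindings for PROJ/libproj); the target UTM zone here (EPSG:32617) is only an example, pick the zone that covers your flight area:

```python
# Project WGS84 latitude/longitude to metric UTM x/y coordinates.
from pyproj import Transformer

# WGS84 lat/lon -> UTM zone 17N (EPSG:32617), chosen as an example.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:32617", always_xy=True)

lon, lat = -80.54, 43.47
x, y = transformer.transform(lon, lat)
print(x, y)  # easting, northing in meters
```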