Distortions Due to Ego Motion
Introduction
What is Ego?
“Ego” refers to the sensor (or the vehicle carrying it) itself. “Distortions due to self motion” would be an alternative title.
What is a distortion?
A distortion is when an object or scene in the point cloud does not match its real-world dimensions or shape. This is different from ghost points (false positives) or missing points (false negatives).
Why does this happen?
Ouster lidars share many similarities with 2D cameras. A lidar uses time-of-flight detectors to capture a 3D point cloud, while a 2D camera uses a CMOS chip to capture a 2D image.
Lidar distortions are analogous to motion blur in a 2D camera, which is governed by shutter speed. Below is an example of motion blur caused by a slow shutter speed:
In both devices, data for a full frame is collected over a set period of time (one rotation, in the case of a spinning Ouster lidar). For Ouster sensors, the default frame rate is 10 Hz, so each frame takes 1/10th of a second to capture.
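To get a feel for the magnitude, here is a back-of-envelope sketch. The vehicle speed is an assumed example value; the 10 Hz frame rate is the Ouster default mentioned above:

```python
# How far does the scene shift relative to the sensor during one frame?
frame_rate_hz = 10        # default Ouster frame rate
ego_speed_mps = 20.0      # assumed vehicle speed (~72 km/h)

frame_time_s = 1.0 / frame_rate_hz
shift_per_frame_m = ego_speed_mps * frame_time_s
print(f"Worst-case shift across one frame: {shift_per_frame_m:.1f} m")
# Points captured at the start and end of the rotation can disagree
# by up to this much, which is what appears as distortion.
```

At highway speeds, the skew across a single 10 Hz frame is on the order of meters, which is why distortion is very visible on moving platforms.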
[insert a screenshot of lidar distortion with frames overlapped; maybe from the sample data?]
Mitigating Distortions
Physical
Increase Rotation Speed
Ouster provides several lidar modes to balance customer resolution requirements against data throughput limitations. They are formatted as [measurement blocks per rotation]x[rotation rate in Hz]. If you are running 1024x10 and your system can handle double the data rate, try 1024x20: each frame is captured in half the time, which halves the motion skew within a frame.
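The trade-off above can be sketched numerically. The ego speed here is an assumed example value, and the helper function is hypothetical (not part of any Ouster API):

```python
# Doubling the rotation rate halves the per-frame capture time, and with it
# the worst-case motion skew within a frame (at the cost of double the
# data throughput).
def frame_skew_m(speed_mps, rate_hz):
    """Worst-case ego displacement accumulated over one rotation."""
    return speed_mps / rate_hz

speed = 15.0  # assumed ego speed in m/s
for mode, rate in [("1024x10", 10), ("1024x20", 20)]:
    print(f"{mode}: up to {frame_skew_m(speed, rate):.2f} m of skew per frame")
```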
Software
Synchronized Motion Data
If you know the motion the sensor was undergoing during recording, you can use it to dewarp (motion-compensate) the point cloud.
MathWorks has an article on motion compensation using GNSS and IMU data: https://www.mathworks.com/help/lidar/ug/motion-compensation-in-lidar-point-cloud.html
[insert link to Angus’s post]
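As a minimal sketch of the idea: if each point carries a timestamp and you assume the ego velocity is constant over the frame (ignoring rotation, which a real pipeline like the MathWorks one also compensates), every point can be shifted to where it would have been observed at a single reference time. The function name and inputs below are illustrative, not from any particular library:

```python
import numpy as np

def dewarp_constant_velocity(points, timestamps, velocity, t_ref):
    """Shift each point to its apparent position at time t_ref.

    points:     (N, 3) points in the sensor frame.
    timestamps: (N,) capture time of each point, in seconds.
    velocity:   (3,) ego velocity in m/s, assumed constant over the frame.
    t_ref:      reference time to dewarp to (e.g. end of frame).
    """
    dt = (t_ref - timestamps)[:, None]       # (N, 1) time remaining to t_ref
    # The ego moves velocity*dt between capture and t_ref, so a static
    # world point appears that much closer (subtract the ego translation).
    return points - velocity[None, :] * dt

# Toy usage: sensor driving forward at 10 m/s along x, 0.1 s frame.
# The same static object is seen 5 m ahead at t=0 and 4 m ahead at t=0.1.
pts = np.array([[5.0, 0.0, 0.0], [4.0, 0.0, 0.0]])
ts = np.array([0.0, 0.1])
v = np.array([10.0, 0.0, 0.0])
print(dewarp_constant_velocity(pts, ts, v, t_ref=0.1))
# Both observations collapse onto the same position, removing the smear.
```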