Institut für Mess- und Regelungstechnik (MRT)

Laser Based Environment Perception


Motivation
In contrast to stereo camera rigs, lidar scanners provide direct distance measurements with high accuracy. Precise measurements at ranges of up to 80 m allow earlier detection of traffic participants, better situation analysis, and easier path planning. Since free and occupied space is measured directly, lidar scanners were the sensor of choice at the latest DARPA Urban Challenge. On the downside, their angular resolution is usually much lower than that of stereo cameras.

We employ a Velodyne HDL-64E S2 sensor, depicted in the image below. The sensor is mounted on top of our autonomous vehicle 'Annieway', which took part in the DARPA Urban Challenge 2007. Specifically designed for automotive applications, it covers a field of view of 360° horizontally and 28° vertically. Example data can be downloaded here.

[Image: Velodyne HDL-64E S2 sensor mounted on the autonomous vehicle 'Annieway']

Our goal is to develop algorithms for the detection and tracking of all kinds of objects (pedestrians, cars, trees, buildings, ...). Such detailed environment perception enables safe autonomous driving as well as next-generation driver assistance systems.

Segmentation
Our current approach first takes the raw 3D measurements and generates a virtual 2D range image for faster processing.
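As an illustration of this step, the following sketch projects a raw point cloud onto such a range image via a spherical projection; the image resolution and the vertical field of view are assumed example values, not the parameters of our implementation.

import numpy as np

def points_to_range_image(points, width=870, height=64, v_fov=(-24.9, 2.0)):
    """Project 3D lidar points (N x 3, sensor frame) onto a 2D range image.
    width, height and v_fov (degrees) are example values, not our settings."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)                  # measured range per point
    azimuth = np.arctan2(y, x)                       # horizontal angle
    elevation = np.arcsin(z / np.maximum(r, 1e-6))   # vertical angle

    v_min, v_max = np.radians(v_fov[0]), np.radians(v_fov[1])
    col = ((azimuth + np.pi) / (2 * np.pi) * width).astype(int) % width
    row = np.clip(((v_max - elevation) / (v_max - v_min) * (height - 1)).astype(int),
                  0, height - 1)

    image = np.full((height, width), np.inf)
    np.minimum.at(image, (row, col), r)              # keep the closest return per pixel
    return image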

[Image: Virtual 2D range image computed from the raw 3D measurements]

It then proceeds by partitioning the image into several segments, which serve as object hypotheses. These can also be back-projected into 3D.
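One simple way to obtain such segments, sketched below, is region growing on the range image: neighbouring pixels are merged whenever their measured ranges are similar. The distance threshold is an assumed example value and our actual criterion may differ; back-projection into 3D then amounts to inverting the spherical projection for every labelled pixel.

import numpy as np
from collections import deque

def segment_range_image(image, max_range_diff=0.5):
    """Region growing on a range image: neighbouring pixels are merged into one
    segment if their ranges differ by less than max_range_diff (metres, assumed
    threshold). Returns a label image; 0 marks pixels without a valid return."""
    height, width = image.shape
    labels = np.zeros((height, width), dtype=int)
    next_label = 0
    for r0 in range(height):
        for c0 in range(width):
            if labels[r0, c0] or not np.isfinite(image[r0, c0]):
                continue
            next_label += 1
            labels[r0, c0] = next_label
            queue = deque([(r0, c0)])
            while queue:                               # breadth-first region growing
                r, c = queue.popleft()
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    rn, cn = r + dr, (c + dc) % width  # wrap around horizontally
                    if (0 <= rn < height and not labels[rn, cn]
                            and np.isfinite(image[rn, cn])
                            and abs(image[rn, cn] - image[r, c]) < max_range_diff):
                        labels[rn, cn] = next_label
                        queue.append((rn, cn))
    return labels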

[Image: Segmentation result and its back-projection into 3D]

Motion Estimation
Next, the motion of each object hypothesis is estimated and tracked over time.
Results and example data can be found here.
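As a minimal illustration of the tracking part, the following sketch follows the ground-plane centroid of one object hypothesis with a constant-velocity Kalman filter. The state layout, time step, and noise levels are assumptions chosen for the example and do not describe our actual motion estimation.

import numpy as np

class CentroidTracker:
    """Constant-velocity Kalman filter for the ground-plane centroid (x, y) of
    one object hypothesis. State is [x, y, vx, vy]; dt and the noise levels are
    assumed example values."""

    def __init__(self, centroid, dt=0.1, process_noise=1.0, meas_noise=0.2):
        self.x = np.array([centroid[0], centroid[1], 0.0, 0.0])
        self.P = np.diag([meas_noise, meas_noise, 10.0, 10.0])   # initial uncertainty
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)          # constant-velocity model
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)           # only position is observed
        self.Q = process_noise * np.eye(4)
        self.R = meas_noise * np.eye(2)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                                        # predicted position

    def update(self, centroid):
        y = np.asarray(centroid, dtype=float) - self.H @ self.x  # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)                 # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[2:]                                        # current velocity estimate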

[Image: Motion estimation and tracking results]


Simultaneous Localization and Mapping (SLAM)
The measurements can also be used to estimate the ego-vehicle's motion relative to the world and to build a detailed 3D map of the surroundings. Results and two datasets can be found here.
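The sketch below illustrates the basic principle under simplifying assumptions: consecutive scans are aligned with a point-to-point ICP, the resulting relative motions are chained into global poses, and each scan is transformed into the world frame to accumulate a map. A full SLAM system additionally needs keyframe selection, loop closure, and map management.

import numpy as np
from scipy.spatial import cKDTree

def rigid_fit(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (N x 3)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - mu_s).T @ (dst - mu_d))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                      # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, mu_d - R @ mu_s

def icp(source, target, iterations=20, max_dist=1.0):
    """Point-to-point ICP: rigid motion that aligns source with target."""
    tree = cKDTree(target)
    R_total, t_total = np.eye(3), np.zeros(3)
    src = source.copy()
    for _ in range(iterations):
        d, idx = tree.query(src, distance_upper_bound=max_dist)
        ok = np.isfinite(d)                       # points with a valid correspondence
        if ok.sum() < 3:
            break
        R, t = rigid_fit(src[ok], target[idx[ok]])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

def build_map(scans):
    """Chain pairwise scan alignments into global poses and accumulate a 3D map."""
    pose_R, pose_t = np.eye(3), np.zeros(3)
    world_map = [scans[0]]
    for prev, curr in zip(scans, scans[1:]):
        R, t = icp(curr, prev)                    # motion of curr expressed in prev's frame
        pose_R, pose_t = pose_R @ R, pose_R @ t + pose_t
        world_map.append(curr @ pose_R.T + pose_t)
    return np.vstack(world_map)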