The Right (Angled) Perspective: Improving the Understanding of Road Scenes Using Boosted Inverse Perspective Mapping

Tom Bruls1, Horia Porav1, Lars Kunze1, Paul Newman2

  • 1University of Oxford
  • 2University of Oxford

Details

09:46 - 09:57 | Mon 10 Jun | Berlioz Auditorium | MoAM2_Oral.2

Session: Vision Sensing and Perception

09:46 - 09:57 | Mon 10 Jun | Room 4 | MoAM2_Oral.2

Session: Poster 1: (Orals) AV + Vision

Abstract

Many tasks performed by autonomous vehicles, such as road marking detection, object tracking, and path planning, are simpler in bird's-eye view. Hence, Inverse Perspective Mapping (IPM) is often applied to remove the perspective effect from a vehicle's front-facing camera and to remap its images into a 2D domain, resulting in a top-down view. Unfortunately, this leads to unnatural blurring and stretching of objects at greater distances, due to the resolution of the camera, limiting applicability. In this paper, we present an adversarial learning approach for generating a significantly improved IPM from a single camera image in real time. The generated bird's-eye-view images contain sharper features (e.g. road markings) and a more homogeneous illumination, while (dynamic) objects are automatically removed from the scene, thus revealing the underlying road layout in an improved fashion. We demonstrate our framework using real-world data from the Oxford RobotCar Dataset and show that scene understanding tasks directly benefit from our boosted IPM approach.
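For context, the classical IPM that this work improves upon is a planar homography: assuming the road is flat, four ground-plane point correspondences between the front-facing image and a top-down grid determine a 3x3 matrix that remaps pixels into bird's-eye view. The sketch below is illustrative only; the point coordinates are assumed values, not calibration data from the paper, and a pure-NumPy direct linear transform (DLT) stands in for a camera-calibrated homography.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 homography H (via DLT) mapping each
    src point (x, y) to its dst point (u, v), for 4 correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null-space vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def warp_point(H, pt):
    """Apply homography H to a 2D point (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Assumed example: pixel positions of four road-plane points in the
# front-facing image (a trapezoid, due to perspective) ...
src = [(300, 700), (980, 700), (560, 420), (720, 420)]
# ... and their desired positions in the top-down (bird's-eye) grid.
dst = [(200, 900), (440, 900), (200, 100), (440, 100)]

H = fit_homography(src, dst)
```

Warping every image pixel through `H` (e.g. with `cv2.warpPerspective`) yields the plain IPM view; pixels far from the camera spread over many output cells, which is exactly the blurring and stretching the learned approach addresses.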