Vision-Based Localization Using a Monocular Camera in the Rain

Makoto Yamada1, Tomoya Sato1, Hiroyuki Chishiro1, Shinpei Kato2

  • 1The University of Tokyo
  • 2The University of Tokyo

Details

12:15 - 12:30 | Mon 28 Oct | The Great Room IV | MoD-T2.2

Session: Regular Session on Perception under Adverse Weather Conditions (II)

Abstract

With the development of autonomous driving, simultaneous localization and mapping (SLAM) is becoming increasingly important. ORB-SLAM is a vision-based SLAM system that uses the ORB feature, which is computed from light-intensity information. However, its robustness is undermined by adverse weather conditions, e.g., rainfall, which causes light scattering. This paper presents a complementary method for the ORB feature that reduces the impact of rain on localization performance. As pre-processing, the system removes rain streaks from input images based on Fu et al.'s method. To remove rain in real time within the SLAM system, the presented method speeds up rain removal by adjusting the number of layers, and executes rain removal and tracking in parallel. Experimental results on Richter's datasets show that the ORB feature, augmented by the presented pre-processing, improves the localization performance of mobile robots and autonomous vehicles in the rain.
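The parallel structure described in the abstract (deraining one frame while the tracker processes the previous one) can be sketched as below. This is a minimal illustration, not the authors' implementation: `remove_rain`, `track_frame`, and the `num_layers` parameter are hypothetical stand-ins for Fu et al.'s rain-streak removal and ORB-SLAM tracking.

```python
import threading
import queue

def remove_rain(frame, num_layers=2):
    # Placeholder for Fu et al.-style rain-streak removal (hypothetical).
    # In the paper, reducing the number of decomposition layers trades
    # some removal quality for real-time speed.
    return frame

def track_frame(frame):
    # Placeholder for ORB-SLAM tracking on the derained frame (hypothetical).
    return {"frame": frame, "pose": None}

def run_pipeline(frames):
    # Rain removal and tracking run in parallel: while the tracker
    # processes frame t, the deraining thread works on frame t+1.
    derained = queue.Queue(maxsize=1)
    results = []

    def derain_worker():
        for f in frames:
            derained.put(remove_rain(f))
        derained.put(None)  # sentinel: no more frames

    worker = threading.Thread(target=derain_worker)
    worker.start()
    while True:
        f = derained.get()
        if f is None:
            break
        results.append(track_frame(f))
    worker.join()
    return results
```

The bounded queue (`maxsize=1`) keeps the two stages only one frame apart, which mirrors the low-latency requirement of running deraining inside a SLAM front end.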