People Tracking by Cooperative Fusion of RADAR and Camera Sensors

Martin Dimitrievski1, Lennert Jacobs2, Peter Veelaert3, Wilfried Philips4

  • 1IMEC - IPI - Ghent University
  • 2IMEC - Ghent University
  • 3Ghent University
  • 4Ghent University IMinds

Details

14:00 - 14:15 | Mon 28 Oct | The Great Room I | MoE-T1.1

Session: Regular Session on Vulnerable Road Users Perception (I)

Abstract

Accurate 3D tracking of objects from a monocular camera is challenging due to the loss of depth information during projection. Although RADAR ranging has proven effective in highway environments, people tracking remains beyond the capability of single-sensor systems. In this paper, we propose a cooperative RADAR-camera fusion method for tracking people on the ground plane. Using the average person height, a joint detection likelihood is calculated by back-projecting detections from the camera onto the RADAR range-azimuth data. Peaks in the joint likelihood, representing candidate targets, are fed into a particle filter tracker. Depending on the association outcome, particles are updated either using the associated detections or by sampling the raw likelihood itself. The system therefore operates in Tracking-by-Detection mode when a detection is associated and in Tracking-Before-Detection mode otherwise. Using the raw likelihood data has the advantage that lost targets continue to be tracked even when the camera or RADAR signal does not reach the detection threshold. We show that in single-target, uncluttered environments, the proposed method consistently outperforms camera-only tracking. Experiments in a real-world urban environment further confirm that the cooperative fusion tracker produces significantly better estimates, even in difficult and ambiguous situations.
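The two update modes described in the abstract can be sketched as a single particle filter measurement step: weight particles around an associated detection (Tracking by Detection), or, when association fails, weight them directly by the raw joint RADAR-camera likelihood (Tracking Before Detection). This is a minimal illustrative sketch, not the authors' implementation; the function names, the Gaussian measurement model, and the 0.5 m measurement sigma are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_update(particles, weights, detection, likelihood_fn):
    """One measurement update of a ground-plane particle filter.

    particles     : (N, 2) array of hypothesized (x, y) positions.
    weights       : (N,) normalized particle weights.
    detection     : (2,) associated detection, or None if unassociated.
    likelihood_fn : callable mapping (N, 2) positions to raw joint
                    RADAR-camera likelihood values (used when no
                    detection is associated).
    """
    if detection is not None:
        # Tracking by Detection: Gaussian weight around the detection
        # (sigma = 0.5 m on the ground plane, an assumed value).
        d = particles - detection
        w = np.exp(-0.5 * np.sum(d**2, axis=1) / 0.25)
    else:
        # Tracking Before Detection: sample the raw likelihood itself,
        # with no detection threshold applied.
        w = likelihood_fn(particles)
    weights = weights * w
    s = weights.sum()
    if s == 0.0:
        # Degenerate case: all weights vanished; reset to uniform.
        return np.full(len(particles), 1.0 / len(particles))
    return weights / s

def resample(particles, weights):
    """Systematic resampling to combat particle degeneracy."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    idx = np.clip(idx, 0, n - 1)  # guard against float round-off
    return particles[idx], np.full(n, 1.0 / n)
```

In use, each frame would propagate the particles through a motion model, call `pf_update` with either the associated detection or the raw likelihood map, and resample when the effective sample size drops; the posterior mean of the particles gives the ground-plane position estimate.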