Deep-LK for Efficient Adaptive Object Tracking

Chaoyang Wang1, Hamed Kiani Galoogahi2, Chen-Hsuan Lin1, Simon Lucey1

  • 1Carnegie Mellon University
  • 2CMU

Details

10:30 - 13:00 | Tue 22 May | podL

Session: Visual Tracking 1

Abstract

In this paper we present a new approach for efficient regression-based object tracking, which we refer to as Deep-LK. Our approach is closely related to the Generic Object Tracking Using Regression Networks (GOTURN) framework of Held et al. We make the following contributions. First, we demonstrate that there is a theoretical relationship between Siamese regression networks such as GOTURN and the classical Inverse-Compositional Lucas & Kanade (IC-LK) algorithm. Further, we demonstrate that, unlike GOTURN, IC-LK adapts its regressor to the appearance of the currently tracked frame. We argue that GOTURN's poor performance on unseen objects and/or viewpoints can be attributed to this missing property. Second, we propose a novel framework for object tracking, which we refer to as Deep-LK, that is inspired by the IC-LK framework. Finally, we show impressive results demonstrating that Deep-LK substantially outperforms GOTURN. Additionally, we demonstrate tracking performance comparable to current state-of-the-art deep trackers on high-frame-rate sequences, while being an order of magnitude more computationally efficient (i.e. running at 100 FPS).
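To make the adaptivity claim concrete, below is a minimal NumPy/SciPy sketch of the classical IC-LK update for a translation-only warp, following Baker & Matthews' formulation. The point of interest is that the linear regressor R = (JᵀJ)⁻¹Jᵀ is derived from the template's own image gradients, so using the currently tracked frame as the template re-derives the regressor on the fly; this is a generic illustration of IC-LK, not the paper's Deep-LK implementation, and the function and variable names (e.g. `ic_lk_translation`) are illustrative assumptions, not from the authors' code.

```python
# Minimal IC-LK sketch for a pure 2-D translation warp W(x; p) = x + p.
import numpy as np
from scipy.ndimage import map_coordinates

def ic_lk_translation(template, image, p=None, n_iters=20, tol=1e-3):
    """Estimate a 2-D translation p aligning `image` to `template` via IC-LK."""
    template = np.asarray(template, dtype=float)
    image = np.asarray(image, dtype=float)
    p = np.zeros(2) if p is None else np.asarray(p, dtype=float)

    # Precompute steepest-descent images from the TEMPLATE gradients.
    # For a pure translation, dW/dp is the identity, so J = [Tx, Ty].
    gy, gx = np.gradient(template)
    J = np.stack([gx.ravel(), gy.ravel()], axis=1)   # (N, 2)

    # The linear regressor R = (J^T J)^{-1} J^T depends only on template
    # appearance -- this is the part that "adapts" when the template is
    # re-set to the currently tracked frame, unlike a fixed learned regressor.
    R = np.linalg.solve(J.T @ J, J.T)                # (2, N)

    ys, xs = np.mgrid[0:template.shape[0], 0:template.shape[1]]
    for _ in range(n_iters):
        # Warp the image into template coordinates and form the error image.
        warped = map_coordinates(image, [ys + p[1], xs + p[0]], order=1)
        error = (warped - template).ravel()          # I(W(x;p)) - T(x)
        dp = R @ error
        # Inverse-compositional update: compose with the inverted increment,
        # which for translation reduces to p <- p - dp.
        p = p - dp
        if np.linalg.norm(dp) < tol:
            break
    return p
```

In a tracking loop, one would call this once per frame with the previous frame's tracked patch as `template`, so that R is refreshed from current appearance at every step; Deep-LK's contribution is to apply this adaptive structure on learned deep features rather than raw pixels.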