Learning Motion Predictors for Smart Wheelchair Using Autoregressive Sparse Gaussian Process

Zicong Fan1, Lili Meng1, Tian Qi Chen2, Jingchun Li1, Ian Mitchell1

  • 1University of British Columbia
  • 2University of Toronto

Details

10:30 - 13:00 | Tue 22 May | podM

Session: Mobile Robots

Abstract

Constructing a smart wheelchair on a commercially available powered wheelchair (PWC) platform avoids a host of seating, mechanical design, and reliability issues, but requires methods of predicting and controlling the motion of a device never intended for robotics. Analog joystick inputs are subject to black-box transformations that may produce intuitive and adaptable motion control for human operators, but complicate robotic control approaches; furthermore, installation of standard axle-mounted odometers on a commercial PWC is difficult. In this work, we present an integrated hardware and software system for predicting the motion of a commercial PWC platform that does not require any physical or electronic modification of the chair beyond plugging into an industry-standard auxiliary input port. This system uses an RGB-D camera and an Arduino interface board to capture motion data, including visual odometry and joystick signals, via ROS communication. Future motion is predicted using an autoregressive sparse Gaussian process model. We evaluate the proposed system on real-world short-term path prediction experiments. Experimental results demonstrate the system's efficacy when compared to a baseline neural network model.
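
The sketch below is a minimal, hedged illustration of the kind of autoregressive sparse Gaussian process predictor the abstract describes: a short history of joystick commands and visual-odometry velocities is regressed onto the next body velocity with a sparse GP, and multi-step prediction feeds the model's own outputs back in. It is not the authors' implementation; the history length `H`, the number of inducing points `M`, the synthetic data arrays `u` and `v`, and the choice of GPflow's `SGPR` model are all illustrative assumptions.

```python
"""Hedged sketch: autoregressive sparse GP motion predictor (not the authors' code)."""
import numpy as np
import gpflow

H = 5   # history length in time steps (assumed)
M = 50  # number of inducing points for the sparse GP (assumed)

# Assumed logged data: joystick commands u[t] = (jx, jy) and visual-odometry
# body velocities v[t] = (linear, angular). Synthetic here so the sketch runs.
T = 500
u = np.random.uniform(-1.0, 1.0, size=(T, 2))
v = np.cumsum(0.01 * u, axis=0) + 0.02 * np.random.randn(T, 2)

def make_features(u, v, t, H):
    """Stack the last H joystick commands and velocities into one feature row."""
    return np.concatenate([u[t - H:t].ravel(), v[t - H:t].ravel()])

# Training set: features are the H-step history, target is the next velocity.
X = np.stack([make_features(u, v, t, H) for t in range(H, T)])
Y = v[H:T]

# Sparse GP regression (GPflow SGPR) with an ARD squared-exponential kernel.
kernel = gpflow.kernels.SquaredExponential(lengthscales=np.ones(X.shape[1]))
Z = X[np.random.choice(len(X), M, replace=False)].copy()
model = gpflow.models.SGPR(data=(X, Y), kernel=kernel, inducing_variable=Z)
gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)

def rollout(u_future, u_hist, v_hist, steps):
    """Autoregressive multi-step prediction: feed predicted velocities back in."""
    u_hist, v_hist = u_hist.copy(), v_hist.copy()
    preds = []
    for k in range(steps):
        x = np.concatenate([u_hist.ravel(), v_hist.ravel()])[None, :]
        mean, _ = model.predict_f(x)
        v_next = mean.numpy()[0]
        preds.append(v_next)
        u_hist = np.vstack([u_hist[1:], u_future[k]])  # slide joystick history
        v_hist = np.vstack([v_hist[1:], v_next])       # slide predicted velocity in
    return np.array(preds)

# Example: predict 5 steps ahead from the last H logged samples.
pred = rollout(u[T - 10:], u[T - 10 - H:T - 10], v[T - 10 - H:T - 10], steps=5)
print(pred)
```

In this sketch the short-term path could then be recovered by integrating the predicted body velocities; the paper's actual feature design, kernel, and inducing-point selection may differ.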