Backpropagation for Parametric STL

Karen Yan Ming Leung1, Nikos Arechiga2, Marco Pavone3

  • 1University of Washington
  • 2Toyota Research Institute
  • 3Stanford University

Details

13:00 - 17:30 | Sun 9 Jun | Room L118 | SuGT11.2

Session: ULAD: Unsupervised Learning for Automated Driving

Abstract

This paper proposes a method to evaluate Signal Temporal Logic (STL) robustness formulas using computation graphs. This method results in efficient computations and enables the use of backpropagation for optimizing over STL parameters. Inferring STL formulas from behavior traces can provide powerful insights into complex systems, such as long-term behaviors in time-series data. It can also be used to augment existing prediction and planning architectures by ensuring specifications are met. However, learning STL formulas from data is challenging from both a theoretical and a numerical standpoint. By evaluating and learning STL formulas using computation graphs, we can leverage the computational efficiency and utility of modern machine learning libraries. The proposed approach is particularly effective for solving parametric STL (pSTL) problems, i.e., fitting formula parameters to a given signal. We provide a relaxation technique that makes this method tractable when solving general pSTL formulas. Through a traffic-weaving case study, we show how the proposed approach is effective in learning pSTL parameters, and how it can be applied to scenario-based testing for autonomous driving and other complex robotic systems.
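To make the idea concrete, the sketch below is a minimal, illustrative example (not the authors' implementation or API) of fitting the parameter c of the pSTL formula "eventually (x > c)" by expressing a smooth robustness value as a PyTorch computation graph and running gradient descent. The logsumexp soft-max is one possible relaxation of the max over time; the paper's exact relaxation, formulas, and data may differ, and the fitting objective and all variable names here are assumptions chosen for illustration.

```python
# Illustrative sketch only: smooth robustness of "eventually (x > c)" as a
# computation graph, with gradient-based fitting of the pSTL parameter c.
import torch

def soft_robustness_eventually_gt(x, c, temp=10.0):
    """Smooth robustness of 'eventually (x > c)' over the time axis.

    x:    (batch, T) tensor of signal values
    c:    scalar parameter (learnable)
    temp: temperature; larger values approach the true max
    """
    margins = x - c  # pointwise robustness of the predicate x > c
    # Soft maximum over time approximates the 'eventually' operator
    return torch.logsumexp(temp * margins, dim=1) / temp

# Toy data (hypothetical): signals whose peaks cluster around 1.0
torch.manual_seed(0)
signals = torch.randn(64, 50) * 0.3
signals[:, 25] += 1.0

c = torch.tensor(0.0, requires_grad=True)
opt = torch.optim.Adam([c], lr=0.05)

for step in range(200):
    opt.zero_grad()
    rho = soft_robustness_eventually_gt(signals, c)
    # Example fitting objective: find a tight c with nonnegative robustness,
    # i.e. penalize violations while gently pushing c upward.
    loss = torch.relu(-rho).mean() + 0.01 * (-c)
    loss.backward()  # backpropagate through the robustness computation graph
    opt.step()

print(f"fitted c = {c.item():.3f}")
```

Because the robustness value is built from differentiable (relaxed) operations, any standard optimizer and autodiff library can fit the pSTL parameters; more complex formulas would compose such relaxed operators (soft min for "always", soft max for "eventually", etc.) in the same way.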