Learning to Localize, Grasp, and Hand Over Unmodified Surgical Needles

Albert Wilcox, Justin Kerr1, Brijen Thananjeyan2, Jeffrey Ichnowski3, Minho Hwang4, Samuel Paradis1, Danyal Fer5, Ken Goldberg2

  • 1University of California, Berkeley
  • 2UC Berkeley
  • 3Carnegie Mellon University
  • 4DGIST
  • 5University of California, San Francisco East Bay

Details

10:55 - 11:00 | Thu 26 May | Room 108B | ThA21.10

Session: Surgical Robotics: Steerable Catheters and Needles

Abstract

Robotic Surgical Assistants (RSAs) are commonly used by expert surgeons to perform minimally invasive surgeries. However, long procedures filled with tedious and repetitive tasks such as suturing can lead to surgeon fatigue, motivating the automation of suturing. Because visual tracking of a thin, reflective needle is extremely challenging, prior work has modified the needle with nonreflective contrasting paint. As a step toward automating a suturing subtask without modifying the needle, we propose HOUSTON: Hand Over of Unmodified, Surgical, Tool-Obstructed Needles, a problem and algorithm that uses a learned active sensing policy with a stereo camera to iteratively localize the needle and align it into a pose that is visible and accessible to the other gripper. To compensate for robot positioning and needle perception errors, the algorithm then executes a high-precision grasping motion that uses multiple cameras. Physical experiments with the da Vinci Research Kit (dVRK) suggest a success rate of 96.7% on needles used in training and 75–92.9% on needles unseen in training. On sequential hand overs, HOUSTON successfully executes 32.4 hand overs on average before failure. To our knowledge, this work is the first to study hand over of unmodified surgical needles. See https://tinyurl.com/houston-surgery for additional materials, including details about offline datasets and model architectures.
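The abstract describes an iterate-until-visible control structure: localize the needle with the stereo camera, re-orient it via the active sensing policy until it is visible and accessible, then hand off to a high-precision grasp. The minimal sketch below illustrates only that loop structure; the `localize_needle` and `realign` functions, thresholds, and iteration budget are illustrative stand-ins (not the paper's learned perception or policy components).

```python
# Illustrative sketch of the iterative localize-and-align loop described
# in the abstract. All names and numbers here are hypothetical stand-ins.

VISIBILITY_THRESHOLD = 0.9   # illustrative confidence cutoff for "visible/accessible"
MAX_ALIGN_STEPS = 10         # illustrative budget of re-alignment attempts

def localize_needle(misalignment):
    """Stand-in for stereo perception: map misalignment to a visibility score in [0, 1]."""
    return max(0.0, 1.0 - abs(misalignment))

def realign(misalignment):
    """Stand-in for the active sensing policy: reduce misalignment each step."""
    return misalignment * 0.5

def hand_over(initial_misalignment):
    """Iteratively localize and re-align; grasp once the needle is visible enough.

    Returns (success, steps_used).
    """
    m = initial_misalignment
    for step in range(MAX_ALIGN_STEPS):
        if localize_needle(m) >= VISIBILITY_THRESHOLD:
            # Needle is visible and accessible: execute the precision grasp.
            return True, step
        m = realign(m)
    return False, MAX_ALIGN_STEPS
```

In this toy version the misalignment halves each step, so even a fully misaligned needle converges within a few iterations; in the real system the loop would instead be driven by the learned policy's camera observations.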