Avi Caspi, Paul Rosendall, Jason W. Harper, Kapil Katyal, Michael P. Barry, Gislin Dagnelie, Arup Roy
11:30 - 13:30 | Fri 26 May | Emerald III, Rose, Narcissus & Jasmine | FrPS1T1
The Argus II retinal prosthesis has a dissociation between the line of sight of the camera and that of the eye. Because the image-capturing camera is mounted on the glasses, eye movements do not influence the visual information sent to the implanted electrodes. We have demonstrated a closed-loop setup that shifts the visual information based on real-time eye position. In contrast to previous experiments, the setup does not require head restraints: it is based on a self-calibrating mobile eye tracker that allows free head movements. The patient was asked to report the location of a white bar on a black background, and an internal sensor recorded the amount of head motion during the task. Results suggest that during combined eye-head scanning, head movement amplitude was significantly smaller than in the currently used head-only scanning. In combined eye-head scanning, the patient first orients toward the region of interest with eye movements, followed by head movements, as sighted individuals do. This is the first demonstration that eye movements can be used in combination with head movements to steer the line of sight of a camera-based retinal prosthesis.
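The gaze-contingent shift described above can be sketched as translating the region of the camera frame that is sampled for the electrodes by the current eye-in-head position. The sketch below is a minimal illustration under assumed names and parameters: the function, the pixels-per-degree gain, and the frame and window sizes are all hypothetical, not details from the abstract.

```python
def clamp(v, lo, hi):
    """Restrict v to the range [lo, hi]."""
    return max(lo, min(v, hi))

def gaze_shifted_window(frame_w, frame_h, win_w, win_h,
                        gaze_dx_deg, gaze_dy_deg, px_per_deg):
    """Return (x, y) of the top-left corner of the sampling window,
    shifted from the frame centre by the eye-in-head position reported
    by the eye tracker (degrees), converted to pixels via an assumed
    px_per_deg gain, and clamped so the window stays inside the frame."""
    cx = (frame_w - win_w) // 2
    cy = (frame_h - win_h) // 2
    x = clamp(cx + round(gaze_dx_deg * px_per_deg), 0, frame_w - win_w)
    y = clamp(cy + round(gaze_dy_deg * px_per_deg), 0, frame_h - win_h)
    return x, y

# Eye directed 5 degrees to the right: the sampled window shifts right.
print(gaze_shifted_window(640, 480, 160, 120, 5.0, 0.0, 10.0))  # → (290, 180)
```

In a closed-loop system this window computation would run once per camera frame with the latest eye-tracker sample, so the electrode image follows the eye rather than only the head.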