NeuroAR
AR in robotic neurosurgery
Abstract
Robotic needle-positioning tasks in neurosurgery are hampered by the limited spatial perception afforded by planar guidance images during surgery. In this work, we propose an Augmented Reality (AR) interface that supports robotic needle positioning through learning from demonstration (LfD). By displaying the surgical scene together with computed navigation information, the interface provides enhanced immersion in the workflow.
The framework combines interactive interfaces in virtual and real environments, improving both the efficiency and the quality of demonstrations. A head-mounted display and an optical tracking system provide visualization and needle tracking. A Gaussian Mixture Model (GMM) and Gaussian Mixture Regression (GMR) are employed to learn a robust and smooth trajectory policy from the demonstrations. Robot reproduction experiments on the needle-positioning task achieved a final-position error of 0.6 mm and an average trajectory error of 1.07 mm. A comparative user study against haptic-device-based teleoperation shows that the proposed system achieves a low completion time of 62.76 s and a reduced workload.
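To illustrate the GMM/GMR step named in the abstract, the following is a minimal sketch of learning a smooth trajectory policy from noisy demonstrations. It is not the paper's implementation: the synthetic sine-shaped demonstrations, the component count, and the 1-D state are illustrative assumptions, and scikit-learn's `GaussianMixture` stands in for whatever GMM library the authors used.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy demonstrations: five noisy 1-D trajectories x(t), a stand-in
# for recorded needle-tip paths parameterized by normalized time t.
rng = np.random.default_rng(0)
t = np.tile(np.linspace(0.0, 1.0, 100), 5)
x = np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.shape)
data = np.column_stack([t, x])

# Fit a GMM over the joint (t, x) space.
gmm = GaussianMixture(n_components=5, random_state=0).fit(data)

def gmr(t_query):
    """Gaussian Mixture Regression: E[x | t] from the joint GMM."""
    t_query = np.atleast_1d(np.asarray(t_query, dtype=float))
    means, covs, weights = gmm.means_, gmm.covariances_, gmm.weights_
    preds = np.zeros_like(t_query)
    for j, tq in enumerate(t_query):
        # Responsibility h_k of each component for the input t.
        h = np.array([
            w * np.exp(-0.5 * (tq - m[0]) ** 2 / c[0, 0]) / np.sqrt(c[0, 0])
            for w, m, c in zip(weights, means, covs)
        ])
        h /= h.sum()
        # Conditional mean of x given t for each component.
        cond = np.array([
            m[1] + c[1, 0] / c[0, 0] * (tq - m[0])
            for m, c in zip(means, covs)
        ])
        preds[j] = h @ cond  # responsibility-weighted average
    return preds

# Reproduce a smooth trajectory by querying the learned policy.
traj = gmr(np.linspace(0.0, 1.0, 50))
```

The key design point is that GMR gives a closed-form conditional expectation, so the reproduced trajectory is smooth by construction, averaging across demonstrations rather than replaying any single one.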