Quantitative Evaluation of User Performance in Minimally Invasive Surgical Procedures
Sathia Narayanan, Madusudanan
Skilled human interactions are efficient, repeatable, and readily recognizable. Identifying, analyzing, and verifying skilled activities (and their performers) is critical to unlocking the potential of human-machine interfaces (HMIs) used in countless robot teleoperation settings, from remote-controlled vehicles to robotic surgery. Yet the underlying processes of such interactions, with their inter-coupled perceptual, sensory, and cognitive aspects, remain elusive and difficult to characterize, let alone quantify, and this gap motivates our efforts. We study human motor behavior using system-identification principles and controlled experimentation, modeling manipulation and interaction performance as quantifiable skill levels. Rather than an abstract treatment, we concretize our efforts in the context of surgical procedural assessment and training. Although developing a general assessment framework for use in clinical curricula is part of our long-term plan, here we present quantitative analyses specific to minimally invasive procedures: (a) robotic laparoscopic or minimally invasive surgeries (MIS) and (b) percutaneous needle biopsies (PNBs). These procedures demand expertise not only in executing efficient motor actions but, more importantly, in making reliable decisions based on continuous sensory and cognitive feedback. By formulating quantitative methods to evaluate surgeons and physicians in these two cases, we also move toward a unified, generalizable, and scalable assessment framework applicable to other procedures. A combination of virtual and physical experiments involving phantoms and cadavers (approved by the SUNY HSIRB) was administered to validate our methods and performance metrics. Surgeons and trainees with varying levels of expertise were recruited from the respective specialties.
For the MIS case studies, Intuitive Surgical's da Vinci surgical robot with its Skills Simulator and a custom-built laparoscopic box trainer with instrumented tools served as testbeds to generate the desired data corpus. The PNB experiments were conducted using our simulator-trainer framework, the Augmented Reality SIMulator for Biopsies (AR-SIMBiopsies), which replicates the 'feel' and 'look' of typical tissue phantoms and enables seamless recording of surgical force and motion signatures under different (in-vitro and ex-vivo) scenarios. We proposed a discrete finite-state segmentation approach based on fundamental surgical motion blocks called Therbligs. The raw experimental data were manually annotated with Therblig labels for each active tool to form a ground-truth dataset, and the resulting time series were post-processed (filtering, interpolation, differentiation, and normalization) to train and validate our Therblig classifiers (T-class). A comparative analysis of the predicted Therbligs across users revealed discriminative signatures of experts and trainees, which were used to quantify surgical efficacy (dexterity, motion and force economy). Task segmentation for both case studies yielded additional performance measures that identify skill deficiencies and ineffective motions, supporting the concurrent validity of our assessment metrics. For the specific case of PNB, the significance of force modulation and force-based measures was also demonstrated in quantitative terms. The final part of this research outlines our ongoing work toward a purely video-based surgical performance evaluation and feedback framework, with modules for tool detection, tracking, semantic identification, and pose estimation. Preliminary results from this framework on real surgical data are presented before discussing the limitations of our current efforts and future work.
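The post-processing chain applied to the raw tool time series (filtering, interpolation, differentiation, and normalization) can be illustrated with a minimal sketch. This is a hypothetical example, not the thesis's actual implementation: the function name, the moving-average filter, and the window size are assumptions, since the abstract does not specify the filters or parameters used before training the Therblig classifiers.

```python
import numpy as np

def preprocess_tool_signal(t, x, window=5):
    """Post-process one raw tool-motion channel for classifier training.

    Illustrative sketch only: fills dropped samples, smooths, differentiates
    to a velocity estimate, and z-score normalizes the result.
    """
    x = np.asarray(x, dtype=float)
    t = np.asarray(t, dtype=float)

    # 1. Interpolation: fill dropped samples (NaNs) linearly over time.
    mask = np.isnan(x)
    if mask.any():
        x[mask] = np.interp(t[mask], t[~mask], x[~mask])

    # 2. Filtering: moving-average low-pass to suppress sensor noise
    #    (an assumed choice; any low-pass filter would serve here).
    kernel = np.ones(window) / window
    x_smooth = np.convolve(x, kernel, mode="same")

    # 3. Differentiation: velocity estimate from the smoothed position.
    v = np.gradient(x_smooth, t)

    # 4. Normalization: z-score so channels are comparable across users.
    return (v - v.mean()) / v.std()
```

Each annotated tool channel, processed this way, would yield feature vectors comparable across users, which is what allows a per-sample Therblig classifier to be trained and validated against the manual ground-truth labels.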