How Intuitive Surgical advances robotic-assisted surgery with machine learning

Problem

One of the primary bottlenecks to advancing minimally invasive robotic surgery is obtaining the labeled data needed to train machine learning models and ensuring that data is organized under a consistent ontology.

Solution

Intuitive Surgical leveraged Labelbox’s Annotate product, using the native video editor to annotate their unstructured data, measure labeling velocity and efficiency, and speed up annotation workflows between domain experts and labelers using model-assisted labeling.

Result

The Intuitive Surgical data science team now takes a data-driven approach to scaling their labeling efficiency and throughput, lowering the overhead of gathering performance and quality metrics and doubling the speed at which they deliver labeled datasets for their multiple ML models.

Note: This post is a shortened recap of a virtual talk from Xi Liu, Manager of Machine Learning and Digital Solutions at Intuitive Surgical, during Labelbox Accelerate (Nov 2022).


Intuitive Surgical is a global technology leader in minimally invasive care and the pioneer of robotic-assisted surgery. Their products are designed to improve clinical outcomes for patients through minimally invasive surgery, most notably with the da Vinci Surgical System. Intuitive Surgical’s data science team has been focused on leveraging the latest advances in machine learning to drive novel insights and facilitate surgical data science and discovery. 


Surgical interventions have evolved considerably in recent decades. Minimally invasive surgery has been shown to help surgeons operate more efficiently and to accelerate patient recovery with less trauma to the body. Surgeons can now harness specialized tools like the da Vinci System to operate through small incisions using tiny wristed instruments. Examples of AI-powered applications that currently operate behind the scenes with the help of ML include assessing surgical performance and efficiency, identifying skilled tool use and choreography, and better planning of operating room resources. However, one of the primary bottlenecks to speeding up these advances remains the need to obtain and refresh all of the labeled data for training these machine learning models.


One initiative Intuitive Surgical’s data science team tackled was automatically detecting and tracking surgical instruments in surgical videos, with the goal of enabling transformational interventions. On the patient side, a robotic manipulator directly mimics the surgeon’s hands to perform surgery on the patient’s body, and across this fleet of robotic systems the Intuitive Surgical team gains access to a rich set of data. Annotating bounding boxes frame by frame in tens of thousands of videos is a tedious and time-consuming process, because a large variety of surgical tools and procedures must be captured for robust model training. The team’s approach was to find ways not just to label data, but to exploit informative signals such as timestamps of instrument installation and removal.
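To make the timestamp idea concrete, here is a minimal sketch (hypothetical event data, frame rate, and function names, not Intuitive Surgical’s actual pipeline) of how instrument install/remove timestamps from system logs could restrict frame-by-frame bounding-box annotation to only the intervals when a tool is actually on screen:

```python
# Hypothetical sketch: use instrument install/remove timestamps to limit
# which video frames need bounding-box annotation. The event times, frame
# rate, and video length below are made-up example values.

FPS = 30  # example frame rate

# (install_time_s, remove_time_s) for each instrument, from system logs
instrument_events = [
    (12.0, 95.5),   # e.g. a needle driver
    (40.0, 60.0),   # e.g. a grasper
]

def frames_needing_labels(events, fps, total_frames):
    """Return sorted frame indices that fall inside any install/remove
    interval; frames outside every interval can be skipped entirely."""
    needed = set()
    for start_s, end_s in events:
        first = int(start_s * fps)
        last = min(int(end_s * fps), total_frames - 1)
        needed.update(range(first, last + 1))
    return sorted(needed)

frames = frames_needing_labels(instrument_events, FPS, total_frames=3600)
print(f"{len(frames)} of 3600 frames need annotation")  # → 2506 of 3600
```

Even in this toy example, roughly 30% of the frames drop out of the annotation queue; in long procedures where instruments sit idle off-camera, the savings can be much larger.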


To accomplish this, Intuitive Surgical leveraged Labelbox’s Annotate product, using the native video editor to label their unstructured data, measure labeling velocity and efficiency with detailed metrics, and speed up annotation workflows between domain experts and labelers using model-assisted labeling. Labelbox’s tools helped significantly reduce the annotation workload needed to train robust surgical tool detection and localization models, while tracking the training data used across different model versions.
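The core loop of model-assisted labeling can be sketched as follows. This is a generic illustration with stand-in names (the `Box` class, `propose_prelabels`, `triage`, and the 0.8 threshold are all hypothetical), not the Labelbox API or Intuitive Surgical’s implementation: a current model proposes bounding boxes as pre-labels, confident predictions are accepted with light review, and low-confidence ones are routed to human labelers for correction.

```python
# Hypothetical sketch of model-assisted labeling triage. The detection
# model is stubbed out; names and the confidence threshold are invented
# for illustration.

from dataclasses import dataclass

@dataclass
class Box:
    frame: int
    x: float
    y: float
    w: float
    h: float
    label: str
    confidence: float

def propose_prelabels(frames):
    """Stand-in for running the current detection model over frames."""
    return [Box(frame=f, x=0.1, y=0.2, w=0.3, h=0.3,
                label="needle_driver",
                confidence=0.9 if f % 2 else 0.5)
            for f in frames]

def triage(prelabels, threshold=0.8):
    """Accept confident boxes as pre-labels; queue the rest for review."""
    accepted = [b for b in prelabels if b.confidence >= threshold]
    review = [b for b in prelabels if b.confidence < threshold]
    return accepted, review

accepted, review = triage(propose_prelabels(range(10)))
print(len(accepted), len(review))  # → 5 5
```

The design point is that labelers spend their time correcting the model’s mistakes rather than drawing every box from scratch, which is where the workload reduction described above comes from.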


According to Xi Liu, who manages the spatial labeling and data science team at Intuitive Surgical, “The key question we are tackling is how do we make surgery a better experience? The goal is to achieve a more efficient annotation pipeline so that, given all this rich data that we are collecting, we can provide insights with actionable and trusted feedback that helps surgeons improve their performance. We rely on collaborative software to help align our different teams, such as our clinical teams and data science teams, to ensure that we have a clearly defined ontology. This ensures that all labeling activities are consistent and provide meaningful value to our models.”


Since adopting the Labelbox platform, the Intuitive Surgical team can offer richer spatial annotation for their video projects and label where objects of interest appear in the camera feed for classification. Furthermore, the team can now scale their labeling efficiency and throughput, lower the overhead needed to gather metrics on performance and quality, and double the speed at which they deliver labeled datasets for their multiple ML models.