Do Robots Dream of Electric Scalpels?
Creators of surgical robots are embracing superhuman vision and automation to make surgeries of the future easier, and big data may also make the world’s best surgical techniques available to any surgeon—anywhere, anytime.
©2014 Intuitive Surgical Inc.
If you expect hospitals in the future to whisk patients into surgeries on a conveyor belt to be operated on expeditiously by autonomous robots, you’re about to be gravely disappointed.
While this may be technically possible someday, no patient is likely to want it, and researchers are just now beginning to introduce automation and intelligence to surgical robots. Their express goal is to aid—not replace—surgeons.
Today’s state-of-the-art surgical robots are teleoperation systems that keep the surgeon in control of their every move. The robot follows the movement of the surgeon’s hands, while the surgeon guides the instruments through an extremely detailed, high-resolution visual interface.
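Hand-following teleoperation of this kind typically combines two well-known ideas: motion scaling (the tool moves only a fraction of the hand’s motion) and low-pass filtering to suppress natural hand tremor. Here is a minimal sketch of that mapping in Python; the 3:1 scale factor and exponential filter constant are illustrative assumptions, not any vendor’s actual control code:

```python
# Sketch of a teleoperation mapping: surgeon hand motion -> tool motion.
# The scale factor and filter constant are illustrative, not real system values.

SCALE = 1.0 / 3.0   # 3:1 motion scaling: tool moves 1 mm per 3 mm of hand motion
ALPHA = 0.2         # exponential low-pass coefficient to damp high-frequency tremor

def filter_and_scale(hand_deltas):
    """Map a stream of hand displacements (mm) to tool displacements (mm)."""
    tool_deltas = []
    smoothed = 0.0
    for d in hand_deltas:
        smoothed = ALPHA * d + (1 - ALPHA) * smoothed  # low-pass filter
        tool_deltas.append(smoothed * SCALE)           # motion scaling
    return tool_deltas

# A steady 3 mm hand motion with a brief tremor spike: the spike is damped
# and every motion is scaled down before reaching the tool.
motions = [3.0, 3.0, 9.0, 3.0, 3.0]
print([round(t, 3) for t in filter_and_scale(motions)])
```

The combination is what makes fine work feel natural: large, comfortable hand motions become small, steady tool motions.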
“Surgeons do four key things: we spend a bit of time getting into the problem, which means making a big incision or putting keyhole instruments into cavities; next, we spend most of our time getting rid of problems like tumors or injured tissue; then we spend time reconstructing; and finally, we close,” explained Peter C. Kim, a pediatric surgeon and scientist, as well as the vice president of the Sheikh Zayed Institute for Pediatric Surgical Innovation at Children’s National Health System in Washington, D.C.
Surgeons rely on their eyes, hands and intelligence to do their jobs well. Robotic systems offer enhanced vision beyond human capabilities, providing real-time data to the operator, as well as programming that captures best practices and synthesizes the collective intelligence of many of the world's best surgeons. This benefits both the people undergoing surgery and the surgical teams operating on them.
The Future is Here, Almost
Surgical robots are already dramatically changing how surgeons operate. But, importantly, they’re also helping patients by reducing the invasiveness of their operations—which means less pain, shorter recovery times and smaller scars.
From a surgeon’s perspective, Kim feels “comfortable, inspired, and assured” using surgical robots. “Our goal is to make surgery safer and more effective,” he said. “Adding intelligence and an autonomous sort of functionality to surgery, and programming the best techniques into it, will essentially make the technology available to everyone. This will decrease variations among surgeons so that in the future you won’t need to worry about where to go if you or your loved ones happen to need an operation.”
"Eventually, this will be available to everyone—no matter where you go for your surgery,” he added.
Meet the Robots: da Vinci and Raven
Researchers in universities and hospitals, as well as private companies, are currently developing different approaches to robotic surgery. While by no means an exhaustive list, robots such as Intuitive Surgical’s da Vinci, the University of Washington’s Raven, and STAR from Sheikh Zayed Institute for Pediatric Surgical Innovation at Children’s National Health System are among those helping to shape the future of surgery.
Credit: ©2014 Intuitive Surgical Inc.
In 2000, Intuitive Surgical’s da Vinci became the first robotic-assisted surgical system cleared by the U.S. Food & Drug Administration for general laparoscopic surgery. It has since been used for more than 3 million minimally invasive procedures including urology, gynecology, general surgery, thoracic surgery and cardiac surgery.
Surgeons perform procedures with da Vinci while seated at a console within the operating room, in close proximity to the patient and surgical support staff.
When first sitting down at the da Vinci system’s surgeon console, surgeons are often struck by “how natural the control of the instrument is,” said Catherine Mohr, vice president of medical research for Intuitive Surgical. “People struggle to find an adjective other than ‘intuitive’ to describe the feeling. But the biggest ‘aha!’ moment comes with the new sight they’re given. Magnified, three-dimensional high-definition vision allows surgeons to see and operate with enhanced vision.”
Imaging innovations such as Firefly, Intuitive Surgical’s fluorescence imaging system, and TilePro, a picture-in-picture capability, allow surgeons “to see in near-infrared or view multiple images alongside the surgical site view to help identify target anatomy,” she added. When a surgeon uses Firefly imaging with an injectable fluorescent dye, for example, tissue with blood flow is highlighted in green, while tissue without blood flow appears gray in the surgeon’s view, so they can easily see the difference between the two.
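Conceptually, the perfusion display Mohr describes is a per-pixel mapping: where the near-infrared fluorescence signal is strong, the display tints the tissue green; where it is absent, the tissue renders in plain grayscale. A toy sketch of that idea, where the threshold and color values are illustrative assumptions and not Intuitive Surgical’s actual algorithm:

```python
# Toy model of a fluorescence overlay: map a near-infrared (NIR) intensity
# image to display colors. The threshold and colors are illustrative only.

NIR_THRESHOLD = 0.3  # assumed cutoff separating perfused from non-perfused tissue

def colorize(nir_pixel, gray_pixel):
    """Return an (r, g, b) display value for one pixel.

    nir_pixel:  fluorescence intensity in [0, 1]
    gray_pixel: ordinary grayscale intensity in [0, 1]
    """
    if nir_pixel >= NIR_THRESHOLD:
        # Tissue with blood flow: tint green, brighter with stronger signal.
        return (0.0, min(1.0, gray_pixel + nir_pixel), 0.0)
    # No blood flow detected: show plain grayscale.
    return (gray_pixel, gray_pixel, gray_pixel)

def colorize_image(nir_image, gray_image):
    return [[colorize(n, g) for n, g in zip(nrow, grow)]
            for nrow, grow in zip(nir_image, gray_image)]

nir  = [[0.9, 0.0], [0.5, 0.1]]   # fluorescence intensities
gray = [[0.4, 0.4], [0.4, 0.4]]   # ordinary view
print(colorize_image(nir, gray))
```

The value of the real system lies in the dye chemistry and near-infrared optics; the display step itself is this simple contrast between green and gray.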
Raven, a Semi-Autonomous Surgical Robot
Researchers at the University of Washington, in Seattle, created Raven, a semi-autonomous surgical robot, to support surgeons by delivering greater dexterity and accuracy for abdominal and even delicate brain procedures.
Credit: UW BioRobotics Lab.
“We’ve been developing this technology for 15 years,” said Blake Hannaford, professor of electrical engineering and founder of the University of Washington’s BioRobotics Laboratory. “In 2002, we built a version for the military with a goal of having surgical robotics inside an armored vehicle and combat casualty care near the front lines. Then, we began using Raven as a platform within our lab for a variety of research into surgical robotics. This was right when surgical robotics started taking off as a research field and business, thanks to the success of Intuitive Surgical.”
Hannaford and colleagues pulled together a team of universities around the U.S. working on surgical robotics and, with the help of a National Science Foundation grant, built seven Ravens to distribute. “Others heard about it and said ‘we want one too,’” Hannaford recalled. To meet this demand, they launched a spinout called Applied Dexterity, which sells Raven robots to universities and research groups who want a platform to study or prototype robotic surgery.
They’ve done some high-profile projects—including hauling Raven out to a California desert, where a drone flying overhead relayed an Internet signal so surgeons could operate the robot remotely.
In another project with NASA and the National Oceanic and Atmospheric Administration, the researchers sent Raven into an underwater habitat 60 feet below the ocean’s surface off the Florida Keys. The mission required getting Raven into “a more rugged and productized state than a typical university project,” Hannaford said, but it ultimately proved successful.
So what differentiates Raven from the da Vinci? “One of the most salient things, aside from innovative hardware, is that it’s a very compact mechanism compared to the da Vinci,” Hannaford said. “Raven doesn’t do every function that the da Vinci does—so it’s not a totally fair comparison. Another big feature is our open source software stack. Because it’s a research system, it’s a big advantage that all of the software running it is open source.”
Raven’s software stack starts with the Linux kernel, and uses control software akin to a device driver for many of its control functions. “It runs in real-time mode in the Linux kernel,” explained Hannaford. “Our code is open source on GitHub, with a layer called the Robot Operating System (ROS), which is the most widely used software package in robotics around the world. We’re fully integrated with ROS.” This allows researchers to write their applications and study Raven’s capabilities within a widely accessible programming environment.
Perhaps the best part: researchers can experiment with the code without even obtaining a robot. “Some have used the code for research and gotten a cool result, then asked if they could visit our lab to try it out on a real Raven because they know it works on the software,” said Hannaford.
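The “real-time mode” Hannaford mentions boils down to a fixed-rate control loop with hard deadlines: read sensors, compute a command, write to the motors, and finish before the next tick. The following skeleton is a simplified, hypothetical illustration of that pattern; the 1 kHz rate and the stub functions are assumptions for the sketch, not Raven’s actual driver code:

```python
import time

# Simplified skeleton of a fixed-rate control loop, the pattern used by
# real-time robot device drivers. Rate and stubs are illustrative only.

LOOP_HZ = 1000            # assumed control rate
PERIOD = 1.0 / LOOP_HZ

def read_sensors():
    return 0.0            # stub: joint encoder readings would go here

def compute_command(state):
    return state          # stub: the control law (e.g., PID) would go here

def write_motors(cmd):
    pass                  # stub: motor current commands would go here

def control_loop(n_ticks):
    """Run n_ticks iterations; count deadline overruns."""
    overruns = 0
    next_deadline = time.monotonic() + PERIOD
    for _ in range(n_ticks):
        write_motors(compute_command(read_sensors()))
        now = time.monotonic()
        if now > next_deadline:
            overruns += 1                       # missed tick: a fault in hard real-time
        else:
            time.sleep(next_deadline - now)     # wait out the rest of the period
        next_deadline += PERIOD
    return overruns

print("deadline overruns:", control_loop(100))
```

In a hard real-time driver, a missed deadline is treated as a fault rather than silently tolerated, which is why the loop runs inside the kernel rather than as an ordinary user process.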
Outperforming Standard Clinical Techniques
The smart tissue autonomous robot (STAR) is a robotic surgery technology first demonstrated in 2016 by surgeons and scientists from Sheikh Zayed Institute for Pediatric Surgical Innovation at Children’s National Health System. They demonstrated supervised, autonomous robotic soft tissue surgery—performing an anastomosis, in this case suturing two bowel ends together—on a live pig, in vivo, in an open surgical setting. They showed that surgery with the robot is not only possible but can actually outperform standard clinical techniques in a dynamic clinical setting.
STAR’s main goal is to complement and supplement human skill with computer vision and intelligence. “We’re essentially trying to do things better than human capability permits,” said Kim.
Kim’s group chose to perform an anastomosis as the initial test (on the pig) because it’s a complex soft tissue surgery performed more than one million times each year in the U.S. “STAR uses a vision system to guide the best location, and then intelligence algorithms can do the stapling or suturing,” said Kim. “It’ll be done the best way it can be done by a human, or, ultimately, the very best way possible when you add machine-learning algorithms and it continues to improve.”
This is a big deal because more than 44.5 million soft tissue surgeries are performed within the United States each year. STAR is still a research prototype, but Kim said that the goal is to commercialize it within 3 to 4 years. Some of STAR’s individual technology systems, such as enhanced vision and intelligence, could become available even sooner.
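Kim’s description—a vision system proposes where each stitch should go, then the robot executes—has a simple geometric core: given the detected edge of the tissue to be joined, space the sutures evenly along it. The sketch below is a hypothetical illustration of that planning step; the polyline edge representation and 4 mm spacing are assumptions, not STAR’s actual planner:

```python
import math

# Hypothetical suture-placement sketch: given a detected tissue edge as a
# 2D polyline (in mm), return evenly spaced suture points along its length.

def polyline_length(points):
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def point_at(points, target):
    """Return the point a distance `target` along the polyline."""
    travelled = 0.0
    for a, b in zip(points, points[1:]):
        seg = math.dist(a, b)
        if travelled + seg >= target:
            t = (target - travelled) / seg
            return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
        travelled += seg
    return points[-1]

def plan_sutures(edge, spacing_mm):
    """Evenly distribute suture points along a detected tissue edge."""
    total = polyline_length(edge)
    n = max(2, int(total // spacing_mm) + 1)
    return [point_at(edge, total * i / (n - 1)) for i in range(n)]

edge = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]   # detected edge from vision, mm
print(plan_sutures(edge, spacing_mm=4.0))
```

The hard problems in the real system—tracking soft tissue that deforms as you work, and driving the needle accurately—sit on either side of this planning step; the even spacing itself is what gives machine suturing its consistency advantage.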
The Challenge of Making a Safe and Effective Robot
The field of robotics is remarkably interdisciplinary, so the challenges of each subdiscipline, like mechanics or control systems, are topped by the meta-challenge of integrating all of these pieces together.
Credit: Sheikh Zayed Institute for Pediatric Surgical Innovation, Children’s National Health System.
For starters, surgical robots bring “special mechanical engineering challenges because they must be precise, but also need to be elongated, narrow, and lightweight—and these things go against each other,” Hannaford said. “Industrial robots typically offer more precision than surgical robots. But industrial robots are very bulky and heavy, because they tend to be made of solid steel. In comparison, the device at the end of a surgical robot’s arm might be 30 centimeters long and about 1 centimeter in diameter. Its very elongated nature makes transmitting the motion accurately through it a very difficult mechanical engineering challenge.”
Another giant hurdle to overcome is the software involved in introducing automation. “It must be very powerful, but we also need to show that it’s safe and constantly under the control of the surgeon,” said Hannaford. “This means factoring in cybersecurity concerns, although they’re similar to those for banking or secure video communication. Only a few are unique to surgical robotics, but we are paying attention to them.”
As the first company to bring surgical robots to market, Intuitive Surgical’s Mohr noted that they faced special hurdles. “The biggest challenge is often ensuring that you’re trying to solve the right problems,” she said. “For every technology you think about applying, you need to ask: ‘How will this make the result better for the patients?’ and have a good answer to that question. Pioneering technology for a new surgical approach comes with challenges, but it also provides the opportunity to find solutions in new and creative ways.”
When it comes to optics, “if we start with a technology such as a certain type of camera chip, and ask ‘how do we apply it?’ we can probably get to a really good design that uses that chip,” Mohr added. “By asking: ‘What challenges should we work on if we want to let the surgeon see things they can’t see with their own eyes that would improve the surgery?’ we start looking at technologies like fluorescence imaging or putting chips on the tip of the endoscope so we can make a flexible scope, or thinking about integrating overlay 3D images of the anatomy into the surgeon’s view. We end up with a better solution.”
In the future, infrared vision that’s beyond the human visual spectrum—with or without dye to help visualize, identify, and track biological structures and surgical instruments—may move surgery in the same direction as driverless cars.
“Driverless cars started with cruise control, and then moved on to parallel parking, lane departure warning, and eventually self-driving,” said Kim. “Eventually, surgical robots will visualize these critical structures and suggest guidelines of how surgeons can operate in real-time and work together. Ultimately, we may allow them to independently perform the parts of the surgery that they can do better.”
Big Data Creates Smarter Surgical Robots
While today’s surgical robots can give surgeons easy access to best practices, tomorrow’s robots might even help shape the guidelines. Big data is expected to give robots a boost of serious smarts in the not-too-distant future. As Hannaford pointed out, Intuitive Surgical’s da Vinci robot has performed more than 3 million operations on people, but today each is done in isolation. “Potentially, all of that data can go into the cloud,” he explained. “Information from each operation can be used to create a huge database to make the robots smarter and safer.”
Or it could generate statistical norms for a given operation and perhaps even alert surgeons: “Hey, hold on, what you’re doing here is extremely unusual. Is there a good reason for it?” “The robot would have a good way to know something’s unusual by consulting data from 3 million operations,” said Hannaford. “It’s quite an exciting potential that people are recognizing. Verb Surgical is known to be working on this, but others are too.”
Verb is a stealthy joint venture between Alphabet Inc.’s Verily and Johnson & Johnson’s Ethicon. Hannaford worked for Google for a few years and helped launch Verb. Their strategy, he said, is to use big data and the cloud to improve future models of surgical robots.
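At its core, the “statistical norm” alert Hannaford imagines is straightforward outlier detection: compare a metric from the current operation against its distribution across the historical database, and flag large deviations. Here is a minimal sketch using a z-score test; the chosen metric (time spent on one surgical step) and the 3-sigma alert threshold are illustrative assumptions:

```python
from statistics import mean, stdev

# Minimal sketch of a surgical "statistical norm" alert: flag the current
# procedure when a metric deviates strongly from the historical database.
# The metric (step duration) and the 3-sigma threshold are illustrative.

Z_ALERT = 3.0  # flag anything more than 3 standard deviations from the norm

def check_against_norms(historical, current):
    """Return (z_score, alert?) for one metric from the current operation."""
    mu, sigma = mean(historical), stdev(historical)
    z = (current - mu) / sigma
    return z, abs(z) > Z_ALERT

# Historical minutes spent on one step of this operation, drawn (in a real
# system) from a cloud database of prior cases.
history = [22, 25, 24, 23, 26, 24, 25, 23, 24, 25]
print(check_against_norms(history, current=45))  # an unusually long step
```

A production system would compare many metrics at once and account for patient-to-patient variation, but the principle is the same: millions of prior operations define what “normal” looks like.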
Truly Autonomous Surgeries are Unlikely Soon, If Ever
The concept of autonomous devices can mean many different things to different people. When Kim compared “driverless” cars to surgical robots, the key similarity was automation. But it’s critical to remember that there’s a lot more brainpower and training involved in performing surgeries than simply driving a car.
Replacing a driver is one thing, but “replacing surgeons, who are among the most highly trained people within the human population, is quite another,” said Hannaford. “It’s not really about replacing the surgeon—we’re trying to supplement and augment the surgeon. One day a surgeon may possibly become a specialist who decides what programs to run on a patient, but we’re very far from that…much farther away than we are from driverless cars.”
Expect innovations in terms of “less invasive approaches, imaging, digitization, and optimized learning on robotic platforms,” Mohr said. Intuitive Surgical continues to focus its efforts on “developments to help us all realize that future.”
And while Kim would like to see truly autonomous surgeries one day, he doesn’t want to guess a timeframe. But his group does have a little proposal for NASA, because if you’re sent to Mars and happen to be left all alone and impaled by something—like in The Martian by Andy Weir—and need to do a procedure on yourself…wouldn’t it be nice to have a medical pod with intelligent vision and the dexterity to help save you? “Simple surgeries require precision and repetition, but ultimately some part of surgery will be autonomously done—not automated—with the intelligent assistance of a smart technology,” he said.