Dr. Danyal Fer sat at a console several feet from a long-armed robot, his hands on two metal grips near his chest.
As he moved the handles up, down, left and right, the robot mimicked each small motion with its own two arms. Then, as he squeezed his thumb and forefinger together, one of the robot’s tiny claws did the same. This is how surgeons like Dr. Fer have long used robots to operate on patients: they can remove a prostate, for instance, while sitting at a computer console across the room.
But after this brief demonstration, Dr. Fer and his fellow researchers at the University of California, Berkeley, showed how they hope to advance the state of the art. A new kind of computer software took over. As he and the other researchers watched, the robot began to move entirely on its own.
With one claw, the machine lifted a tiny plastic ring from an equally tiny peg on the table, passed the ring from one claw to the other, moved it across the table and hooked it onto a new peg. The robot then did the same with several more rings, completing the task about as quickly as it had when guided by Dr. Fer.
The training exercise was originally designed for humans; moving the rings from peg to peg is how surgeons learn to operate robots like the one in Berkeley. Now, an automated robot performing the test can match or even exceed a human in dexterity, precision and speed, according to new research from the Berkeley team.
The project is part of a broader effort to bring artificial intelligence into the operating room. Using many of the same technologies that underpin self-driving cars, autonomous drones and warehouse robots, researchers are working to automate surgical robots as well. These methods are still a long way from everyday use, but progress is accelerating.
“It’s an exciting time,” said Russell Taylor, a Johns Hopkins University professor and former IBM researcher known in academia as the father of robotic surgery. “It’s where I wish we had been 20 years ago.”
The aim is not to remove surgeons from the operating room, but to ease their load and perhaps raise success rates, which have room for improvement, by automating particular phases of surgery.
Robots can already exceed human precision on some surgical tasks, such as inserting a pin into a bone, a particularly risky job during knee and hip replacements. The hope is that automated robots can bring similar precision to other tasks, such as incisions and suturing, and reduce the risks that come with overworked surgeons.
During a recent phone call, Greg Hager, a computer scientist at Johns Hopkins, said that surgical automation would progress much like the Autopilot software that was guiding his Tesla along the New Jersey Turnpike as he spoke. The car was driving on its own, he said, but his wife still had her hands on the wheel in case anything went wrong, and she would take over when it was time to exit the highway.
“We can’t automate the entire process, at least without human supervision,” he said, “but we can start building automated tools that make a surgeon’s life a little easier.”
Five years ago, researchers at the Children’s National Health System in Washington, D.C., designed a robot that could automatically suture a pig’s intestines during surgery. It was a notable step toward the kind of future Dr. Fer envisions. But it came with an asterisk: the researchers had implanted tiny markers in the pig’s intestines that emitted near-infrared light and helped guide the robot’s movements.
The method is far from practical, because the markers are not easily implanted or removed. But in recent years, artificial intelligence researchers have significantly improved the power of computer vision, which could allow robots to perform surgical tasks on their own, without such markers.
The change is driven by so-called neural networks, mathematical systems that can learn skills by analyzing vast amounts of data. By analyzing thousands of cat photos, for example, a neural network can learn to recognize a cat. In much the same way, a neural network can learn from images captured by surgical robots.
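To make that learning loop concrete, here is a minimal sketch of the idea, a tiny neural network written with NumPy that gets better at a toy classification task by repeatedly adjusting its weights against labeled examples. The data, network size and numbers are illustrative assumptions, not anything from the Berkeley system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for "photos": 2-D points, labeled 1 when x + y > 0.
X = rng.normal(size=(500, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# A tiny network: one hidden layer of 8 units, weights start random.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):
    # Forward pass: predict a label for every example.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: nudge every weight to shrink the prediction error.
    g_out = (p - y) / len(X)
    g_h = (g_out @ W2.T) * (1.0 - h ** 2)
    W2 -= lr * (h.T @ g_out); b2 -= lr * g_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ g_h);   b1 -= lr * g_h.sum(axis=0, keepdims=True)

accuracy = float(((p > 0.5) == (y > 0.5)).mean())
```

The network starts out guessing at random and, after enough passes over the data, classifies nearly all the examples correctly; swapping the toy points for surgical video frames is what makes the real problem hard.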
The surgical robots are equipped with cameras that record three-dimensional video of each surgery. The video streams into a viewfinder that surgeons peer into while guiding the operation, watching from the robot’s point of view.
Afterward, these images also provide a detailed road map of how surgeries are performed. They can help new surgeons understand how to use the robots, and they can help train robots to handle tasks on their own. By analyzing images showing how a surgeon guides the robot, a neural network can learn the same skills.
This is how the Berkeley researchers have been working to automate their robot, which is based on the da Vinci Surgical System, a two-armed machine that helps surgeons perform more than a million procedures a year. Dr. Fer and his colleagues collected images of the robot moving the plastic rings while under human control. Then their system learned from these images, pinpointing the best ways to grab a ring, pass it between claws and move it to a new peg.
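In spirit, this is a form of imitation learning: the system sees pairs of what the camera observed and what the human operator did, then learns to predict the action from the observation. A stripped-down, hypothetical sketch follows, with short vectors of numbers standing in for camera images and claw motions; none of the names or values reflect the team’s actual code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical demonstrations: each row pairs an observed state
# (a 4-number stand-in for a camera frame: claw position, ring position)
# with the action the human operator took (a 2-D claw motion).
states = rng.uniform(-1, 1, size=(200, 4))
# In this toy world, the "expert" always moves the claw toward the ring.
expert_actions = states[:, 2:] - states[:, :2]

# Imitation learning: fit a model that predicts the expert's action
# from the state. A linear least-squares fit suffices for this toy task;
# a real system would use a neural network on video frames.
weights, *_ = np.linalg.lstsq(states, expert_actions, rcond=None)

# The learned policy now proposes actions for states it has never seen.
new_state = np.array([0.5, -0.2, 0.1, 0.3])
predicted = new_state @ weights
target = new_state[2:] - new_state[:2]
error = float(np.abs(predicted - target).max())
```

Because the toy expert’s behavior is exactly linear in the state, the fit recovers it almost perfectly; the interesting engineering is in making the same recipe work on messy, high-dimensional surgical video.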
But the process came with its own asterisk. When the system told the robot where to move, the robot often missed the spot by a few millimeters. Over months and years of use, the many metal cables inside the robot’s twin arms had stretched and bent in small ways, so its movements were not as precise as they needed to be.
Human operators could compensate for this shift, unconsciously. The automated system could not. This is a common problem with automated technology: it struggles to deal with change and uncertainty. Autonomous vehicles, for instance, are still far from widespread use because they are not yet nimble enough to handle all the chaos of the everyday world.
The Berkeley team decided to build a new neural network that analyzed the robot’s mistakes and learned how much precision it was losing with each passing day. “It learns how the robot’s joints evolve over time,” said Brijen Thananjeyan, a doctoral student on the team. Once the automated system could account for this change, the robot could grab and move the plastic rings, matching the performance of human operators.
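One simple way to picture this drift correction, purely as an illustration and assuming a roughly linear drift (the Berkeley team used a neural network for this step): command the robot to known positions, record where it actually lands, fit the systematic error, then pre-correct future commands by inverting the fit.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated wear: stretched cables make the arm land at a scaled,
# shifted version of the commanded position, plus a little noise.
true_scale = np.array([0.97, 1.02])
true_offset = np.array([0.8, -0.5])  # millimeters

def worn_robot(command):
    return command * true_scale + true_offset + rng.normal(scale=0.02, size=2)

# Step 1: collect calibration pairs (commanded position, observed position).
commands = rng.uniform(-50, 50, size=(300, 2))
observed = np.array([worn_robot(c) for c in commands])

# Step 2: fit the systematic error with least squares.
A = np.hstack([commands, np.ones((len(commands), 1))])
params, *_ = np.linalg.lstsq(A, observed, rcond=None)

def corrected_command(target):
    # Invert the fitted model so the worn robot lands on the target.
    scale = np.array([params[0, 0], params[1, 1]])
    offset = params[2]
    return (target - offset) / scale

target = np.array([10.0, 20.0])
landed = worn_robot(corrected_command(target))
miss = float(np.abs(landed - target).max())
```

With the correction in place, the simulated arm lands within a fraction of a millimeter of the target instead of missing by the full drift; the appeal of a learned model over a fixed fit is that it can keep adapting as the drift changes.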
Other labs are taking different approaches. Axel Krieger, a Johns Hopkins researcher who was part of the 2016 pig-suturing project, is working to automate a new kind of robotic arm with fewer moving parts that can operate more accurately. Researchers at the Worcester Polytechnic Institute are developing ways for machines to carefully guide surgeons’ hands as they perform particular tasks, such as inserting a needle for a cancer biopsy or burning into the brain to remove a tumor.
“It’s like a car where the lane keeping is autonomous, but you still control the gas and brake,” said Greg Fischer, one of the Worcester researchers.
Scientists noted that many obstacles lie ahead. Moving plastic rings between pegs is one thing; cutting, moving and suturing flesh is another. “What happens when the camera angle changes?” said Ann Majewicz Fey, an associate professor at the University of Texas at Austin.
For the foreseeable future, automation will be something that works alongside surgeons rather than replacing them. But even that could have profound effects. For example, doctors could perform surgery across distances far greater than the width of an operating room, helping wounded soldiers on distant battlefields from miles away or more.
Today, signal delays are too great to make that possible. But if a robot could handle some tasks on its own, long-distance surgery could become viable. “You could send over a high-level plan and the robot could execute it,” Dr. Fer said.
The same technology would be essential to surgery across even longer distances. “When we start operating on people on the moon,” he said, “surgeons will need entirely new tools.”