News — NIBIB-funded scientists and engineers are teaming up with neurosurgeons to develop technologies that enable less invasive, image-guided removal of hard-to-reach brain tumors. Their technologies combine novel imaging techniques that allow surgeons to see deep within the brain during surgery with robotic systems that enhance the precision of tissue removal.
A robot that worms its way in
The median survival for patients with glioblastoma, a high-grade primary brain cancer, is less than two years [1]. One factor contributing to this poor prognosis is that many deep-seated and pervasive tumors are not entirely accessible, or even visible, with current neurosurgical tools and imaging techniques.
But several years ago, J. Marc Simard, M.D., a professor of neurosurgery at the University of Maryland School of Medicine in Baltimore (UMB), had an insight that he hoped might address this problem. At the time, he had been watching a TV show in which plastic surgeons were using sterile maggots to remove damaged or dead tissue from a patient.
“Here you had a natural system that recognized bad from good and good from bad,” said Simard. “In other words, the maggots removed all the bad stuff and left all the good stuff alone and they’re really small. I thought, if you had something equivalent to that to remove a brain tumor that would be an absolute home run.”
And so Simard teamed up with Rao Gullapalli, Ph.D., a professor of diagnostic radiology and nuclear medicine, also at UMB, and Jaydev Desai, Ph.D., a professor of mechanical engineering at the University of Maryland, College Park, to develop a small neurosurgical robot that could be used to remove deep-seated brain tumors.
Within four years, the team had designed, constructed, and tested their first prototype, a finger-like device with multiple joints, allowing it to move in many directions. At the tip of the robot is an electrocautery tool, which uses electricity to heat and ultimately destroy tumors, as well as a suction tube for removing debris.
“The idea was to have a device that’s small but that can do all the work a surgeon normally does,” said Simard. “You could place this small robotic device inside a tumor and have it work its way around from within, removing pieces of diseased tissue.”
A key feature of the team’s device is that it can be used while the patient is undergoing MRI. By replacing direct vision with continuously updated MR images, the surgeon can visualize deep-seated tumors and monitor the robot’s movement without having to create a large incision in the brain.
In addition to reducing incision size, Simard says the ability to view the brain under continuous MRI also helps surgeons keep track of tumor boundaries throughout an operation. “When we’re operating in a conventional way, we get an MRI on a patient before we do the surgery, and we use landmarks that can either be affixed to the scalp or are part of the skull to know where we are within the patient’s brain. But when the surgeon gets in there and starts to remove the tumor, the tissues shift around so that now the boundaries that were well-established when everything was in place don’t exist anymore, and you’re confronted once again with having to distinguish normal brain from tumor. This is very difficult for a surgeon using direct vision, but with MRI, the ability to discriminate tumor from non-tumor is much more powerful.”
Steve Krosnick, M.D., a program director at NIBIB, says real-time MRI guidance during brain tumor surgery would be a tremendous advantage. “Unlike pre-operative MRI or intermittent MRI, which requires interruption of the surgical procedure, real-time intra-operative MRI offers rapid delineation of normal tissue from tumor while accounting for brain shifts that occur during surgery.”
But designing a neurosurgical device that can be used inside an MRI magnet is no easy task. One of the first issues to consider, said Gullapalli, is the surgeon’s access to the brain. “When you scan a person’s brain during an MRI, he’s deep inside the machine’s tunnel. The problem is, how do you get your hands on the brain while the patient’s in the scanner?”
The team’s solution was to give the surgeon robotic control of the device, eliminating the need to reach the brain directly by hand. In other words, the surgeon can insert the robot into the brain while the patient is outside of the scanner. Then, once the patient moves into the scanner, the surgeon can sit in a different room and, while watching MRI images of the brain on a monitor, steer the robot deep inside the brain and direct it to electrocauterize and aspirate the tumor tissue.
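To make that workflow concrete, here is a minimal, hypothetical sketch of a teleoperation loop of this kind: commands issued from outside the scanner room are relayed to the robot one at a time, while in a real system MRI frames would stream back to the surgeon’s display in parallel. The class and method names are invented for illustration and do not describe the team’s actual software.

```python
# Hypothetical teleoperation sketch; names and timing are placeholders.
import queue
import time

class MockRobot:
    """Stand-in for the MRI-compatible robot's command interface."""
    def move_tip(self, dx_mm, dy_mm, dz_mm):
        print(f"robot: moving tip by ({dx_mm}, {dy_mm}, {dz_mm}) mm")
    def cauterize(self, duration_s):
        print(f"robot: cauterizing for {duration_s} s")
    def aspirate(self, duration_s):
        print(f"robot: aspirating debris for {duration_s} s")

def control_loop(robot, commands):
    """Consume surgeon commands until the queue is empty."""
    while True:
        try:
            action, args = commands.get(timeout=0.5)
        except queue.Empty:
            break
        getattr(robot, action)(*args)
        time.sleep(0.1)  # placeholder for waiting on the next MRI frame

if __name__ == "__main__":
    cmds = queue.Queue()
    cmds.put(("move_tip", (1.0, 0.0, 0.5)))
    cmds.put(("cauterize", (2.0,)))
    cmds.put(("aspirate", (1.5,)))
    control_loop(MockRobot(), cmds)
```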
Jaydev Desai, the team’s mechanical engineer, says the most challenging aspect of the project has been designing a robot that can be controlled inside the magnetic field of an MRI scanner. Robots are often driven by electromagnetic motors, but these were not an option: besides containing magnetic components, such motors create significant image distortion, making it impossible for the surgeon to perform the task. Other potential mechanisms, such as hydraulic systems, were off the table due to concerns about fluid leakage.
Instead, Desai decided to use shape memory alloy (SMA), a material that changes shape in response to changes in temperature, to control the robot’s movement. The most recent prototype, developed by Desai and his team at the Robotics, Automation, and Medical Systems (RAMS) laboratory at the University of Maryland, College Park, uses a system of cables, pulleys, and SMA springs. This cable-and-pulley system is an improvement over the previous prototype, which caused some image distortion.
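As a rough illustration of how temperature-driven SMA actuation can move a joint, the short sketch below simulates a single cable-and-pulley joint pulled by an SMA spring whose contraction depends on how much it is heated. All constants, the linearized kinematics, and the simple proportional controller are made-up placeholders, not parameters of the team’s robot.

```python
# Illustrative only: a toy model of one SMA-driven joint, not the real controller.
DT = 0.01            # simulation time step, seconds
TAU_THERMAL = 2.0    # first-order cooling time constant of the SMA spring (s)
K_ANGLE = 1.5        # degrees of joint rotation per degree C above ambient (assumed)
AMBIENT_C = 22.0     # ambient temperature (placeholder)
KP = 0.8             # proportional gain mapping angle error -> heating power

def simulate_joint(target_deg, duration_s=10.0):
    """Drive the joint toward target_deg and return its angle history."""
    temp_c = AMBIENT_C
    history = []
    for _ in range(int(duration_s / DT)):
        angle_deg = K_ANGLE * (temp_c - AMBIENT_C)   # cable/pulley kinematics, linearized
        error = target_deg - angle_deg
        power = max(0.0, KP * error)                 # heat only; cooling is passive
        # first-order heating/cooling of the SMA spring
        temp_c += DT * (power - (temp_c - AMBIENT_C) / TAU_THERMAL)
        history.append(angle_deg)
    return history

if __name__ == "__main__":
    angles = simulate_joint(target_deg=30.0)
    print(f"final joint angle: {angles[-1]:.1f} degrees")
```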
With continued support from NIBIB, Desai and colleagues are now working to further reduce image distortion and to test the safety and efficacy of their device in swine as well as in human cadavers. Though it will be several years before their device finds its way into the operating room, Simard is excited by the prospect. “Advancing brain surgery to this level where tiny machines or robots could navigate inside people’s heads while being directed by neurosurgeons with the help of MRI imaging…It’s beyond anything that most people dream of.”
Scoping the brain
On the opposite side of the country, a different group of engineers and neurosurgeons is also working to develop an image-guided, robotically controlled neurosurgical tool. Led by Eric Seibel, Ph.D., a professor of mechanical engineering at the University of Washington, the team is attempting to adapt a scanning fiber endoscope, a tool Seibel initially developed to image inside the narrow bile ducts of the liver, so that it can be used to visualize the brain during surgery.
An endoscope is a thin, tube-like instrument with a video camera attached to its end that can be inserted through a small incision or natural opening in the body to produce real-time video during surgery. Endoscopes are an essential component of minimally invasive surgeries because they allow surgeons to view the inside of the body on a monitor without having to make a large incision.
However, many parts of the body, such as small vessels and ducts as well as areas deep in the brain, are inaccessible to conventional endoscopes. Although ultrathin endoscopes have recently been developed, Seibel says these smaller scopes come at the price of greatly reduced image resolution.
“Right now, with the current state of the art ultrathin endoscopes, I calculate based on the field of view and their resolution that the person looking at that display would see so little as to be classified in the US as legally blind,” said Seibel.
But with support from NIBIB beginning over ten years ago, Seibel set out to build a new type of endoscope that could fit into tiny crevices in the body while retaining high image quality. The result was a scope that, despite having the diameter of a toothpick, can provide doctors with microscopic views of the inside of the body.
Seibel retained image quality while significantly reducing the size of his scope by eschewing the traditional endoscope design. Instead of a light source and a video camera, Seibel’s scope contains a single optical fiber, approximately the width of a human hair, running down its center. When vibrated at a particular frequency, the fiber emits white laser light (a combination of red, green, and blue lasers). By directing the laser light through a series of lenses in the scope, the beam can be swept widely within the body, providing a 100-degree field of view. As the white laser light interacts with tissue, it picks up coloration and scatters back to a ring of additional optical fibers, which relay this information to a monitor.
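A simplified way to picture how a single vibrating fiber builds up a full image is sketched below: the fiber tip sweeps an expanding spiral, and each returned light sample is binned back into the pixel the spiral was pointing at. The scan parameters and the toy “scene” are invented for illustration and are not specifications of Seibel’s endoscope.

```python
# Illustrative spiral-scan reconstruction; parameters are arbitrary placeholders.
import numpy as np

def spiral_scan(n_samples=200_000, n_turns=250):
    """Return normalized (x, y) positions of an expanding spiral scan."""
    t = np.linspace(0.0, 1.0, n_samples)
    radius = t                              # scan amplitude ramps up over time
    theta = 2.0 * np.pi * n_turns * t
    return radius * np.cos(theta), radius * np.sin(theta)

def reconstruct(scene, image_size=250):
    """Sample a known 'scene' along the spiral and rebin samples into an image."""
    x, y = spiral_scan()
    cols = ((x + 1.0) / 2.0 * (image_size - 1)).astype(int)
    rows = ((y + 1.0) / 2.0 * (image_size - 1)).astype(int)
    samples = scene(x, y)                   # stand-in for backscattered light intensity
    image = np.zeros((image_size, image_size))
    counts = np.zeros_like(image)
    np.add.at(image, (rows, cols), samples)
    np.add.at(counts, (rows, cols), 1.0)
    return image / np.maximum(counts, 1.0)  # average repeated hits per pixel

if __name__ == "__main__":
    # toy scene: a bright disk on a dark background
    disk = lambda x, y: (x**2 + y**2 < 0.25).astype(float)
    img = reconstruct(disk)
    print("reconstructed image shape:", img.shape, "max:", img.max())
```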
“It’s almost like putting your eyes inside the body so you can see with the wide field view of your human vision,” said Seibel.
In collaboration with three neurosurgeons and an electrical engineer, Seibel is now working to secure his novel endoscope to the tip of a robotically controlled micro-dissection neurosurgical tool.
Unlike larger traditional endoscopes, Seibel says, his scanning fiber endoscope is barely noticeable to the surgeon.
“It’s like a piece of wet spaghetti,” said Seibel. “It’s even smaller than a piece of wet spaghetti in diameter, but it feels like that. So when it is actually at the tip of the surgeon’s tool, the surgeon wouldn’t feel it dragging behind her.”
One advantage of having the endoscope under robotic control is that the brain can be imaged at a higher magnification.
“A surgeon couldn’t hold a microscope steady in her hand while performing surgery, but the robot can,” said Seibel. Microscopic detail is essential when trying to determine the border between healthy tissue—which if removed could lead to neurological deficits—and cancerous tissue—which if left in the brain could allow a tumor to return.
Krosnick says he’s excited by the combination of high-quality imaging and robotically enabled micro-neurosurgery. “It addresses a critical need, which is to discern tumor margins at high resolution while minimizing disruption to normal structures.”
Seibel believes this discrimination between cancerous and healthy tissue could be enhanced even further by taking advantage of the fact that his scanning endoscope can also detect fluorescence. One of the main focuses of his current research is a collaboration with Jim Olson, M.D., Ph.D., of the Fred Hutchinson Cancer Research Center, the inventor of a substance called “tumor paint.”
Tumor paint is a fluorescent probe that attaches to cancerous but not healthy cells when injected into the body. Seibel says the ultimate goal would be to give a patient an injection of tumor paint and then use his endoscope to create an image of the fluorescing cancer cells as well as a colored anatomic image of the brain. The two images could then be merged on a screen for the surgeon to view during an operation.
“You would be able to see all the structure that a surgeon would see, but you’d also see those molecular pinpoints of light that are cancer cells...and from there the robot can be used to resect, or remove, these small cells of cancer, and it can do it very precisely because you don’t have the shaking of a human holding it.”
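The kind of image fusion Seibel describes can be pictured with a short sketch: a fluorescence map is overlaid on a color anatomic frame so that pixels above a threshold appear as bright, tinted pinpoints. The arrays below are synthetic stand-ins, not data from the device, and the function is a generic overlay, not the team’s processing pipeline.

```python
# Illustrative fluorescence/anatomy overlay on synthetic data.
import numpy as np

def fuse(anatomic_rgb, fluorescence, threshold=0.5, tint=(0.0, 1.0, 0.0)):
    """Blend a normalized fluorescence map into an RGB image wherever it exceeds a threshold."""
    mask = fluorescence > threshold
    alpha = np.clip(fluorescence, 0.0, 1.0)[..., None]   # per-pixel blend weight
    tint_img = np.ones_like(anatomic_rgb) * np.array(tint)
    return np.where(mask[..., None],
                    (1.0 - alpha) * anatomic_rgb + alpha * tint_img,
                    anatomic_rgb)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    anatomy = rng.uniform(0.2, 0.8, size=(64, 64, 3))    # stand-in anatomic frame
    fluor = np.zeros((64, 64))
    fluor[30:34, 40:44] = 0.9                            # a small fluorescing focus
    overlay = fuse(anatomy, fluor)
    print("fused frame shape:", overlay.shape)
```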
Seibel concluded by saying, “There’s a real niche for video-quality, high-resolution, multi-modal imaging that’s in a tiny package so that it can be put on microscopic tools for minimally invasive medicine. I really feel it’s an enabling technology that could move the whole field forward.”
Krosnick is enthusiastic about the progress the two teams have made so far. “These are innovative technologies that, if effective, could significantly add to the brain surgery armamentarium. They’re still early in development, but I think both show considerable promise.” He concluded by emphasizing that, like all new devices, these technologies would need to undergo a series of clinical trials to ensure that they are safe and effective before making their way into an operating room.
References
[1] Hart MG, Garside R, Rogers G, Stein K, Grant R. Temozolomide for high grade glioma. Cochrane Database Syst Rev. 2013 Apr 30;4:CD007415. doi: 10.1002/14651858.CD007415.pub2.
CITATIONS
R01EB016457; R01EB015870