Artificial Intelligence Can Help Doctors Diagnose Brain Tumors During Surgery. Here’s How.

A new laser-imaging technology paired with artificial intelligence could significantly advance what neurosurgeons can do for patients with brain tumors, improving diagnostic accuracy and cutting testing turnaround time from 30 minutes or more to less than three minutes.

By Chris Hayhurst, Contributor

A new imaging technique and artificial intelligence (AI) are poised to significantly advance what neurosurgeons can do to help patients with brain tumors while they’re on the operating room table.

Brain tumor surgery is, traditionally, a high-stakes and time-critical procedure. During a standard operation, the physician removes small samples of brain tissue and immediately sends them to the lab to be frozen, stained, and analyzed. The results, when they’re returned, are used to make decisions—whether, for example, more resection might be necessary or chemoradiation can control the tumor instead. The challenge? The work all unfolds with the patient under anesthesia, so speed and accuracy are paramount, but the biopsy typically takes 30 minutes or longer, and establishing a definitive diagnosis can be difficult.

The new approach, described in Nature Medicine, automates the tumor-analysis process to not only improve diagnostic accuracy but also cut turnaround time to less than three minutes. The technique relies on a laser-imaging technology called stimulated Raman histology (SRH). Developed by the Santa Clara, California-based company Invenio Imaging, SRH adds color to the proteins and DNA in tissue samples to produce a digital image visible to the human eye, explains Invenio spokesperson Fernando Corredor.

“Two lasers excite the particles in the sample and their unique vibrational properties are what generate the image,” he says. The technology sits on a wheeled platform and looks like a trash bin topped by a computer monitor. Samples are placed in a slot in the machine, and once the image is produced, it’s interpreted by a set of AI algorithms that can recognize cell features indicative of disease. “The whole processing of the image is done at the patient’s bedside,” Corredor says.
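To give a rough sense of what that bedside interpretation step involves, here is a minimal sketch in Python: a trained classifier scores small patches of the SRH image and the patch-level results are averaged into a single prediction for the sample. The patch size, the averaging scheme, and the model itself are illustrative assumptions for this sketch, not details taken from Invenio's system or the study.

    # A minimal sketch of bedside inference, assuming a trained patch-level
    # CNN classifier ("model") and an SRH mosaic loaded as a NumPy array.
    # Patch size and probability averaging are illustrative assumptions.
    import numpy as np
    import torch

    PATCH = 300          # assumed patch edge length in pixels
    NUM_CLASSES = 13     # diagnostic categories reported in the study

    def predict_slide(model, slide: np.ndarray) -> int:
        """Tile an SRH image (H x W x 3) into patches and return the
        index of the most probable diagnostic class for the whole sample."""
        model.eval()
        probs = []
        h, w, _ = slide.shape
        with torch.no_grad():
            for y in range(0, h - PATCH + 1, PATCH):
                for x in range(0, w - PATCH + 1, PATCH):
                    patch = slide[y:y + PATCH, x:x + PATCH]
                    # convert to a 1 x 3 x H x W float tensor in [0, 1]
                    t = torch.from_numpy(patch).float().permute(2, 0, 1) / 255.0
                    logits = model(t.unsqueeze(0))
                    probs.append(torch.softmax(logits, dim=1))
        # average patch-level probabilities into one sample-level prediction
        return int(torch.cat(probs).mean(dim=0).argmax())

Because every step runs on the machine next to the patient, there is no round trip to a pathology lab, which is what makes the sub-three-minute turnaround possible.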

For the Nature Medicine study, led by Daniel A. Orringer, MD, a neurosurgeon at NYU Langone Health, and by Todd Hollon, MD, chief neurosurgery resident at the University of Michigan, the researchers turned to a type of machine learning called deep convolutional neural networks (CNNs). Using more than 2.5 million previously labeled SRH images from tissue samples provided by 415 patients, the team trained its CNNs to learn the histologic features of 13 categories of brain tumors.
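For readers curious what "training a CNN on labeled SRH patches" looks like in practice, the sketch below shows the general pattern: a standard image-classification network with a 13-class output layer, fit to labeled patches with cross-entropy loss. The backbone (an off-the-shelf ResNet-18) and the hyperparameters here are stand-ins chosen for brevity; the study's own architecture and training procedure are described in the paper.

    # A minimal sketch of training a 13-class SRH patch classifier,
    # assuming a labeled dataset of (patch, label) pairs. Architecture
    # and hyperparameters are illustrative stand-ins, not the study's.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import models

    def train(dataset, epochs: int = 10, lr: float = 1e-4):
        model = models.resnet18(weights=None)            # train from scratch
        model.fc = nn.Linear(model.fc.in_features, 13)   # 13 diagnostic classes
        loader = DataLoader(dataset, batch_size=64, shuffle=True)
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        model.train()
        for _ in range(epochs):
            for patches, labels in loader:               # patches: N x 3 x H x W
                opt.zero_grad()
                loss = loss_fn(model(patches), labels)
                loss.backward()
                opt.step()
        return model

Once trained on millions of such patches, a model like this can score new tissue samples in seconds, which is the capability the bedside system relies on.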

“This technology allows us to see what would otherwise be invisible,” Orringer explained shortly after the study was published. That’s important, he added, because the insight such a system offers could potentially be applied to surgery in near real time: When a tumor has spread, the physician would know exactly to what extent, and therefore could attempt to remove it in its entirety on the spot. (The study itself included brain tissue from 278 patients at three university medical centers, and demonstrated that SRH with AI could diagnose brain tumors with 94.6 percent accuracy.)

Implications for Patient Care

Compared to many other forms of abnormal cell growth, brain tumors are relatively uncommon. According to the American Brain Tumor Association, there are around 87,000 new cases of benign and malignant brain tumors diagnosed in the United States each year, with most diagnoses occurring among adults. (By comparison, in 2019, there were about 175,000 new cases of prostate cancer, 228,000 cases of lung cancer, and 271,000 cases of breast cancer.) Still, an estimated 16,000 people nationwide are projected to die from brain cancer in 2020. Surgery, when it's an option, is the usual treatment for the majority of brain tumors, benign and malignant alike.

“To get those critical details in a very rapid way—and for the diagnosis to be as accurate as you’d expect from a pathologist—is going to benefit the patient.”

—William Cance, MD, chief medical and scientific officer, American Cancer Society

What makes the approach described by Orringer and his colleagues so intriguing are its implications for improving patient care in a wide variety of scenarios, says William Cance, MD, chief medical and scientific officer at the American Cancer Society. A surgical oncologist himself, Cance knows what it’s like to rely on frozen sections and to wait for lab results to finish a procedure. “To get those critical details in a very rapid way—and for the diagnosis to be as accurate as you’d expect from a pathologist—is going to benefit the patient,” he predicts.

The study, Cance adds, demonstrates AI's potential to revolutionize cancer diagnosis and treatment. “I believe this approach will ultimately provide a way to see how changes to certain cells are associated with individual genetic abnormalities.” Applied to surgery, he says, SRH with AI should help speed up procedures, thereby leading to significant cost savings and less-invasive surgical techniques.

“If I can know within two or three minutes whether I have a clean margin,” or no cancer cells at the outer edge of the removed tissue, “I can then potentially do less surgery as opposed to taking a wider margin [to be safe],” he explains.

Furthermore, Cance continues, the new technique, which has yet to be approved by the Food and Drug Administration, may prove useful not only in neurosurgery but also in surgical procedures for other types of cancer. “There are so many possibilities when you add machine learning and AI to the tools physicians already have at their disposal,” he says. “As a high-tech imaging approach, I think it has tremendous opportunity.”