The Deep Learning Algorithm That’s Transforming Radiology

Earlier this year, a Stanford University study found that a deep learning algorithm called CheXNet could read images and diagnose pneumonia as accurately as radiologists. To arrive at a reliable diagnosis, CheXNet first absorbed knowledge from 112,000 existing chest x-rays of pneumonia patients, assessing patterns in patient records and diagnoses.
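The learning loop behind a system like CheXNet is supervised training: show the model many labeled images, adjust its parameters to reduce its mistakes, then ask it for a probability on a new image. The sketch below is a deliberately simplified stand-in, not CheXNet itself (which is a deep convolutional network); it uses tiny synthetic "x-rays" and plain logistic regression, and every name and data pattern in it is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_image(has_pneumonia):
    """Synthetic 8x8 'x-ray': positive cases get a brighter patch (hypothetical pattern)."""
    img = rng.normal(0.0, 1.0, (8, 8))
    if has_pneumonia:
        img[2:5, 2:5] += 2.0
    return img.ravel()

# Labeled training set, standing in for the 112,000 annotated chest x-rays.
labels = rng.integers(0, 2, 400)
X = np.stack([make_image(y) for y in labels])

# Logistic regression trained by gradient descent: the 'learning' step.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(300):
    z = np.clip(X @ w + b, -30, 30)       # clip logits for numerical stability
    p = 1.0 / (1.0 + np.exp(-z))          # predicted probability per image
    w -= 0.5 * (X.T @ (p - labels)) / len(labels)
    b -= 0.5 * np.mean(p - labels)

def predict(img_flat):
    """Estimated probability that an image shows the target pattern."""
    z = np.clip(img_flat @ w + b, -30, 30)
    return 1.0 / (1.0 + np.exp(-z))

accuracy = np.mean((predict(X) > 0.5) == labels)
```

A real radiology model replaces the hand-made patch with patterns learned from clinical data, and the linear classifier with many convolutional layers, but the train-then-predict structure is the same.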

A few technologists at MIT – not radiologists – promptly declared the impending death of the radiologist profession. But cutting-edge radiologists say their profession is not dead or dying. Instead, they are looking forward to the advances promised by artificial intelligence, where machine learning from hundreds of thousands of anonymous patient records will bring accurate prediction to the practice of medicine – and to their patients’ health.

“Radiology is the research and development engine that’s driving all medicine,” Dr. Alex Norbash, a seasoned neuro-radiologist and professor at the University of California, San Diego, said. “We are going to be able to diagnose with incredible accuracy by interacting with massive amounts of data instead of one study at a time.”

This precision could help doctors deliver faster, more accurate diagnoses and better tailor treatment and recovery plans to what will work best for each individual patient.

“We are going to be able to diagnose with incredible accuracy by interacting with massive amounts of data instead of one study at a time.”
— Dr. Alex Norbash, University of California, San Diego

A New Wave of Analysis

Radiology has always been equipment driven. Incredibly expensive imaging equipment—ranging from x-ray and ultrasound hardware to advanced magnetic resonance imaging that goes beyond three dimensions—has dominated the field. Yet, these same devices also become obsolete as soon as the next iteration of the machine is on the market.

This is all about to change with the next wave of advances coming from computers and the cloud.

“Pattern recognition is something radiologists do very well,” says Dr. Kris Kandarpa, the acting director of Applied Science and Technology at the National Institute of Biomedical Imaging and Bioengineering, a federal medical research institute that translates theory into medical tools. “What artificial intelligence is doing is helping us focus more on the patient by helping us rule some things out early and recognize things we may not have been looking for.”

For example, Kandarpa says sometimes doctors looking for evidence of lymphoma focus on looking for the indicators for the disease. “But a machine educated in pattern recognition can look at all the patterns, even patterns radiologists don’t normally look at. It will ask, ‘did you know that such and such a thing happens in these patients as well?’ Many of our patients have co-morbidities, other illnesses we need to pay attention to,” he says.

According to Kandarpa, AI won’t just transform the interpretation end of the imaging process. It will also help acquire the image quickly and inexpensively.

“What artificial intelligence is doing is helping us focus more on the patient by helping us rule some things out early and recognize things we may not have been looking for.”
— Dr. Kris Kandarpa, acting director of Applied Science and Technology, National Institute of Biomedical Imaging and Bioengineering

One of the bioimaging institute’s grants went to a researcher who is developing a lower-resolution MRI that, linked to a deep learning system, can predict what a higher-resolution MRI would see. This accurate prediction, if realized, will cut the time needed for very detailed MRIs, enabling four times as many patients to be imaged by one machine.

“MRIs will get cheaper when there’s technology that can do 3D imaging that predicts what a 7D image from a million-dollar machine will be,” Kandarpa said.
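The core idea in that grant can be sketched simply: learn, from pairs of cheap and expensive scans, a mapping that predicts the detailed image from the coarse one. The toy below uses synthetic "anatomy," a crude block-averaging model of a faster scan, and a plain least-squares linear map in place of the deep network; all sizes, names, and data are hypothetical, and a real system would be far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(1)
HI, LO = 16, 4  # hypothetical high-res and low-res grid sizes

def downsample(hi_img):
    """Average 4x4 blocks: a crude stand-in for a faster, coarser acquisition."""
    return hi_img.reshape(LO, HI // LO, LO, HI // LO).mean(axis=(1, 3))

def make_hi():
    """Synthetic 'anatomy': a smooth Gaussian blob at a random position and size."""
    y, x = np.mgrid[0:HI, 0:HI]
    cx, cy, r = rng.uniform(4, 12, 3)
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * r**2))

# Paired training data: each detailed scan with its coarse counterpart.
his = np.stack([make_hi().ravel() for _ in range(200)])
los = np.stack([downsample(h.reshape(HI, HI)).ravel() for h in his])

# Least-squares linear map low-res -> high-res (deep nets learn this nonlinearly).
W, *_ = np.linalg.lstsq(los, his, rcond=None)

# Evaluate on a held-out scan: predict the detailed image from its coarse scan.
test_hi = make_hi()
pred = downsample(test_hi).ravel() @ W
error = np.mean((pred - test_hi.ravel()) ** 2)

# Baseline: how wrong you'd be just guessing the average training image.
baseline = np.mean((his.mean(axis=0) - test_hi.ravel()) ** 2)
```

The point of the comparison against the baseline is the same as in the clinical setting: the learned mapping recovers detail the cheap scan alone does not show, which is what would let one machine image more patients in the same time.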

Big Data, Precise Diagnosis

For Norbash, harnessing the big data created by AI in imaging will change the precision of diagnostics altogether. “I think quantitative imaging will transform radiology,” he says. “We’re starting to measure stuff in a very accurate way.”

Norbash says pairing existing imaging with deep learning (like CheXNet) will make it possible to share information that’s far more useful to patients and doctors. “What we say now is that [the results] appear normal,” he explained. “Soon, we’ll be able to measure the [degree of] health of what we’re looking at.”

Patients, for example, will get diagnoses that include greater detail about a muscle or ligament injury and what the normal course of recovery will be, based on very precise imaging interpretation melded with deep learning from thousands – if not millions – of similar injuries in other patients.

As doctors obtain more accurate measurements, such as the volume of vital organs like the lungs, heart and brain, they’ll be able to measure changes to these organs—and whether they’re caused by illnesses like emphysema or brain tumors, where unhealthy tissue displaces healthy tissue. The ability to measure and compare change also means that doctors will be able to gauge the effectiveness of treatments, such as how well an approach shrinks a tumor or at least halts its growth.

Norbash says, “Using an MRI, we can see blood flow to the heart, and we can see it over time. We will not only see it, we’ll be able to measure it to get an accurate representation of how hard the heart is pumping.”

All of that means more personalized care, and sooner.

“Think of what that means for our patients: faster and more thorough diagnoses with much more information about what their bodies are doing and what they can expect and what they can do about it,” Norbash says.

Moreover, the predictive power of examining hundreds of thousands of patient medical histories will elevate preventive medicine into a precise and useful tool that goes far beyond annual exams and tests.

“With big data we will have incredible fire power, and it is possible that we will have a predictive ability based on a massive population of patients,” Norbash said. “We may have the predictive ability to foresee a failure moment without that failure actually happening. And we’ll have a lot fewer conversations that go: I think it might be this.”

Improving Patient Care

Advances in technology, and in the ability to share information instantaneously, are breaking down silos in medical practice, much as the arrival of interventional radiology did; cardiologists now work in close proximity with radiologists. For example, doctors use live images to slide stents through veins to the heart, where surgeons once cracked open chests.

“We may have the predictive ability to foresee a failure moment without that failure actually happening.”
— Dr. Alex Norbash

“Now, we’re replacing heart valves without cutting patients,” Kandarpa said. “This is what imaging has allowed. It has not only made diagnosis more precise, it has moved surgical procedures from very invasive to one where people can walk away the next day.”

Norbash says the ability to scale up to the universe of information will transform his profession, granting doctors like him not only the time to provide better patient care, but also the time to conduct more analytical medical research.

“I can envision how data science and automated detection and recognition are going to allow us early on to recognize which radiological findings are normal so I can spend more time looking at abnormal MRIs,” Norbash said. “I’ll be able to study abnormalities exhaustively.”

Automated recognition will also reduce the routine work and free up radiologists to be able to see more patients.

“In our country we have disparities of care where certain groups have higher stroke, heart attack and cancer rates and need better health care,” Norbash lamented. “These advances will cause a redistribution of radiology efforts because now I can focus on the patients who I can truly help.”