1.4 – Doctor AI: Medical Exam

Host Jessica Chobot thinks she’s healthy. But will her check-in with AI give her a clean bill of health? The doctor is in on this episode of HvR.

If you’ve been to a medical facility recently, you may have already had some of your healthcare provided by AI. Massive amounts of raw medical data – images, genomic sequences, public health statistics – can now, with computing power up to the task, be harnessed in novel, fascinating ways: diagnosing life-threatening diseases, plus assessing the risk of disease and pregnancy complications. That’s just the start. Join host Jessica Chobot as she checks in for a checkup and is joined on her wellness journey by Dave Graham of Dell Technologies, Eran Orr, Founder & CEO of AR/VR health startup XRHealth, Dekel Gelbman of Boston biotech firm FDNA, and Joe Marks, Executive Director of the Center for Machine Learning and Health at Carnegie Mellon University. Together, they’ll answer: Can fully autonomous AI replace doctors, nurses, clinicians and surgeons? Find out in this episode of Hype vs. Reality.

What you’ll hear in this episode:

  • Can life-threatening illnesses be diagnosed faster and more accurately with AI?
  • Will AI replace human doctors?
  • Image-based health assessments made easier.
  • Algorithms that can diagnose rare conditions just by looking at your face.
  • The enormous potential for data-driven precision medicine.
  • How robots could help the sick and the elderly.
  • Can we conquer our fear of interacting with inanimate objects?
  • The paramount importance of preserving genetic and medical data privacy.
  • Jessica punches (virtual) things.

Guest List

  • Dave Graham is the director of emerging technologies messaging at Dell Technologies and specializes in AI and social transformation.
  • Joe Marks is the Executive Director of the Center for Machine Learning and Health at Carnegie Mellon University.
  • Dekel Gelbman is the Chief Executive Officer at the artificial intelligence-based company FDNA, Inc. FDNA’s artificial intelligence system DeepGestalt is a facial analysis framework that diagnoses rare disorders.
  • Geoffrey Hinton is an engineering fellow at Google, an emeritus professor at the University of Toronto and the chief scientific adviser of the Vector Institute, which researches machine learning. Hinton is considered a pioneer in the branch of machine learning referred to as deep learning.
  • Eran Orr is the founder and CEO of XRHealth USA. He holds a BA in Business Management, Government, and Politics, and an MBA in Entrepreneurship and Innovation from Ben Gurion University.

Transcript

Jessica Chobot: I’m Jessica Chobot and this is AI: Hype vs. Reality, an original podcast from Dell Technologies.

Jessica Chobot: I’m about to step into a virtual world to test my reflexes, memory and range of motion. All of that will be analyzed by an AI to find out if I’m as fit and healthy as I think I am, or if there’s anything that I should be worried about.

Eran Orr: My name is Eran. I am the founder and CEO of XRHealth and we are developing a medical AR/VR platform.

Jessica Chobot: So, tell me a little bit about what your AI focuses on.

Eran Orr: We’re taking all the data from the VR/AR systems. We can analyze the raw data and generate insights. Now, by adding another layer of data – context: what’s that patient’s injury, what’s their status, how they’re performing in real time – you can create an AI that actually learns, sees how the patient performs, and reacts in real time.

Jessica Chobot: Awesome. Well, can we go do it right now?

Eran Orr: Yeah, let’s go.

Jessica Chobot: So, I’m touching all these little glowy balls that–oh, my gloves changed color!

Eran Orr: Exactly.

Jessica Chobot: Oh, that’s not fair. I didn’t even see that.

Eran Orr: You need to pay attention.

Jessica Chobot: I am paying attention.

Jessica Chobot: Okay, before we find out what happened in that virtual world, let’s dive into the hype surrounding artificial intelligence and healthcare.

Jessica Chobot: AI will diagnose life-threatening conditions quicker and more accurately. Fully autonomous surgical systems will replace operating room doctors. Home care will be improved thanks to virtual assistants and in-home robots. All of this will happen any day now.

Jessica Chobot: To find the reality in all of that hype, I’m joined by Dave Graham. He works on emerging tech for Dell Technologies. So, Dave, let’s break down those hype headlines because there’s a lot of different stuff going on here. So, first one up, can life-threatening conditions be diagnosed faster and better with AI?

Dave Graham: The answer is, it depends. What you have is the benefit of large amounts of data backing it. Think through the course of human history: it’s able to look at every single image that’s ever been taken. It’s been trained on that, and so it makes educated guesses based on what it would take hundreds of thousands of doctors to look at, at any given point. In those cases, it’s been shown to have a higher accuracy rate in certain circumstances than the comparable human doctor. It just doesn’t replace the human doctor.

Jessica Chobot: Gotcha.

Dave Graham: We’ve heard from him before, but Geoffrey Hinton, the Godfather of Deep Learning and Neural Networks, certainly feels like we’re headed into an AI-powered future in healthcare, especially when it comes to diagnostics.

Geoffrey Hinton: They show this patch of skin to a neural network and the neural network tells you what kind of cancer it is or if it’s not cancer. The neural network is as good as a dermatologist now and it’s only been trained on 130,000 examples and with time you could easily train it on 10 million examples and then it will be better than dermatologists. That’s something that’s very easy to see how you use it because you can have an app on your cellphone. You can point it at some funny patch of skin you have and you don’t need the embarrassment of going to the doctor and saying, “Is this skin cancer?”, and the doctor laughing at you, or you don’t have the disappointment of going to the doctor and saying, “Is this skin cancer?”, and the doctor saying, “Why didn’t you come before?”
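A minimal sketch of the kind of classifier Hinton is describing – a convolutional network that maps a skin-patch photo to one of several lesion classes – might look like the following. The architecture, class labels, and data are illustrative stand-ins, not the actual dermatology model.

```python
# Minimal sketch of an image classifier for skin patches (illustrative only).
import torch
import torch.nn as nn

CLASSES = ["benign", "melanoma", "basal_cell", "squamous_cell"]  # hypothetical labels

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 56 * 56, len(CLASSES)),  # assumes 224x224 RGB input
)

# Stand-in for a batch of preprocessed skin-patch photos.
batch = torch.randn(8, 3, 224, 224)
probs = torch.softmax(model(batch), dim=1)

for p in probs:
    print(f"predicted: {CLASSES[int(p.argmax())]} (confidence {p.max().item():.2f})")
```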

Jessica Chobot: It sounds like he’s saying that this AI is actually going to be able to replace doctors.

Dave Graham: It’s augmentation, not replacement. A lot of these things, like he points out – skin cancer diagnoses, again, image-based recognition – image-based diagnostic capabilities are, I won’t say easy, but easier to integrate into medical diagnosis than other softer, more nuanced types of diagnoses. I don’t see them replacing things any time soon. I do see them, long-term, being a very valid augmentation of standard clinical diagnosis, or doctor’s-office-based diagnoses, like Geoffrey Hinton says.

Jessica Chobot: So, it sounds like that takes care of our second question, our second headline, which is: will all doctors be replaced by fully autonomous systems? And it sounds like they won’t be completely replaced. Do you have any real-world examples where AI actually has been used in healthcare to diagnose an illness specifically? Is there anything people are currently using with this kind of AI?

Dave Graham: Yeah. I found out about something called DeepGestalt. It’s a group of algorithms that analyze images of people’s faces in order to diagnose rare genetic conditions. There are certain diseases that can lead to unusual facial features, but it can be really tricky for a single doctor to diagnose them based on only those features. That’s where DeepGestalt comes in. It was built by a Boston biotech company called FDNA, and this is their CEO, Dekel Gelbman.

Dekel Gelbman: When a patient comes into the doctor with a rare disease, the question isn’t whether that child has Angelman syndrome or not, whether that child has Cornelia de Lange syndrome or not. The question is, what does that child have? So, it’s our job to go through all the different possibilities, rank them, and present them to the clinicians so that the clinician can do a better job with this unknown problem. In essence, what we’re trying to do is we’re trying to use a pretty big data set of patient photos to help the computer distinguish between different diseases based on how they manifest in the face. It comes from a basic collaboration between hundreds and even thousands of different geneticists around the world and every time they use the platform they contribute data that helps educate it and continue development of the technology. Would artificial intelligence replace doctors? We don’t think so. We think that doctors that use artificial intelligence will replace doctors that don’t.
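To make the “rank the possibilities” step concrete: a system like the one Dekel describes scores each candidate syndrome from a photo and presents the top matches to the clinician. A minimal sketch, with invented syndrome names and scores:

```python
# Rank candidate syndromes by model score and show the top matches
# to the clinician (names and scores here are invented placeholders).
SYNDROMES = ["Angelman", "Cornelia de Lange", "Noonan", "Williams"]

def rank_candidates(scores, top_k=3):
    """Pair each syndrome with its score and sort, highest first."""
    ranked = sorted(zip(SYNDROMES, scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:top_k]

# Stand-in output of a facial-analysis model (probabilities per syndrome).
scores = [0.07, 0.61, 0.22, 0.10]

for name, p in rank_candidates(scores):
    print(f"{name}: {p:.0%}")
```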

Jessica Chobot: Well, that answers our hype headline number two nicely, that instead of fully autonomous doctors replacing human doctors, it might be doctors using AI replacing doctors that don’t. That being said though, how accurate is DeepGestalt at diagnosing illnesses?

Dave Graham: According to Dekel and his team, it’s as much as 20% or more accurate. So, where human experts diagnose a certain syndrome correctly 75% of the time, DeepGestalt is at around 96%.

Jessica Chobot: So, let’s move on to our third hype headline. Will virtual assistants and robots provide home care – even better home care?

Dave Graham: I think in the future, yeah, absolutely, there is a possibility that they can be used in ways that would help the aged and infirm. For example, having a regimented schedule: this is when you need your medication. There’s the ability to place them in people’s homes and have viable technology and connectivity back to a doctor’s office – remote monitoring and things like that. There is that initial fear, that initial hurdle, of interacting with something that’s inanimate, that’s not the human being a lot of our older populations are so used to dealing with. I’m going to keep repeating “augmentation.” I think it starts with augmentation, and then it gradually could replace in-home care for certain types of populations. But, obviously, there’s always going to be a need for follow-up care, directed care, for folks that absolutely require it.

Jessica Chobot: It’s kind of like – did you ever see that 2012 movie, Robot & Frank, where he gets a robot butler as a kind of home healthcare thing and then trains it how to rob houses, because he’s actually been a cat burglar his entire life? It’s amazing. Anyway, just throwing that out there. Well, that is all very, very interesting. Thanks again, Dave. Later on in the show – I don’t know if you know about this or not – I’m getting my health tested by a virtual reality AI app. So, hopefully we don’t find out anything that has been hiding and lurking in the background of my life. Otherwise, it’s going to be a very interesting show.

Dave Graham: Good luck with that.

Jessica Chobot: But, before we go do that I’m actually going to hear from someone whose job it is to be at the center of what’s new and emerging in AI healthcare.

Joe Marks: My name is Joe Marks. I’m the Executive Director of the Center for Machine Learning and Health at Carnegie Mellon University.

Jessica Chobot: So, Joe runs a sort of seed fund within the university. They raise money from sponsors and put that money to work with CMU faculty on what they think are the best projects in digital healthcare, which means Joe has his finger on the pulse of what is being developed. For instance, personalized medicine.

Joe Marks: So, this is the idea of tailoring your treatment for a particular disease based on your health record, your previous health history, and more importantly on your genomic profile, your genes and how they’re expressed. That’s a major trend in healthcare that will be coming over the next decades.

Jessica Chobot: And where does the AI come in? It’s analyzing and learning from massive, massive amounts of data.

Joe Marks: If you get your DNA sequenced, the raw data that comes off the sequencers is about 200 gigabytes. If you then get the DNA sequence of mutating tumors, or your microbiome, you can quickly see how your genomic profile could end up being terabytes. And then you’re trying to do machine learning – essentially comparing your profile against the profiles of other people to find matches, and then seeing what treatment worked best for them. You’re doing machine learning on data sets where the individual data elements are measured in terabytes.
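The matching Joe describes boils down to a nearest-neighbor search: reduce each patient’s genomic profile to a comparable representation, find the most similar prior patients, and look at which treatments worked for them. A minimal sketch – real profiles are terabytes, so these tiny vectors are placeholders:

```python
# Find the prior patient whose (toy) genomic feature vector is most
# similar to the new patient's, then surface the treatment that worked.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

new_patient = np.array([0.9, 0.1, 0.4, 0.7])

# Hypothetical prior patients: (feature vector, treatment that worked).
cohort = [
    (np.array([0.8, 0.2, 0.5, 0.6]), "treatment A"),
    (np.array([0.1, 0.9, 0.3, 0.2]), "treatment B"),
    (np.array([0.7, 0.1, 0.6, 0.8]), "treatment C"),
]

best_vector, best_treatment = max(cohort, key=lambda entry: cosine(new_patient, entry[0]))
print("closest match responded to:", best_treatment)
```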

Jessica Chobot: Another project Joe’s group has invested in is MyHealthyPregnancy, which is an app that uses AI to detect potentially high-risk pregnancies.

Joe Marks: So, using machine learning, predictive analytics, to try and identify pregnancies that could benefit from early intervention, with the goal of providing a better outcome for mom and child and also saving the insurer money. So, that’s another one we’re very excited about.

Jessica Chobot: When it comes to healthcare and AI, a lot of what we’re talking about is data analysis – sifting through all of the data that’s coming from various scans and tests and learning from it. But here’s the thing: the data itself, its security, is what we should be concerned about in the future, more than AIs replacing doctors.

Joe Marks: More data can improve treatment, can improve efficiency of providing treatment, but you have to be very careful with that data because it’s intensely private. You can’t undo the effect of a privacy breach. If somebody steals your credit card you can get a new number and a new card overnight. It’s happened to me. Somebody steals your DNA, or learns something about your mental or health condition, you can’t undo that. You can’t unring that bell.

Jessica Chobot: That is a great and sobering reality check, especially as I’m about to have my own health data collected and analyzed inside a virtual world and hopefully they won’t discover anything too concerning.

Jessica Chobot: I’m in Boston’s Innovation District to challenge healthcare hype. Specifically, I’m going to be putting current AI to the test in a physical therapy setting. XRHealth CEO, Eran Orr, has just given me a virtual reality headset, which has put me inside of a strange world to test my range of motion, accuracy and memory.

Jessica Chobot: It’s a very pretty landscape in here. My cabin in the woods.

Eran Orr: Uh-huh. Uh-huh. So, let’s start with a whiplash app. Choose the Rotate app. We will measure your range of motion.

Jessica Chobot: Oh my gosh, there’s a dragon. Can I touch it?

Eran Orr: We will talk about the dragon in a sec, but in this app we are measuring the patient’s range of motion, response time, reaction time, and basically, after a minute, by using our app, the clinician can get more data than they can get in any other way. Hit the go button.

Jessica Chobot: Hit go, all right. I’m in a medieval town with mountains and, oh my gosh, I’m getting yelled at by a guy that tells me to keep looking at the center point. There’s a dragon to the left of me just hanging out.

Eran Orr: Your job now is to protect the dragon from those evil creatures.

Jessica Chobot: Okay.

Eran Orr: And in order to protect him all you need to do is follow him.

Jessica Chobot: With my eyes?

Eran Orr: Yeah, with your head.

Jessica Chobot: Oh, he’s cute. He’s eating these little power balls.

Eran Orr: That’s the beauty of VR. You are playing a game, and that’s basically – again, it’s a game where the patient is not aware that you are actually testing his or her capabilities.

Jessica Chobot: Can I customize my dragon?

Eran Orr: You want to pick a different color?

Jessica Chobot: These are things you guys need to think about, I’m telling you. One gamer from another. I need to be able to name it, customize it. I should be able to grow it from an egg. I can trade with my friends. You got a whole big world we can explore here.

Eran Orr: Without a doubt.

Jessica Chobot: Okay, that’s virtual reality AI test number one. It certainly didn’t feel like any trip to the doctor I’ve ever taken. It felt more like, well, honestly, totally like a video game. And while the computer is crunching the results, on to test number two.

Jessica Chobot: Well, we’re back at the main menu, which is my cabin in the woods.

Eran Orr: Let’s go with Luna.

Jessica Chobot: Good. I was hoping you were going to say Luna because it looks really pretty.

Eran Orr: So, in the Luna app, it’s basically for hot flashes and for pain management.

Jessica Chobot: Oh, okay.

Eran Orr: This is probably, at the moment, the most advanced AI we have because, as you can see, there is a moon. Her name is Luna, and this is basically an AI trainer that coaches the patient on how to cope with the hot flashes, and it’s interactive according to your response and according to your performance.

Jessica Chobot: So, just to paint the picture for you: I’m standing in a winter wonderland, surrounded by snow-covered mountains. The ground is white and fluffy. There are even snowflakes lazily drifting all around me.

Jessica Chobot: Oh, whoa, every time I stare at something, it starts to freeze. What’s throwing me off is that when I’m breathing, that’s when the smoke from my breath – that effect you guys have, like being out in the cold – happens. It flips me out.

Eran Orr: All those elements can basically trick the brain and immediately affect the body.

Jessica Chobot: This is amazing. I actually feel as if my lips are cold.

Eran Orr: Now, go back. Hit the home button again. Now, we will test your memory. Go to memorize.

Jessica Chobot: This is going to be interesting.

Eran Orr: Let’s try to do three items to memorize.

Jessica Chobot: French fries, sunglasses, and cowboy hat. So, I have to find those three things?

Eran Orr: Uh-huh.

Jessica Chobot: Okay, french fries, sunglasses and cowboy hat. Is it going to trick me? Is it going to be blue sunglasses and I need red ones?

Eran Orr: No, no, no. Good luck.

Jessica Chobot: I’m standing in a big sunlit hall that kind of looks like an airport terminal and there’s conveyor belts on either side of me and coming toward me is a totally random bunch of stuff.

Jessica Chobot: Oh, gosh, I have hands. Shoot, what was it? French fries, sunglasses, and a cowboy hat.

Eran Orr: Not bad.

Jessica Chobot: Okay. French fries, sunglasses, and a cowboy hat. Am I missing it?

Eran Orr: No, no, no.

Jessica Chobot: I’m still stressed. I want to do so well. I got to get number one. I got to be top of the leader boards.

Jessica Chobot: And now for the final test. A VR app designed to test my reaction time.

Eran Orr: Now, choose the react app.

Jessica Chobot: Alright, react. Now this I think I’m going to be really good at, A, because I play a lot of video games and so I have pretty good hand-eye coordination, and B, I used to do softball for years.

Eran Orr: Okay, so let’s see. Choose applications and react. Now, you will have two boxing gloves in your hands. Your job is to touch the right light according to the color of the glove.

Jessica Chobot: So, I’m touching all these little glowy balls that match my gloves. What, what, oh my gloves changed color.

Eran Orr: Exactly.

Jessica Chobot: Oh, that’s not fair. I didn’t even see that.

Eran Orr: You need to pay attention.

Jessica Chobot: I am paying attention.

Eran Orr: But, while you are doing that, we are measuring if you are impulsive or not, differences between your right side, your left side, reaction time, response time, different things.
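The left/right comparison Eran mentions is straightforward to picture: log how long each hand takes to hit its target, then compare the two sides. A minimal sketch with invented timings:

```python
# Compare mean reaction times between hands from a (made-up) hit log.
from statistics import mean

# (hand, seconds from target lighting up to glove touching it)
hits = [
    ("left", 0.42), ("right", 0.35), ("left", 0.47),
    ("right", 0.33), ("left", 0.44), ("right", 0.36),
]

by_hand = {"left": [], "right": []}
for hand, rt in hits:
    by_hand[hand].append(rt)

left, right = mean(by_hand["left"]), mean(by_hand["right"])
print(f"left: {left:.2f}s  right: {right:.2f}s  asymmetry: {abs(left - right):.2f}s")
```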

Jessica Chobot: And now, the moment of truth. Eran and I sit down at his computer to look at my results.

Eran Orr: Okay, you can take the headset off. Have fun?

Jessica Chobot: Yeah. Well, any game, I always have fun at any game. That’s awesome. Okay.

Eran Orr: Okay, now let’s see your results. So, here you can see your range of motion. Yep, pretty good. Let’s take a look at accuracy, for example. So, accuracy is when you followed the dragon and protected the dragon.

Jessica Chobot: Oh, okay.

Eran Orr: That’s basically where we are measuring your accuracy. So, accuracy – when someone is suffering from whiplash or a cervical spine injury, it’s very hard for them to keep a very accurate line of sight, basically.

Jessica Chobot: Because our neck’s constantly moving around?

Eran Orr: Exactly. Because it’s not stable, and that’s something that we can measure relatively easily.

Jessica Chobot: So, I disagree with what I’m reading here. I got a bar graph here and it’s saying I’m down to 50%. I was following that dragon precisely all over the map.

Eran Orr: It’s more for people who are suffering from pain or injury.

Jessica Chobot: Okay. What you can’t see on the results screen is that these results are for the quality of my movement, not just my accuracy, and that’s a subtle measurement that has to do with how I move, not how closely I followed the dragon.

Eran Orr: If you are suffering from pain, you will be very hesitant to make very smooth or fast movements. So, that’s basically an indicator.

Jessica Chobot: And so, then the last test was the punching test.

Eran Orr: Exactly. Again, you did, by the way, pretty good, as you can see.

Jessica Chobot: I did really good, yeah.

Eran Orr: You did the best out of the last 15 sessions that we compared you to.

Jessica Chobot: How do I compare to the rest of your office?

Eran Orr: This is your results.

Jessica Chobot: Oh yeah, okay, this is the office’s results.

Eran Orr: Exactly. This is all the other employees.

Jessica Chobot: Not to gloat, but to totally gloat, I hit 20 out of 20 targets in my last test. The rest of his office mates, about 10 out of 20. High-five.

Eran Orr: Yeah, you killed it.

Jessica Chobot: On the leader board.

Eran Orr: Yeah. But, again, here in that simple test, now we can see differences between your right side and left side. Reaction time and response time are two different things. That’s the entire idea. And, if we want, we can now dive into the AI question.

Jessica Chobot: Yes, let’s dive into the AI question.

Eran Orr: So, we are basically now at stage one, or one and a half. We have all the data components. We can have a simple sequence that reacts to the insights that we are generating.

Jessica Chobot: Mm-hmm (affirmative).

Eran Orr: And the next step, I don’t know, maybe a year from now, is that instead of the clinician making the decision, the AI will make the decision. Right?

Jessica Chobot: Got it.

Eran Orr: Because, let’s say we are seeing that you’re not getting better – not us, the AI, right? The AI sees that you are not getting better, so immediately we can adjust the protocols. The next session will be adjusted according to your previous session.
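In other words, the protocol adapts to the trend in your scores. A minimal sketch of that kind of adjustment rule – the thresholds and levels are invented, not XRHealth’s actual logic:

```python
# If the patient's scores have stopped improving, ease the next session
# off a level; otherwise progress. Purely illustrative logic.
def next_protocol_level(recent_scores, current_level):
    if len(recent_scores) >= 2 and recent_scores[-1] <= recent_scores[-2]:
        return max(1, current_level - 1)  # plateau or decline: ease off
    return current_level + 1              # improving: advance

print(next_protocol_level([62, 70, 68], current_level=3))  # -> 2, ease off
```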

Jessica Chobot: My results in the memory game weren’t great. I only picked the items I was supposed to memorize about 30% of the time.

Jessica Chobot: Alright, so then based on all the data that you collected on me today, overall, is there anything you think that I should be aware of, keep an eye out for, practice?

Eran Orr: So, first of all, I’m not a clinician, and I try not to take on the clinician’s role, but overall, it seems like you are doing great. Keep up the good work.

Jessica Chobot: Oh, thanks. I know you really wanted to say, “Your memory could use some work”. But, that’s fine, because I’m aware of that. Well, this was a really fun day. Thanks. I’m very impressed by it, yeah.

Jessica Chobot: I have emerged from the virtual world to discover that, A, it was a lot of fun and, B, honestly, what surprised me the most about it was the fact that I didn’t even realize that I was getting tested. Also, I’m super competitive, even with myself. So, after experiencing that virtual world, getting personalized results generated by AI, and hearing from experts, is the hype justified? Is AI going to change everything about how healthcare is delivered? Absolutely, but not in any scary ways. We’re not talking fully autonomous surgeons any time soon, but we are talking, right now, about doctors using AI to help them be better at their jobs, better at keeping us all healthy.

Jessica Chobot: That’s AI: Hype vs. Reality from Dell Technologies, and if you want to see what it looked like when I stepped into that virtual world, check out delltechnologies.com/hypeVreality. Next time on the podcast: can AIs beat us at every game, predict the outcome of any sport, and invest our money better than we can? Tune in to find out.