Can AI Catch What Doctors Miss? | Eric Topol | TED

TED
9 Dec 2023 · 14:06

Summary

TLDR: The talk explores groundbreaking advances in biomedical and healthcare technology through the lens of the speaker's experience at Scripps Research, highlighting the transformative impact of AI, particularly DeepMind's AlphaFold, on protein structure prediction. It discusses how AI has revolutionized diagnostic accuracy, from identifying diseases through retinal images to improving medical imaging and pathology. The narrative also touches on the potential of transformer models and GPT-4 in enhancing healthcare delivery, reducing diagnostic errors, and liberating medical professionals from administrative tasks. The speaker shares compelling patient stories where AI significantly improved diagnostic outcomes, underscoring a future where AI empowers deeper patient-doctor connections and precision medicine.

Takeaways

  • 💻 AlphaFold, developed by DeepMind, has cut the time needed to determine a protein's 3D structure from years to minutes.
  • 📈 Demis Hassabis, John Jumper, and their team were recognized with the "American Nobel Prize" (the Lasker Award) for AlphaFold, even though the team admits it does not fully understand how the underlying transformer model works.
  • 🛠 AlphaFold's success has inspired advancements in predicting structures of not only proteins but also RNA, antibodies, and novel proteins not found in nature.
  • 📱 Diagnostic errors in medicine are a major issue, with a Johns Hopkins study indicating they contribute to 800,000 American deaths or serious disabilities annually, a problem AI could help address.
  • 👁 AI has shown remarkable accuracy in medical imaging, such as determining a patient's sex from retinal images with 97% accuracy, outperforming human experts in many areas.
  • 🧪 Deep learning models have demonstrated the ability to detect health issues from retinal scans that are not visible to the human eye, predicting diseases such as Alzheimer's years in advance.
  • 📊 Transformer models, like GPT-4, are setting new standards in AI's capability to process and understand complex data, including language, images, and speech.
  • 📚 The implementation of AI in healthcare promises 'keyboard liberation,' improving patient-doctor interactions by reducing the clerical burden on healthcare providers.
  • 📷 Moorfields Eye Hospital's use of a foundation model to predict various health outcomes from retinal images demonstrates the potential for comprehensive diagnostics from single AI models.
  • 📝 ChatGPT and similar AI technologies have successfully diagnosed complex medical conditions, underscoring their potential to augment or even outperform human clinical judgment in challenging cases.

Q & A

  • What breakthrough allowed protein structure prediction to go from taking years to just minutes?

    -The breakthrough was AlphaFold, developed by DeepMind, which takes a protein's one-dimensional amino acid sequence and predicts its 3D structure at the atomic level in minutes. Determining these structures experimentally used to take scientists years.

  • What impact could more accurate protein structure prediction have on fields like RNA, antibodies, and genome analysis?

    -Accurate protein structure prediction models like AlphaFold have inspired similar models for predicting RNA and antibody structures, and even for picking up missense mutations across the genome. This could massively accelerate research and discovery in these fields.

  • Why does the speaker suggest the AlphaFold team should get an asterisk on their award?

    -The speaker notes that the AlphaFold team of 30 scientists does not fully understand how their transformer model works. So he jokingly suggests they should get an asterisk on their award since the AI itself deserves some credit.

  • How could AI help reduce diagnostic medical errors?

    -AI has shown it can analyze medical images as well as or better than experts, catching things human doctors miss. With enough accurate labeled training data, AI diagnostic tools could significantly reduce errors and improve health outcomes.

  • What remarkable diagnoses has AI made that stumped teams of human doctors?

    -In one case, ChatGPT correctly diagnosed a boy with spina bifida occulta (a tethered spinal cord) that 17 doctors had missed over three years. In another, it diagnosed limbic encephalitis in a patient who had been incorrectly told she had long COVID. AI tools show real promise in augmenting human diagnostic capabilities.

  • How could transformer models like GPT help transform medicine?

    -Transformer models can process huge datasets spanning text, images and speech. Applied to healthcare, they could help automate keyboard/data entry tasks, analyze patient history to assist diagnosis, generate notes from doctor-patient conversations, provide decision support, and more.

  • What is keyboard liberation in healthcare and why does it matter?

    -Keyboard liberation refers to freeing doctors from having to do data entry and clerical work, letting them focus on patients. With AI handling notes, prescriptions, referrals etc. based on conversations, it improves doctor-patient relationships and outcomes.

  • How was the Moorfields eye disease prediction model different from previous disease detection AIs?

    -Unlike previous models trained on images for single diseases, this open-source model was trained on 1.6 million retinal images to predict likelihoods for eight different health outcomes with one unified model. It demonstrates the expanded capabilities of transformer-based foundation models.

  • What future positive impacts does the speaker foresee AI having on medicine?

    -The speaker is excited about keyboard liberation freeing up doctor time, improved doctor-patient relationships as a result, AI assistance making diagnosis faster and more accurate, and transformed medical education so students learn up-to-date best practices augmented by AI.

  • What advice does the speaker give to his cardiology fellow about embracing the coming age of AI in medicine?

    -He tells his fellow that he is lucky to be practicing in an era of keyboard liberation, more time with patients, AI diagnostic assistance, and better patient care. But he notes AI still needs extensive validation before being deployed widely.

Outlines

00:00

🧬 The Revolution of Protein Structure Prediction with AI

The speaker shares their experience at Scripps Research, highlighting the rapid advancements in protein structure prediction due to AlphaFold, developed by DeepMind. This AI model has significantly reduced the time required to define the 3D structure of proteins from years to minutes, revolutionizing the field. It has spurred the development of various models for predicting protein structures, RNA, antibodies, and even designing novel proteins. However, the speaker raises a thought-provoking question regarding the recognition of AI contributions, asking whether AI should receive acknowledgment in awards when the underlying mechanisms of models like transformers are not fully understood by their creators.

05:03

🔬 AI's Impact on Medical Diagnostics and Precision Medicine

The speaker discusses the potential of AI to address diagnostic errors in medicine, a significant issue highlighted by studies indicating that such errors contribute to a considerable number of deaths and disabilities annually. By leveraging AI, the medical community can achieve not just precision but accuracy in diagnostics. The speaker provides examples of AI's ability to discern details undetectable by human eyes in medical imaging, such as distinguishing a patient's sex from retinal images or identifying diseases from chest X-rays with remarkable accuracy. These advancements underscore the potential of supervised learning and AI in enhancing diagnostic processes and medical outcomes.

10:04

👁️ Breakthroughs in Health Care and Medicine Through AI

The narrative concludes with the introduction of a groundbreaking foundation model in medicine, capable of predicting various health outcomes from retinal images. This model represents a significant leap forward, consolidating multiple studies into a single, open-source model. The speaker shares compelling patient stories where AI, specifically ChatGPT, accurately diagnosed conditions missed by numerous doctors, showcasing AI's potential to revolutionize healthcare. The conversation shifts towards the future of medicine, emphasizing the transformative impact of AI on the doctor-patient relationship, diagnostics, and overall healthcare efficiency. The speaker reflects on the exciting prospects for future generations of medical professionals, highlighting the importance of validation to ensure that the benefits of AI in healthcare substantially outweigh any risks.

Keywords

💡AlphaFold

AlphaFold is an AI program developed by DeepMind that predicts the 3D structure of proteins based on their amino acid sequence. This breakthrough has significantly accelerated biological research, reducing a process that could take years to mere minutes. In the video, AlphaFold is credited for its revolutionary impact on structural biology, exemplified by its ability to rapidly and accurately model protein structures, which previously was a time-consuming task for researchers.

💡Transformer model

The transformer model is a type of deep learning architecture primarily used for understanding language. It has a unique mechanism called 'attention', allowing it to weigh the importance of different words in a sentence. The video raises a philosophical question about the recognition of AI contributions, citing the creators of AlphaFold, who utilized transformer models, yet do not fully understand how these models make their predictions, suggesting an asterisk (*) might be added to their award to acknowledge the AI's role.
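
A minimal sketch of the 'attention' mechanism mentioned above, written in plain NumPy; the token embeddings, sizes, and random weights are illustrative assumptions, not the actual AlphaFold or GPT-4 code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the chosen axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (n, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project tokens to queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])      # how strongly each token attends to every other
    weights = softmax(scores, axis=-1)           # attention weights sum to 1 per token
    return weights @ V                           # each output is a weighted mix of the values

# Toy example: 5 "tokens" (e.g. word or amino-acid embeddings), embedding size 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)       # -> (5, 8)
```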

💡Diagnostic medical errors

Diagnostic medical errors refer to mistakes made in diagnosing a patient's condition. The video highlights this as a significant issue in healthcare, with a large number of patients affected annually. It suggests AI's potential to reduce these errors, thereby improving patient outcomes and safety.

💡Precision medicine

Precision medicine is a medical approach that tailors treatment to the individual characteristics of each patient. The video criticizes the repetitive nature of errors in the medical field, humorously noting that making the same mistake consistently is not the goal of precision medicine. Instead, it advocates for both precision and accuracy, suggesting AI could play a key role in achieving this.

💡Supervised learning

Supervised learning is a type of machine learning where the model is trained on a labeled dataset, meaning each training example is paired with an output label. The video discusses supervised learning in the context of training AI to interpret medical images, such as retinal scans, to diagnose diseases or predict health conditions that are not visible to human eyes.
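
As a hedged illustration of supervised learning, the PyTorch sketch below trains a deliberately tiny classifier on stand-in "images" and labels; the data and model are placeholders, not the retinal studies cited in the talk:

```python
import torch
from torch import nn

# Illustrative stand-ins for a labeled dataset: 32 grayscale "retina" images
# (1 x 64 x 64) and a binary label (0 = no disease, 1 = disease) for each.
images = torch.randn(32, 1, 64, 64)
labels = torch.randint(0, 2, (32,))

model = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 2))  # a deliberately tiny classifier
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(5):
    optimizer.zero_grad()
    logits = model(images)           # model predictions
    loss = loss_fn(logits, labels)   # compare predictions against the expert labels
    loss.backward()                  # learn from the labeled examples
    optimizer.step()
    print(epoch, loss.item())
```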

💡Machine vision

Machine vision refers to the ability of a computer to 'see' and interpret images. The script discusses several instances where machine vision, through the use of AI, surpasses human capabilities in medical imaging, such as detecting polyps during colonoscopy or identifying diseases from X-rays and scans, highlighting the potential of AI to enhance diagnostic accuracy.

💡Self-supervised learning

Self-supervised learning is a machine learning technique where the system learns to understand data by itself, without the need for labeled data. The video mentions this as a solution to a significant bottleneck in medicine—the lack of expert-labeled images for training AI—allowing for more scalable and efficient learning from medical images.
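
One common self-supervised recipe is masked reconstruction: hide part of each unlabeled image and train a network to fill it back in, so the image itself provides the supervision. A rough sketch, assuming PyTorch and toy data (not the method behind any model named in the talk):

```python
import torch
from torch import nn

images = torch.randn(16, 1, 64, 64)           # unlabeled images; no expert annotation needed

# Hide a block of each image and train the network to reconstruct it.
masked = images.clone()
masked[:, :, 16:48, 16:48] = 0.0

autoencoder = nn.Sequential(
    nn.Flatten(), nn.Linear(64 * 64, 256), nn.ReLU(),
    nn.Linear(256, 64 * 64), nn.Unflatten(1, (1, 64, 64)),
)
optimizer = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)

for step in range(5):
    optimizer.zero_grad()
    reconstruction = autoencoder(masked)
    loss = nn.functional.mse_loss(reconstruction, images)  # the original image acts as the "label"
    loss.backward()
    optimizer.step()
```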

💡GPT-4

GPT-4 is a state-of-the-art language model developed by OpenAI, known for its multimodal capabilities, integrating language, images, and speech. The video discusses its potential in medicine, emphasizing its role in processing vast amounts of information and its application in generating synthetic notes and assisting with diagnoses, leading towards 'keyboard liberation' for clinicians.

💡Convolutional neural networks

Convolutional neural networks (CNNs) are a type of deep learning algorithm specifically designed to process pixel data and are widely used in image recognition and processing tasks. The video references CNNs in the context of medical imaging, where they have been utilized to detect diseases and health conditions from various scans and tests, showcasing their effectiveness in augmenting human diagnostic capabilities.
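
To make the idea concrete, a toy convolutional network might be sketched as follows; the layer sizes are arbitrary illustrations rather than any model cited in the video:

```python
import torch
from torch import nn

class TinyCNN(nn.Module):
    """A toy image classifier: convolutions extract local features, a linear head classifies."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)           # (batch, 32, 16, 16) for 64x64 inputs
        return self.classifier(x.flatten(1))

scan = torch.randn(1, 1, 64, 64)       # a stand-in for one chest X-ray or retinal image
print(TinyCNN()(scan).shape)           # -> torch.Size([1, 2])
```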

💡Keyboard liberation

Keyboard liberation refers to the use of AI to automate the documentation process in healthcare, freeing clinicians from the burdensome task of manual data entry. The video envisions a future where AI-generated synthetic notes and automated clerical tasks enable doctors to focus more on patient care, enhancing the patient-doctor relationship by providing clinicians more time to engage with their patients.
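
A loose sketch of the workflow this implies: a visit transcript goes in, a draft note and follow-up orders come out, and the clinician reviews the result. The call_llm function below is a hypothetical placeholder, not a real API:

```python
# Hypothetical sketch of "keyboard liberation": turning a visit transcript into a draft
# clinical note. call_llm is a placeholder for whatever language-model client is used;
# it is not a real library function.
def call_llm(prompt: str) -> str:
    raise NotImplementedError("plug in a language-model client here")

def draft_visit_note(transcript: str) -> str:
    prompt = (
        "Summarize this doctor-patient conversation as a clinical note, and list "
        "any follow-up items (prescriptions, referrals, future appointments):\n\n"
        + transcript
    )
    return call_llm(prompt)  # a clinician still reviews and signs the draft
```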

Highlights

AlphaFold's impact on reducing protein structure prediction time from years to minutes.

Recognition of AlphaFold's creators by the American Nobel Prize.

AlphaFold's influence on the development of other prediction models for proteins, RNA, and antibodies.

Discussion on whether AI should receive credit in awards when creators do not fully understand the underlying models.

The significant issue of diagnostic medical errors leading to deaths and disabilities.

AI's potential to reduce diagnostic errors and advance precision medicine.

AI's remarkable accuracy in determining patient sex from retinal images.

AI's capabilities surpassing human experts in medical imaging diagnosis.

Machine vision's superior performance in detecting polyps during colonoscopies.

Deep learning models uncovering medical conditions from retina images beyond human capability.

AI's unexpected ability to diagnose systemic diseases from electrocardiograms and chest X-rays.

Transformation from deep learning to transformer models in AI, enhancing data interpretation.

Introduction of GPT-4 and its multimodal capabilities, revolutionizing medical data analysis.

AI's role in liberating healthcare professionals from administrative tasks.

Examples of AI diagnosing rare medical conditions successfully where human doctors failed.

GPT-4's competitive diagnostic performance against expert clinicians.

The future of medicine with AI integration, offering more time for patient-doctor interaction.

Transcripts

play00:05

I've had the real fortune of working at Scripps Research

play00:09

for the last 17 years.

play00:11

It's the largest nonprofit biomedical institution in the country.

play00:16

And I've watched some of my colleagues,

play00:19

who have spent two to three years

play00:21

to define the crystal 3-D structure of a protein.

play00:26

Well, now that can be done in two or three minutes.

play00:29

And that's because of the work of AlphaFold,

play00:32

which is a derivative of DeepMind, Demis Hassabis and John Jumper,

play00:38

recognized by the American Nobel Prize in September.

play00:42

What's interesting, this work,

play00:44

which is taking the amino acid sequence in one dimension

play00:49

and predicting the three-dimensional protein at atomic level,

play00:54

[has] now inspired many other of these protein structure prediction models,

play01:00

as well as RNA and antibodies,

play01:03

and even being able to pick up all the missense mutations in the genome,

play01:08

and even being able to come up with proteins

play01:12

that have never been invented before, that don't exist in nature.

play01:16

Now, the only thing I think about this is it was a transformer model,

play01:20

we'll talk about that in a moment,

play01:22

in this award, since Demis and John

play01:27

and their team of 30 scientists

play01:29

don't understand how the transformer model works,

play01:33

shouldn't the AI get an asterisk as part of that award?

play01:39

I'm going to switch from life science,

play01:41

which has been the singular biggest contribution just reviewed,

play01:45

to medicine.

play01:47

And in the medical community,

play01:49

the thing that we don't talk much about are diagnostic medical errors.

play01:55

And according to the National Academy of Medicine,

play01:58

all of us will experience at least one in our lifetime.

play02:01

And we know from a recent Johns Hopkins study

play02:04

that these errors have led to 800,000 Americans dead

play02:10

or seriously disabled each year.

play02:13

So this is a big problem.

play02:15

And the question is, can AI help us?

play02:18

And you keep hearing about the term “precision medicine.”

play02:22

Well, if you keep making the same mistake over and over again, that's very precise.

play02:28

(Laughter)

play02:30

We don't need that,

play02:31

we need accuracy and precision medicine.

play02:34

So can we get there?

play02:36

Well, this is a picture of the retina.

play02:39

And this was the first major hint,

play02:42

training 100,000 images with supervised learning.

play02:47

Could the machine see things that people couldn't see?

play02:52

And so the question was, to the retinal experts,

play02:55

is this from a man or a woman?

play02:58

And the chance of getting it accurate was 50 percent.

play03:02

(Laughter)

play03:03

But the AI got it right, 97 percent.

play03:07

So that training,

play03:09

the features are not even fully defined of how that was possible.

play03:14

Well that gets then to all of medical images.

play03:17

This is just representative, the chest X-ray.

play03:20

And in fact with the chest X-ray,

play03:22

the ability here for the AI to pick up,

play03:26

the radiologists, expert radiologists missing the nodule,

play03:30

which turned out to be picked up by the AI as cancerous,

play03:34

and this is, of course, representative of all of medical scans,

play03:38

whether it’s CT scans, MRI, ultrasound.

play03:42

That through supervised learning of large, labeled, annotated data sets,

play03:47

we can see AI do at least as well, if not better,

play03:51

than expert physicians.

play03:55

And 21 randomized trials of picking up polyps --

play03:59

machine vision during colonoscopy -- have all shown

play04:03

that polyps are picked up better

play04:06

with the aid of machine vision than by the gastroenterologist alone,

play04:10

especially as the day goes on, later in the day, interestingly.

play04:15

We don't know whether picking up all these additional polyps

play04:18

changes the natural history of cancers,

play04:20

but it tells you about machine eyes,

play04:23

the power of machine eyes.

play04:25

Now that was interesting.

play04:27

But now still with deep learning models, not transformer models,

play04:33

we've seen and learned that the ability

play04:36

for computer vision to pick up things that human eyes can't see

play04:42

is quite remarkable.

play04:43

Here's the retina.

play04:46

Picking up the control of diabetes and blood pressure.

play04:50

Kidney disease.

play04:52

Liver and gallbladder disease.

play04:56

The heart calcium score,

play04:58

which you would normally get through a scan of the heart.

play05:03

Alzheimer's disease before any clinical symptoms have been manifest.

play05:08

Predicting heart attacks and strokes.

play05:11

Hyperlipidemia.

play05:13

And seven years before any symptoms of Parkinson's disease,

play05:18

to pick that up.

play05:19

Now this is interesting because in the future,

play05:23

we'll be taking pictures of our retina at checkups.

play05:27

This is the gateway to almost every system in the body.

play05:31

It's really striking.

play05:32

And we'll come back to this because each one of these studies

play05:36

was done with tens or hundreds [of] thousands of images

play05:40

with supervised learning,

play05:42

and they’re all separate studies by different investigators.

play05:46

Now, as a cardiologist, I love to read cardiograms.

play05:50

I've been doing it for over 30 years.

play05:53

But I couldn't see these things.

play05:56

Like, the age and the sex of the patient,

play05:59

or the ejection fraction of the heart,

play06:02

making difficult diagnoses that are frequently missed.

play06:06

The anemia of the patient, that is, the hemoglobin to the decimal point.

play06:11

Predicting whether a person,

play06:13

who's never had atrial fibrillation or stroke

play06:15

from the ECG,

play06:17

whether that's going to likely occur.

play06:20

Diabetes, a diagnosis of diabetes and prediabetes, from the cardiogram.

play06:25

The filling pressure of the heart.

play06:28

Hypothyroidism

play06:30

and kidney disease.

play06:32

Imagine getting an electrocardiogram to tell you about all these other things,

play06:36

not really so much about the heart.

play06:39

Then there's the chest X-ray.

play06:41

Who would have guessed that we could accurately determine

play06:45

the race of the patient,

play06:46

no less the ethical implications of that,

play06:49

from a chest X-ray through machine eyes?

play06:53

And interestingly, picking up the diagnosis of diabetes,

play06:57

as well as how well the diabetes is controlled,

play07:01

through the chest X-ray.

play07:04

And of course, so many different parameters about the heart,

play07:08

which we could never,

play07:10

radiologists or cardiologists, could never be able to come up

play07:14

with what machine vision can do.

play07:17

Pathologists often argue about a slide,

play07:21

about what does it really show?

play07:23

But with this ability of machine eyes,

play07:27

the driver genomic mutations of the cancer can be defined,

play07:31

no less the structural copy number variants

play07:34

that are accounting or present in that tumor.

play07:37

Also, where is that tumor coming from?

play07:40

For many patients, we don’t know.

play07:42

But it can be determined through AI.

play07:46

And also the prognosis of the patient,

play07:49

just from the slide,

play07:51

by all of the training.

play07:53

Again, this is all just convolutional neural networks,

play07:58

not transformer models.

play08:00

So when we go from the deep neural networks to transformer models,

play08:06

this classic pre-print,

play08:08

one of the most cited pre-prints ever,

play08:11

"Attention is All You Need,"

play08:12

the ability to now be able to look at many more items,

play08:17

whether it be language or images,

play08:20

and be able to put this in context,

play08:23

setting up a transformational progress in many fields.

play08:29

The prototype is, the outgrowth of this is GPT-4.

play08:34

With over a trillion connections.

play08:37

Our human brain has 100 trillion connections or parameters.

play08:42

But one trillion,

play08:43

just think of all the information, knowledge,

play08:45

that's packed into those one trillion.

play08:47

And interestingly, this is now multimodal with language, with images,

play08:52

with speech.

play08:53

And it involves a massive amount of graphic processing units.

play08:58

And it's with self-supervised learning,

play09:00

which is a big bottleneck in medicine

play09:02

because we can't get experts to label images.

play09:05

This can be done with self-supervised learning.

play09:08

So what does this set up in medicine?

play09:11

It sets up, for example, keyboard liberation.

play09:16

The one thing that both doctors, clinicians

play09:20

and patients would like to see.

play09:23

Everyone hates being data clerks as clinicians,

play09:27

and patients would like to see their doctor

play09:30

when they finally have the visit they've waited for a long time.

play09:34

So the ability to change the face-to-face contact

play09:39

is just one step along the way.

play09:41

By having the liberation from keyboards with synthetic notes

play09:46

that are driven, derived from the conversation,

play09:49

and then all the downstream normal data clerk functions that are done,

play09:54

often off-hours.

play09:56

Now we're seeing in health systems across the United States

play09:59

where people, physicians are saving many hours of time

play10:03

and heading towards ultimately keyboard liberation.

play10:08

We recently published, with the group at Moorfields Eye Institute,

play10:12

led by Pearse Keane,

play10:13

the first foundation model in medicine from the retina.

play10:16

And remember those eight different things that were all done by separate studies?

play10:21

This was all done with one model.

play10:23

This is with 1.6 million retinal images

play10:27

predicting all these different outcome likelihoods.

play10:32

And this is all open-source,

play10:33

which is of course really important that others can build on these models.

play10:38

Now I just want to review a couple of really interesting patients.

play10:44

Andrew, who is now six years old.

play10:47

He had three years of relentlessly increasing pain, arrested growth.

play10:55

His gait suffered with a dragging of his left foot,

play10:57

he had severe headaches.

play10:59

He went to 17 doctors over three years.

play11:03

His mother then entered all his symptoms into ChatGPT.

play11:08

It made the diagnosis of occulta spina bifida,

play11:12

which meant he had a tethered spinal cord that was missed by all 17 doctors

play11:18

over three years.

play11:19

He had surgery to release the cord.

play11:21

He's now perfectly healthy.

play11:24

(Applause)

play11:30

This is a patient that was sent to me,

play11:33

who was suffering with, she was told, long COVID.

play11:38

She saw many different physicians, neurologists,

play11:42

and her sister entered all her symptoms after getting nowhere,

play11:46

no treatment for long COVID,

play11:48

there is no treatment validated,

play11:49

and her sister put all her symptoms into ChatGPT.

play11:54

It found out it actually was not long COVID,

play11:56

she had limbic encephalitis, which is treatable.

play12:00

She was treated, and now she's doing extremely well.

play12:03

But these are not just anecdotes anymore.

play12:06

70 very difficult cases

play12:09

that are the clinical pathologic conferences

play12:12

at the New England Journal of Medicine

play12:14

were compared to GPT-4,

play12:17

and the chatbot did as well

play12:20

or better than the expert master clinicians

play12:23

in making the diagnosis.

play12:26

So I just want to close with a recent conversation with my fellow.

play12:31

Medicine is still an apprenticeship,

play12:33

and Andrew Cho is 30 years old,

play12:37

in his second year of cardiology fellowship.

play12:39

We see all patients together in the clinic.

play12:42

And at the end of clinic the other day,

play12:45

I sat down and said to him,

play12:47

"Andrew, you are so lucky.

play12:50

You're going to be practicing medicine in an era of keyboard liberation.

play12:55

You're going to be connecting with patients

play12:57

the way we haven't done for decades."

play13:00

That is the ability to have the note

play13:03

and the work from the conversation

play13:06

to derive things like pre-authorization,

play13:10

billing, prescriptions, future appointments --

play13:14

all the things that we do,

play13:16

including nudges to the patient.

play13:17

For example, did you get your blood pressure checks

play13:20

and what did they show

play13:21

and all that coming back to you.

play13:23

But much more than that,

play13:24

to help with making diagnoses.

play13:27

And the gift of time

play13:29

that having all the data of a patient

play13:32

that's all teed up before even seeing the patient.

play13:35

And all this support changes the future of the patient-doctor relationship,

play13:41

bringing in the gift of time.

play13:44

So this is really exciting.

play13:46

I said to Andrew, everything has to be validated, of course,

play13:50

that the benefit greatly outweighs any risk.

play13:54

But it is really a remarkable time for the future of health care,

play13:59

it's so damn exciting.

play14:01

Thank you.

play14:03

(Applause)