Video

HRS 22: AI To Identify LV Dysfunction From Smartwatch ECGs

Published: 05 May 2022


In this video from HRS 2022, Dr Zachi Attia (The Mayo Clinic, Rochester, MI, US) discusses the outcomes of an international trial which aimed to evaluate the effectiveness of an artificial intelligence algorithm in identifying left ventricular dysfunction using an Apple Watch ECG.

Discussion points:

1. Importance of this study
2. AI Technology
3. Study design and patient population
4. Main findings
5. Impact on practice
6. Further study required and next steps

Recorded onsite at HRS 2022, San Francisco.

Transcript

- My name is Zachi Attia. I'm the co-director of AI in cardiology at the Mayo Clinic. Our talk was on the use of artificial intelligence to diagnose low ejection fraction, or a weak heart pump, from an Apple Watch ECG. This work was done as a collaboration between Mayo Cardiology, Mayo Platform and the Mayo Centre for Digital Health, and was not supported in any way by Apple.

Importance of this study

So low ejection fraction, or a weak heart pump, affects about 2% of the population. Patients who are diagnosed with low ejection fraction often progress to clinical heart failure, and their odds of dying within five years are significantly higher. And while it's such an awful disease, we have useful treatments and medications to help these patients. The main issue is they usually don't get diagnosed early enough. Confirmation of the diagnosis requires imaging, like CT, MRI or echocardiogram, which is expensive and requires a physician to read it. And many people, in America and all around the world, are more than 150 miles from the closest echocardiogram. So we wanted to find a way to screen these patients and see who needs that echocardiogram by using the ECG. And the ECG is a much cheaper, much more ubiquitous test, because today you can measure it from a watch, for example.

AI Technology

We used something called a convolutional neural network. It was originally designed to look at images, and we applied it to ECG tracings. You train it by showing the AI many, many ECGs of patients with a normal ejection fraction and many ECGs of patients with a low ejection fraction, and we were able to do that because at Mayo we have data from patients who have both an echocardiogram and an ECG. To apply it to a watch, we took those ECGs and made them synthetically look like they were recorded by an Apple Watch. We took about 50,000 of these ECGs and built a model by showing it, again, a lot of normal and abnormal patients. The AI was able to find subtle patterns that a human cannot; even a cardiology expert cannot look at an ECG and say this is a patient with low EF, but by showing it 50,000 of them, the AI was able to learn these patterns. We then recorded patients with an Apple Watch using an app, and we tested the model on the patients who had both an echo and an Apple Watch.
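The pipeline described here, a 1-D convolutional network over a single-lead ECG trace that outputs a probability of low ejection fraction, can be sketched in a few lines. Everything in this sketch (layer count, filter length, sampling rate, and the untrained random weights) is an illustrative assumption, not the Mayo model.

```python
import numpy as np

SAMPLE_RATE = 500   # Hz, an assumed sampling rate for a single-lead strip
SECONDS = 30        # length of one watch-style recording

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid-mode 1-D convolution of signal x with a bank of kernels."""
    k = kernels.shape[1]
    windows = np.lib.stride_tricks.sliding_window_view(x, k)
    return windows @ kernels.T          # shape: (len(x) - k + 1, n_kernels)

def tiny_ecg_cnn(ecg, w_conv, w_out, b_out):
    """Toy forward pass: conv -> ReLU -> global average pool -> sigmoid."""
    feats = np.maximum(conv1d(ecg, w_conv), 0.0)   # (time, channels)
    pooled = feats.mean(axis=0)                    # one value per filter
    logit = pooled @ w_out + b_out
    return 1.0 / (1.0 + np.exp(-logit))            # P(low ejection fraction)

# Random, untrained weights, purely to show the data flow end to end.
n_filters, kernel_len = 8, 25
w_conv = rng.normal(scale=0.1, size=(n_filters, kernel_len))
w_out = rng.normal(scale=0.1, size=n_filters)
ecg = rng.normal(size=SAMPLE_RATE * SECONDS)       # stand-in for a real trace
p = tiny_ecg_cnn(ecg, w_conv, w_out, b_out=0.0)
print(round(float(p), 3))
```

In a real system the weights would be learned by gradient descent on the paired ECG/echocardiogram labels; the forward pass above only illustrates how a raw single-lead trace becomes a single probability.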

Study design and patient population

This is a proof-of-concept study. We took any patient who had the Mayo Clinic app with iOS 14 or above, just because that's the minimal version you need to acquire an ECG in a digital format. And we asked them: do you have an Apple Watch, and would you be willing to participate in the study? If they answered yes, we sent them a link to the app, and that's a nice thing about the study, because everything was done remotely. We had one study coordinator, Jennifer Neumann, who emailed all of these patients and got the responses; automatically, if they consented to the study, they got an app and used it to upload all of their ECGs, and they kept getting push notifications reminding them to keep sending us their ECGs every other week, and if they recorded new ECGs, we got those as well. We had 2,500 patients who uploaded 125,000 ECGs within about five months. The average time a user spent in the study was four months, and they actually used the app twice a month, so they stayed very, very engaged. Interestingly, we had patients aged from 22 to 92, and we noticed higher use of the app among older patients, probably because they were more interested in their health, or maybe in their cardiac health specifically.

Main findings

The study had two aims. The first was to see if we could acquire data remotely, digitally, in a decentralised way, from a diverse population. And we could: again, as I mentioned, 125,000 ECGs in five months, from 46 states in the US and 11 countries around the world. So that was very successful. The second part is that we were able to take those ECGs and feed them to our AI model for a subgroup of patients who had both an echo and an ECG. We ran them through the model and saw that we had an AUC, or area under the curve, of 0.88. The area under the curve is a way to assess how well your screening model works. The worst possible score is 0.5; a perfect one would be 1. And most clinically used tests are between 0.8 and 0.9. So 0.88 is a very high AUC, which shows that this tool might be able to screen patients using their own watch from home.
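The AUC quoted here has a concrete interpretation: it equals the probability that a randomly chosen low-EF patient receives a higher model score than a randomly chosen normal patient. A minimal sketch of that calculation, on made-up scores (the numbers below are not study data):

```python
import numpy as np

def auc(scores, labels):
    """AUC = probability a random positive case outscores a random negative.
    Computed directly from the pairwise (Mann-Whitney U) formulation."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    # Compare every positive against every negative; ties count as half.
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Toy example: scores for 4 low-EF cases (label 1) and 4 normal cases (label 0).
scores = [0.9, 0.8, 0.7, 0.35, 0.6, 0.4, 0.3, 0.2]
labels = [1,   1,   1,   1,    0,   0,   0,   0]
print(auc(scores, labels))   # 14 of 16 positive/negative pairs ordered correctly
```

A score of 0.5 corresponds to random ordering of pairs, and 1.0 to every low-EF case scoring above every normal case, which is why 0.88 sits at the high end of clinically useful tests.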

Impact on practice

When we developed the original model, we used the 12-lead ECG. And even though it worked well, we decided to do a study showing how these tools affect patients. We randomised clinicians: half of the clinicians got access to these tools, and half got the regular tools that they have. And we noticed a 32% uptick in detecting these new patients. So even though they didn't send more patients to echocardiograms, they found more sick patients. So you are able to see that these tools affect patients in a positive way and don't increase the burden on the medical system. This is a proof of concept using the watch, so I assume we will do something very, very similar, maybe in an enriched population, like patients who are undergoing chemotherapy and have a higher likelihood of developing low ejection fraction due to cardiotoxicity.

Further research required and next steps

So, as I mentioned, we have to validate it to see that it actually benefits patients, right? We want to make sure that these tools, accurate as they are, are being used correctly by clinicians. We developed something called the AI dashboard, which allows clinicians to look at the patient's ECG. They can zoom in, they can focus, and it improves patient-clinician communication. We want to embed this tool with clinicians and see that patients are getting diagnosed more often by uploading their Apple Watch ECGs through the dashboard.

Videography: Dan Brent