Usefulness of Heat Map Explanations for Deep-Learning-Based Electrocardiogram Analysis
Research output: Contribution to journal › Journal article › Research › peer-review
Standard
Usefulness of Heat Map Explanations for Deep-Learning-Based Electrocardiogram Analysis. / Storås, Andrea M.; Andersen, Ole Emil; Lockhart, Sam; Thielemann, Roman; Gnesin, Filip; Thambawita, Vajira; Hicks, Steven A.; Kanters, Jørgen K.; Strümke, Inga; Halvorsen, Pål; Riegler, Michael A.
In: Diagnostics, Vol. 13, No. 14, 2345, 2023.
RIS
TY - JOUR
T1 - Usefulness of Heat Map Explanations for Deep-Learning-Based Electrocardiogram Analysis
AU - Storås, Andrea M.
AU - Andersen, Ole Emil
AU - Lockhart, Sam
AU - Thielemann, Roman
AU - Gnesin, Filip
AU - Thambawita, Vajira
AU - Hicks, Steven A.
AU - Kanters, Jørgen K.
AU - Strümke, Inga
AU - Halvorsen, Pål
AU - Riegler, Michael A.
N1 - Publisher Copyright: © 2023 by the authors.
PY - 2023
Y1 - 2023
N2 - Deep neural networks are complex machine learning models that have shown promising results in analyzing high-dimensional data such as those collected from medical examinations. Such models have the potential to provide fast and accurate medical diagnoses. However, the high complexity makes deep neural networks and their predictions difficult to understand. Providing model explanations can be a way of increasing the understanding of “black box” models and building trust. In this work, we applied transfer learning to develop a deep neural network to predict sex from electrocardiograms. Using the visual explanation method Grad-CAM, heat maps were generated from the model in order to understand how it makes predictions. To evaluate the usefulness of the heat maps and determine if the heat maps identified electrocardiogram features that could be recognized to discriminate sex, medical doctors provided feedback. Based on the feedback, we concluded that, in our setting, this mode of explainable artificial intelligence does not provide meaningful information to medical doctors and is not useful in the clinic. Our results indicate that improved explanation techniques that are tailored to medical data should be developed before deep neural networks can be applied in the clinic for diagnostic purposes.
KW - electrocardiograms
KW - explainable artificial intelligence
KW - heat maps
U2 - 10.3390/diagnostics13142345
DO - 10.3390/diagnostics13142345
M3 - Journal article
C2 - 37510089
AN - SCOPUS:85166374897
VL - 13
JO - Diagnostics
JF - Diagnostics
SN - 2075-4418
IS - 14
M1 - 2345
ER -