In Computerized Medical Imaging and Graphics: the official journal of the Computerized Medical Imaging Society
Self-attention mechanism-based algorithms are attractive in digital pathology due to their interpretability, but suffer from high computational complexity. This paper presents a novel, lightweight Attention-based Multiple Instance Mutation Learning (AMIML) model that enables small-scale attention operations for predicting gene mutations. Compared to the standard self-attention model, AMIML reduces the number of model parameters by approximately 70%. Using data for 24 clinically relevant genes from four cancer cohorts in TCGA studies (UCEC, BRCA, GBM, and KIRC), we compare AMIML with a standard self-attention model, five other deep learning models, and four traditional machine learning models. The results show that AMIML is highly robust and outperforms all the baseline algorithms for the vast majority of the tested genes. In contrast, the performance of the reference deep learning and machine learning models varies across genes, yielding suboptimal predictions for certain genes. Furthermore, with its flexible and interpretable attention-based pooling mechanism, AMIML can zero in on and detect predictive image patches.
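The attention-based pooling the abstract refers to is the standard attention-MIL scheme: each patch embedding in a whole-slide bag receives a learned attention weight, and the bag-level representation is the attention-weighted sum. A minimal NumPy sketch is below; the function name, weight shapes, and random inputs are illustrative assumptions, not the paper's actual AMIML architecture or its parameter-reduction scheme.

```python
import numpy as np

def attention_mil_pool(H, V, w):
    """Generic attention-based MIL pooling (illustrative, not AMIML itself).

    H: (n_patches, d) bag of patch embeddings from a whole-slide image
    V: (d, k) weights of the first attention layer (hypothetical shapes)
    w: (k,)  weights of the second attention layer
    Returns the bag-level embedding and the per-patch attention weights.
    """
    scores = np.tanh(H @ V) @ w          # (n_patches,) unnormalized scores
    a = np.exp(scores - scores.max())    # numerically stable softmax
    a = a / a.sum()                      # attention weights sum to 1
    z = a @ H                            # (d,) attention-weighted bag embedding
    return z, a

# Toy bag: 8 patches with 4-dimensional embeddings
rng = np.random.default_rng(0)
H = rng.standard_normal((8, 4))
V = rng.standard_normal((4, 3))
w = rng.standard_normal(3)
z, a = attention_mil_pool(H, V, w)
```

The attention weights `a` are what make this pooling interpretable: patches with the largest weights are the ones the model deems most predictive, which is how an attention-MIL model can highlight informative image regions.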
Guo Bangwei, Li Xingyu, Yang Miaomiao, Zhang Hong, Xu Xu Steven
2023-Jan-24
Attention mechanism, Deep learning, Gene Mutation, Multiple Instance Learning, Whole slide images