In BioMedical Engineering OnLine
BACKGROUND: Alzheimer's disease (AD) is a degenerative brain disorder that typically occurs in people over 65 years of age. Because advanced AD is difficult to manage, accurate diagnosis of the disorder is critical. Previous studies have demonstrated effective deep learning methods for AD classification. However, deep learning methods require large image datasets, and medical images are affected by various environmental factors. In the current study, we propose a deep learning-based method for diagnosing AD from F-18 fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT) that is less sensitive to dataset differences during external validation.
RESULTS: The accuracy, sensitivity, and specificity of our proposed network were 86.09%, 80.00%, and 92.96%, respectively, on our dataset, and 91.02%, 87.93%, and 93.57%, respectively, on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. We observed that our model classified AD and normal cognition (NC) cases based on the posterior cingulate cortex (PCC), where pathological changes occur in AD. The global average pooling (GAP) layer significantly outperformed the fully connected layer on both datasets in accuracy, sensitivity, and specificity (p < 0.01). In addition, a performance comparison between the ADNI dataset and our dataset showed no statistically significant differences in accuracy, sensitivity, or specificity (p > 0.05).
CONCLUSIONS: The proposed model demonstrated the effectiveness of the GAP layer for AD classification. Our model learned AD features from the PCC in both the ADNI and Severance datasets, as shown in the heatmaps. Furthermore, statistical analysis showed no significant difference in performance between the two datasets.
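To illustrate the idea behind the GAP-based classification head and the heatmaps mentioned above, here is a minimal NumPy sketch. This is not the authors' implementation: the shapes (64 channels of 7x7 feature maps, 2 classes for AD vs. NC) and all function names are illustrative. GAP collapses each channel's feature map to one scalar before a single linear layer, and the same linear weights can then be projected back onto the feature maps to form a class activation map (CAM)-style heatmap.

```python
import numpy as np

rng = np.random.default_rng(0)

def gap_head(features, weights, bias):
    """GAP classification head.

    features: (C, H, W) convolutional feature maps
    weights:  (num_classes, C) linear classifier weights
    Returns (num_classes,) logits.
    """
    pooled = features.mean(axis=(1, 2))   # (C,) one scalar per channel
    return weights @ pooled + bias        # (num_classes,) logits

def class_activation_map(features, weights, class_idx):
    """CAM-style heatmap: the class's classifier weights applied as a
    weighted sum over the channel feature maps, giving an (H, W) map."""
    return np.tensordot(weights[class_idx], features, axes=1)

# Illustrative shapes only (not from the paper).
features = rng.standard_normal((64, 7, 7))
W = rng.standard_normal((2, 64))
b = np.zeros(2)

logits = gap_head(features, W, b)
print(logits.shape)  # (2,)

cam = class_activation_map(features, W, class_idx=0)
print(cam.shape)     # (7, 7)
```

Because the GAP head has only `num_classes * C` weights (versus the much larger weight matrix of a fully connected layer over the flattened `C * H * W` features), it reduces the parameter count and makes the weighted-sum heatmap interpretation straightforward.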
Kim Han Woong, Lee Ha Eun, Oh KyeongTaek, Lee Sangwon, Yun Mijin, Yoo Sun K
Alzheimer’s disease, Convolutional neural network, Deep learning, External validation, F-18 FDG-PET/CT, Feasibility study