
In British Journal of Cancer; h5-index 89.0

BACKGROUND: This study aims to develop an attention-based deep learning model for distinguishing benign from malignant breast lesions on contrast-enhanced spectral mammography (CESM).

METHODS: Preoperative CESM images of 1239 patients with pathologically confirmed diagnoses in a multicentre cohort were divided into training, validation, internal test, and external test sets. The regions of interest of the breast lesions were outlined manually by a senior radiologist. We adopted three conventional convolutional neural networks (CNNs), namely DenseNet 121, Xception, and ResNet 50, as the backbone architectures and incorporated the convolutional block attention module (CBAM) into them for classification. The performance of the models was analysed in terms of the receiver operating characteristic (ROC) curve, accuracy, positive predictive value (PPV), negative predictive value (NPV), F1 score, precision-recall curve (PRC), and heat maps. The final models were compared with the diagnostic performance of the conventional CNNs, radiomics models, and two radiologists with specialised breast imaging experience.
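The abstract does not give implementation details, so as context only, the CBAM mechanism referenced above can be sketched as follows: a channel-attention step (average- and max-pooled descriptors passed through a shared MLP) followed by a spatial-attention step (a convolution over stacked channel-wise average and max maps), each producing sigmoid weights that rescale the feature map. This is a minimal NumPy illustration of the published CBAM formulation, not the authors' code; all weight shapes and the reduction ratio `r` are assumptions for the demo.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat, w1, w2):
    # feat: (C, H, W). Shared MLP (w1: (C//r, C), w2: (C, C//r)) is applied
    # to both the average-pooled and max-pooled channel descriptors.
    avg = feat.mean(axis=(1, 2))                     # (C,)
    mx = feat.max(axis=(1, 2))                       # (C,)
    mlp = lambda v: w2 @ np.maximum(w1 @ v, 0.0)     # ReLU hidden layer
    scale = sigmoid(mlp(avg) + mlp(mx))              # per-channel weights in (0, 1)
    return feat * scale[:, None, None]

def spatial_attention(feat, kernel):
    # kernel: (2, k, k) convolved over the stacked channel-wise
    # average and max maps, with same-padding.
    avg_map = feat.mean(axis=0)
    max_map = feat.max(axis=0)
    stacked = np.stack([avg_map, max_map])           # (2, H, W)
    k = kernel.shape[-1]
    pad = k // 2
    padded = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    H, W = avg_map.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kernel)
    return feat * sigmoid(out)[None, :, :]           # per-position weights in (0, 1)

def cbam(feat, w1, w2, kernel):
    # Channel attention first, then spatial attention, as in CBAM.
    return spatial_attention(channel_attention(feat, w1, w2), kernel)

# Demo on a random feature map (sizes chosen arbitrarily for illustration).
rng = np.random.default_rng(0)
C, H, W, r = 8, 5, 5, 2
feat = rng.standard_normal((C, H, W))
w1 = rng.standard_normal((C // r, C)) * 0.1
w2 = rng.standard_normal((C, C // r)) * 0.1
kernel = rng.standard_normal((2, 7, 7)) * 0.1
refined = cbam(feat, w1, w2, kernel)
print(refined.shape)  # same shape as the input feature map
```

In the paper's setting, a module like this would be inserted after convolutional blocks of the DenseNet 121, Xception, or ResNet 50 backbone; because both attention maps are sigmoid-gated multiplications, the refined features keep the backbone's tensor shapes and can be dropped in without architectural changes.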

RESULTS: The best-performing deep learning model, the CBAM-based Xception, achieved an area under the ROC curve (AUC) of 0.970, a sensitivity of 0.848, a specificity of 1.000, and an accuracy of 0.891 on the external test set, exceeding the other CNNs, the radiomics models, and the radiologists. The PRC and the heat maps also indicated the favourable predictive performance of the attention-based CNN model. The diagnostic performance of both radiologists improved with deep learning assistance.

CONCLUSIONS: An attention-based deep learning model built on CESM images can help distinguish benign from malignant breast lesions, and the diagnostic performance of radiologists improved with deep learning assistance.

Mao Ning, Zhang Haicheng, Dai Yi, Li Qin, Lin Fan, Gao Jing, Zheng Tiantian, Zhao Feng, Xie Haizhu, Xu Cong, Ma Heng

2022-Dec-15