
BACKGROUND : Screening mammography has reduced breast cancer-specific mortality and is recommended worldwide. However, the resulting workload for doctors who read mammographic scans needs to be addressed. Although computer-aided detection (CAD) systems have been developed to support readers, the findings are conflicting as to whether traditional CAD systems improve reading performance. Rapid progress in artificial intelligence (AI) has led to newer CAD systems based on deep learning algorithms, which have the potential to reach human performance levels. Those systems, however, have been developed using mammography images mainly from women in Western countries. Because Asian women characteristically have higher-density breasts, it is uncertain whether those AI systems are applicable to Japanese women. In this study, we will construct a deep learning-based CAD system trained on mammography images from a large number of Japanese women with high-quality readings.

METHODS : We will collect digital mammography images taken for screening or diagnostic purposes at multiple institutions in Japan. A total of 15,000 images will be collected, consisting of 5,000 images with breast cancer and 10,000 images with benign lesions. At least 1,000 images of normal breasts will also be collected for use as reference data. With these data, we will construct a deep learning-based AI system to detect breast cancer on mammograms. The primary endpoints will be the sensitivity and specificity of the AI system on the test image set.
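The primary endpoints named above are standard confusion-matrix quantities. A minimal sketch of how they would be computed on a test set follows; the counts used are made-up illustrative numbers, not trial data.

```python
# Sensitivity and specificity from confusion-matrix counts.
# All counts below are hypothetical, for illustration only.

def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: TP / (TP + FN) -- fraction of cancers detected."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: TN / (TN + FP) -- fraction of non-cancers cleared."""
    return tn / (tn + fp)

# Hypothetical results on a cancer-detection test set
tp, fn = 90, 10    # cancers detected vs. missed by the AI reader
tn, fp = 170, 30   # non-cancers correctly cleared vs. falsely flagged

print(f"sensitivity = {sensitivity(tp, fn):.2f}")  # 0.90
print(f"specificity = {specificity(tn, fp):.2f}")  # 0.85
```

In a reader-triage setting like the one discussed below, specificity governs how many normal or benign images can safely be removed from the human reading queue.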

DISCUSSION : If AI reading is shown to be on a par with human reading, images of normal breasts or benign lesions that do not need human review could be selected out by the AI beforehand. Our AI might also work well for other Asian women whose breast density, size, and shape are similar to those of Japanese women.

TRIAL REGISTRATION : UMIN, trial number UMIN000039009. Registered 26 December 2019.

Yamaguchi Takeshi, Inoue Kenichi, Tsunoda Hiroko, Uematsu Takayoshi, Shinohara Norimitsu, Mukai Hirofumi