In Medical Engineering & Physics
Ultrasound (US) is an important imaging modality used to assess breast lesions for malignant features. In the past decade, many machine learning models have been developed for automated discrimination of breast cancer versus normal tissue on US images, but few have classified the images according to Breast Imaging Reporting and Data System (BI-RADS) classes. This work aimed to develop a model for classifying US breast lesions within a BI-RADS classification framework using a new multi-class US image dataset. We proposed a deep model that combined a novel pyramid triple deep feature generator (PTDFG) with transfer learning based on three pre-trained networks for creating deep features. Bilinear interpolation was applied to decompose the input image into four images of successively smaller dimensions, constituting a four-level pyramid for downstream feature generation with the pre-trained networks. Neighborhood component analysis was applied to the generated features to select each network's 1,000 most informative features, which were fed to a support vector machine classifier for automated classification using a ten-fold cross-validation strategy. Our proposed model was validated on a new US image dataset containing 1,038 images divided into eight BI-RADS classes, with histopathological results. We defined three classification schemes: Case 1, classification of all images into eight categories; Case 2, classification of breast US images into five BI-RADS classes; and Case 3, classification of BI-RADS 4 lesions into benign versus malignant classes. Our PTDFG-based transfer learning model attained accuracy rates of 79.29%, 80.42%, and 88.67% for Case 1, Case 2, and Case 3, respectively.
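The pyramid decomposition described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 224×224 input size and the halving ratio between successive pyramid levels are assumptions, and the downstream pre-trained feature extractors, NCA selection, and SVM classifier are omitted.

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Resize a 2-D grayscale image with bilinear interpolation."""
    h, w = img.shape
    # Source coordinates sampled for each output pixel.
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]  # fractional row weights
    wx = (xs - x0)[None, :]  # fractional column weights
    # Blend the four neighbouring pixels of each sample point.
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def build_pyramid(img, levels=4):
    """Build a pyramid of `levels` images of successively smaller size.

    The halving ratio per level is an assumption for illustration.
    """
    pyramid = [img]
    for _ in range(levels - 1):
        h, w = pyramid[-1].shape
        pyramid.append(bilinear_resize(pyramid[-1], h // 2, w // 2))
    return pyramid

pyramid = build_pyramid(np.zeros((224, 224)), levels=4)
print([p.shape for p in pyramid])
# four levels: (224, 224), (112, 112), (56, 56), (28, 28)
```

In the full model, each of the four pyramid levels would be passed through the three pre-trained networks to generate deep features before NCA-based selection.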
Kaplan Ela, Chan Wai Yee, Dogan Sengul, Barua Prabal D, Bulut Haci Taner, Tuncer Turker, Cizik Mert, Tan Ru-San, Acharya U Rajendra
Artificial intelligence, BI-RADS, Deep feature generator, Pyramid structure, Ultrasound breast