
In Critical reviews in biomedical engineering

Many researchers have developed computer-assisted diagnostic (CAD) methods to diagnose breast cancer from histopathology microscope images. These techniques improve the accuracy of biopsy diagnosis on hematoxylin and eosin-stained images. However, most CAD systems rely on inefficient and time-consuming manual feature extraction. We present a deep learning (DL) model with convolutional layers that extracts the most informative image features for breast cancer classification. Hematoxylin and eosin-stained breast biopsy images can be categorized into four groups: normal tissue, benign lesions, carcinoma in situ, and invasive carcinoma. Accurate classification of these histopathological images is essential for distinguishing the different types of breast cancer. The MobileNet architecture is used to achieve high accuracy with low resource utilization. The proposed model is fast, inexpensive, and safe, making it suitable for detecting breast cancer at an early stage, and this lightweight deep neural network can be accelerated on field-programmable gate arrays. The model is trained with categorical cross-entropy, which teaches it to assign a high probability to the correct class and low probabilities to the others; this loss is applied in the classification stage of the convolutional neural network (CNN) after the clustering stage, improving the performance of the proposed system. To measure training and validation accuracy, the model was trained on Google Colab for 280 epochs on a GPU with 2,496 CUDA cores, 12 GB of GDDR5 VRAM, and 12.6 GB of RAM. Our results demonstrate that a deep CNN combined with a chi-square test improves the accuracy of histopathological breast cancer image classification by more than 11% compared with other state-of-the-art methods.
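The abstract notes that the model learns via categorical cross-entropy to give the correct class a high probability and the other classes low probabilities. A minimal NumPy sketch of that loss over the four tissue classes (the class order and probability values below are illustrative, not taken from the paper):

```python
import numpy as np

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean cross-entropy between one-hot labels and predicted class probabilities."""
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return float(-np.mean(np.sum(y_true * np.log(y_pred), axis=1)))

# Four classes: normal, benign, carcinoma in situ, invasive (order is hypothetical).
y_true = np.array([[0.0, 0.0, 1.0, 0.0]])            # ground truth: carcinoma in situ
confident = np.array([[0.05, 0.05, 0.85, 0.05]])      # high probability on the true class
uniform = np.array([[0.25, 0.25, 0.25, 0.25]])        # no preference among classes

# Loss is -log(p_true): ~0.163 for the confident prediction, ~1.386 for the uniform one,
# so minimizing it pushes probability mass toward the correct class.
assert categorical_cross_entropy(y_true, confident) < categorical_cross_entropy(y_true, uniform)
```

In a Keras-style pipeline this is the standard `categorical_crossentropy` loss applied to the softmax output of the final classification layer.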

Laxmisagar H S, Hanumantharaju M C

2022