
In Computerized Medical Imaging and Graphics: the official journal of the Computerized Medical Imaging Society

Oral Squamous Cell Carcinoma (OSCC) is the most prevalent type of oral cancer worldwide. Histopathological examination is the gold standard for OSCC assessment: stained histopathology slides allow cell structures to be studied and analyzed under a microscope to determine the stage and grade of OSCC. One widely used staining method, H&E staining, produces differential coloration, highlights key tissue features, and improves contrast, making cell analysis easier. However, stained H&E histopathology images exhibit both inter- and intra-variation arising from staining techniques, incubation times, and staining reagents. These variations negatively impact the accuracy and development of computer-aided diagnosis (CAD) and machine learning algorithms. A pre-processing step called stain normalization must therefore be employed to reduce the negative impact of stain variance. Numerous state-of-the-art stain normalization methods have been introduced; however, a robust multi-domain stain normalization approach is still required because, in real-world settings, OSCC histopathology images contain more than two color variations spanning several domains. In this paper, a multi-domain stain translation method is proposed. The proposed method is an attention-gated generator based on a Conditional Generative Adversarial Network (cGAN) with a novel objective function that enforces color-distribution and perceptual resemblance between the source and target domains. Instead of using WSI scanner images as in previous techniques, the proposed method is evaluated on OSCC histopathology images acquired with several conventional microscopes coupled with cameras. In inference mode, the method receives the L* channel from the L*a*b* color space and generates the color-adapted G(a*b*) channels. The technique translates the source domain to the target domain using mappings learned during the training phase; these mappings are learned from the whole color distribution of the target domain rather than from a single reference image. The proposed technique outperforms four state-of-the-art methods in multi-domain OSCC histopathology translation, a claim supported by both quantitative and qualitative assessment.
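The L* -> G(a*b*) inference step described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `generator` callable, its input/output shapes, and the use of scikit-image for the L*a*b* conversion are all assumptions made for the example.

```python
import numpy as np
from skimage import color  # RGB <-> L*a*b* conversion

def stain_translate(rgb_image, generator):
    """Sketch of the L* -> G(a*b*) inference step described above.

    `generator` is a placeholder for a trained attention-gated cGAN
    generator; its exact interface is assumed for illustration only.
    """
    # Convert the source-domain RGB image to L*a*b* color space.
    lab = color.rgb2lab(rgb_image)
    L = lab[..., 0]  # luminance channel, passed through unchanged

    # The generator predicts color-adapted a* and b* channels from L*.
    # Assumed input shape (1, H, W, 1) and output shape (1, H, W, 2).
    ab_translated = generator(L[np.newaxis, ..., np.newaxis])
    ab_translated = np.squeeze(ab_translated, axis=0)

    # Recombine the original L* with the generated a*b* channels and
    # convert back to RGB to obtain the stain-translated image.
    lab_out = np.concatenate([L[..., np.newaxis], ab_translated], axis=-1)
    return color.lab2rgb(lab_out)
```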

Barun Barua, Kangkana Bora, Anup Kr Das, Gazi N Ahmed, Tashnin Rahman

2023-Feb-24

Attention gated generator, Conditional generative adversarial network (cGAN), H&E stain normalization, Histopathology, OSCC, Stain translation