In Pacific Symposium on Biocomputing.
The integration of multi-modal data, such as histopathological images and genomic data, is essential for understanding cancer heterogeneity and complexity, for personalizing treatments, and for enhancing survival prediction in cancer studies. Histopathology, a clinical gold standard for cancer diagnosis and prognosis, allows clinicians to make precise therapeutic decisions, whereas high-throughput genomic data have been investigated to dissect the genetic mechanisms of cancer. We propose a biologically interpretable deep learning model (PAGE-Net) that integrates histopathological images and genomic data, not only to improve survival prediction but also to identify genetic and histopathological patterns that underlie differing survival rates among patients. PAGE-Net consists of pathology-, genome-, and demography-specific layers, each of which supports comprehensive biological interpretation. In particular, we propose a novel patch-wise, texture-based convolutional neural network with a patch-aggregation strategy that extracts globally survival-discriminative features without manual annotation for the pathology-specific layers. For the genome-specific layers, we adapted a pathway-based sparse deep neural network, Cox-PASNet. The proposed model was assessed on histopathological images and gene expression data for Glioblastoma Multiforme (GBM) from The Cancer Genome Atlas (TCGA) and The Cancer Imaging Archive (TCIA). PAGE-Net achieved a C-index of 0.702, higher than the results obtained with histopathological images alone (0.509) or with Cox-PASNet (0.640). More importantly, PAGE-Net can simultaneously identify histopathological and genomic prognostic factors associated with patients' survival. The source code of PAGE-Net is publicly available at https://github.com/DataX-JieHao/PAGE-Net.
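The C-index values reported above (0.702 vs. 0.509 and 0.640) are concordance indices, the standard evaluation metric for survival models. The following is a minimal sketch of Harrell's C-index under right censoring; the function name and the toy patient data are illustrative, not taken from the paper or its code.

```python
import numpy as np

def concordance_index(times, events, risk_scores):
    """Harrell's C-index: the fraction of comparable patient pairs whose
    predicted risk ordering agrees with the observed survival ordering.

    A pair (i, j) is comparable when the patient with the shorter observed
    time actually experienced the event (times[i] < times[j], events[i] == 1);
    pairs where the shorter time is censored are skipped. Tied risk scores
    count as half-concordant.
    """
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if times[i] < times[j] and events[i] == 1:
                comparable += 1
                if risk_scores[i] > risk_scores[j]:
                    concordant += 1.0        # higher risk, shorter survival
                elif risk_scores[i] == risk_scores[j]:
                    concordant += 0.5        # tie in predicted risk
    return concordant / comparable

# Toy cohort: survival times (months), event indicators (1 = death,
# 0 = censored), and model risk scores that perfectly rank the patients.
times = np.array([2.0, 5.0, 8.0, 11.0])
events = np.array([1, 1, 0, 1])
risks = np.array([0.9, 0.6, 0.4, 0.1])
print(concordance_index(times, events, risks))  # → 1.0
```

A C-index of 0.5 corresponds to random risk ordering and 1.0 to perfect ranking, which is why 0.702 represents a meaningful improvement over the 0.509 achieved with images alone.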
Jie Hao, Sai Chandra Kosaraju, Nelson Zange Tsaku, Dae Hyun Song, Mingon Kang