In IEEE Journal of Biomedical and Health Informatics
Deep learning has been used to analyze and diagnose various skin diseases through medical imaging. However, recent studies show that a well-trained deep learning model may not generalize well to data from different cohorts due to domain shift. Simple data fusion techniques, such as combining disease samples from different data sources, are not effective at solving this problem. In this paper, we present two methods for the novel task of cross-domain skin disease recognition. Starting from a fully supervised deep convolutional neural network classifier pre-trained on ImageNet, we explore a two-step progressive transfer learning technique that fine-tunes the network on two skin disease datasets. We then propose to adopt adversarial learning as a domain adaptation technique, performing invariant attribute translation from the source to the target domain in order to improve recognition performance. To evaluate these two methods, we analyze the generalization capability of the trained models on melanoma detection, cancer detection, and cross-modality learning tasks, using two skin image datasets collected from different clinical settings and cohorts with different disease distributions. The experiments demonstrate the effectiveness of our methods in addressing the domain shift problem.
Gu Yanyang, Ge Zongyuan, Bonnington C Paul, Zhou Jun
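The two-step progressive fine-tuning idea can be illustrated on a toy problem. The sketch below is not the paper's pipeline: a logistic-regression "network" and synthetic 2-D data stand in for the ImageNet-pre-trained CNN and the two skin datasets, and all dataset shapes, learning rates, and step counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def fit(w, X, y, lr, steps):
    # Plain gradient descent on the logistic (cross-entropy) loss.
    for _ in range(steps):
        w = w - lr * X.T @ (sigmoid(X @ w) - y) / len(y)
    return w

def acc(w, X, y):
    return float(np.mean((sigmoid(X @ w) > 0.5) == y))

def with_bias(X):
    # Append a constant feature so the boundary need not pass the origin.
    return np.hstack([X, np.ones((len(X), 1))])

# Stand-in "source" skin dataset: plentiful labels, boundary x0 + x1 > 0.
Xs = with_bias(rng.normal(size=(500, 2)))
ys = (Xs[:, 0] + Xs[:, 1] > 0).astype(float)

# Stand-in "target" cohort: fewer labels, shifted inputs and a shifted
# decision boundary -- a toy version of domain shift between cohorts.
Xt = with_bias(rng.normal(loc=0.5, size=(80, 2)))
yt = (Xt[:, 0] + Xt[:, 1] > 1.0).astype(float)

w = rng.normal(scale=0.01, size=3)      # stand-in for pre-trained weights
w = fit(w, Xs, ys, lr=0.5, steps=300)   # step 1: fine-tune on source data
before = acc(w, Xt, yt)                 # source-tuned model on the target
w = fit(w, Xt, yt, lr=0.1, steps=300)   # step 2: fine-tune on target data
after = acc(w, Xt, yt)                  # progressively adapted model
```

The second stage uses a lower learning rate, mirroring the usual practice of adapting pre-trained weights gently rather than retraining from scratch; on this toy data, target accuracy improves after the second fine-tuning step.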
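The adversarial domain adaptation step can likewise be sketched with DANN-style gradient reversal (an assumption: the abstract does not specify the exact architecture or losses). A shared feature extractor is updated to help a label head trained on source labels while fooling a domain discriminator, by reversing the domain loss gradient that flows into the features. Everything below is synthetic and illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Two toy cohorts: the same labelling rule, but shifted inputs (domain shift).
Xs = rng.normal(size=(300, 2))
ys = (Xs[:, 0] > 0).astype(float)          # labels known for source only
Xt = rng.normal(size=(300, 2)) + 2.0       # unlabelled, shifted target

Wf = rng.normal(scale=0.1, size=(2, 2))    # shared linear feature extractor
wy = rng.normal(scale=0.1, size=2)         # label head (source labels only)
wd = rng.normal(scale=0.1, size=2)         # domain discriminator
lam, lr = 1.0, 0.05
ns, nt = len(Xs), len(Xt)

for _ in range(800):
    Fs, Ft = Xs @ Wf, Xt @ Wf
    # Label loss on source: mean cross-entropy, and its gradients.
    py = sigmoid(Fs @ wy)
    g_wy = Fs.T @ (py - ys) / ns
    dFs_y = np.outer(py - ys, wy) / ns
    # Domain loss: source labelled 0, target labelled 1.
    ps, pt = sigmoid(Fs @ wd), sigmoid(Ft @ wd)
    g_wd = Fs.T @ ps / ns + Ft.T @ (pt - 1) / nt
    dFs_d = np.outer(ps, wd) / ns
    dFt_d = np.outer(pt - 1, wd) / nt
    # Heads descend their own losses; the feature extractor descends the
    # label loss but ASCENDS the domain loss (the gradient reversal).
    wy -= lr * g_wy
    wd -= lr * g_wd
    Wf -= lr * (Xs.T @ dFs_y - lam * (Xs.T @ dFs_d + Xt.T @ dFt_d))

src_acc = float(np.mean((sigmoid(Xs @ Wf @ wy) > 0.5) == ys))
dom_acc = float(np.mean(np.concatenate([sigmoid(Xs @ Wf @ wd) < 0.5,
                                        sigmoid(Xt @ Wf @ wd) > 0.5])))
```

At equilibrium the extractor suppresses the direction that separates the domains, so the discriminator drifts toward chance while the label head stays useful; this is the sense in which the features become domain-invariant.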