
Xiaorong Wang

One Hyper-Initializer for All Network Architectures in Medical Image Analysis

Jun 08, 2022
Fangxin Shang, Yehui Yang, Dalu Yang, Junde Wu, Xiaorong Wang, Yanwu Xu

Figures 1–4 for One Hyper-Initializer for All Network Architectures in Medical Image Analysis

Pre-training is essential to deep learning model performance, especially in medical image analysis tasks where limited training data are available. However, existing pre-training methods are inflexible, as the pre-trained weights of one model cannot be reused by other network architectures. In this paper, we propose an architecture-irrelevant hyper-initializer, which can initialize any given network architecture well after being pre-trained only once. The proposed initializer is a hypernetwork that takes a downstream architecture, represented as an input graph, and outputs the initialization parameters of that architecture. We show the effectiveness and efficiency of the hyper-initializer through extensive experimental results on multiple medical imaging modalities, especially in data-limited fields. Moreover, we demonstrate that the proposed algorithm can be reused as a favorable plug-and-play initializer for any downstream architecture and task (both classification and segmentation) of the same modality.
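The abstract's core idea — a hypernetwork that maps an architecture description to initialization parameters — could be sketched in a heavily simplified form. The snippet below is a hypothetical toy stand-in, not the paper's method: it replaces the graph hypernetwork with a tiny model over per-layer `(fan_in, fan_out)` descriptors, and its parameters would be pre-trained in the real setting rather than drawn at random.

```python
import numpy as np

rng = np.random.default_rng(0)


def hyper_initializer(layer_specs, embed_dim=8):
    """Toy stand-in for an architecture-conditioned initializer.

    Maps each layer descriptor (fan_in, fan_out) to a predicted
    per-layer scale, then samples that layer's weight matrix with
    the predicted scale. All names and shapes here are illustrative
    assumptions, not the paper's actual hypernetwork.
    """
    # Hypernetwork parameters (pre-trained once in the paper's setup;
    # random here purely for the sketch).
    W_h = rng.normal(0.0, 0.1, size=(2, embed_dim))
    v_h = rng.normal(0.0, 0.1, size=embed_dim)

    weights = []
    for fan_in, fan_out in layer_specs:
        desc = np.array([np.log(fan_in), np.log(fan_out)])
        scale = np.exp(desc @ W_h @ v_h)  # predicted positive scale
        weights.append(rng.normal(0.0, scale, size=(fan_in, fan_out)))
    return weights


# Any downstream architecture, described layer by layer:
arch = [(784, 256), (256, 64), (64, 10)]
init = hyper_initializer(arch)
```

The point of the sketch is the interface: the initializer never sees the downstream task, only a description of the architecture, so the same pre-trained hypernetwork can emit initializations for arbitrary networks.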


Contrastive Centroid Supervision Alleviates Domain Shift in Medical Image Classification

May 31, 2022
Wenshuo Zhou, Dalu Yang, Binghong Wu, Yehui Yang, Junde Wu, Xiaorong Wang, Lei Wang, Haifeng Huang, Yanwu Xu

Figures 1–4 for Contrastive Centroid Supervision Alleviates Domain Shift in Medical Image Classification

Deep learning based medical imaging classification models usually suffer from the domain shift problem, where classification performance drops when training data and real-world data differ in imaging equipment manufacturer, image acquisition protocol, patient population, etc. We propose Feature Centroid Contrast Learning (FCCL), which can improve target domain classification performance through extra supervision during training, using a contrastive loss between each instance and its class centroid. Compared with current unsupervised domain adaptation and domain generalization methods, FCCL performs better while requiring only labeled image data from a single source domain and no target domain data. We verify through extensive experiments that FCCL achieves superior performance on at least three imaging modalities, i.e., fundus photographs, dermatoscopic images, and H&E tissue images.
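The instance-to-centroid contrastive supervision described above could be sketched as follows. This is a simplified assumption of how such a loss might look (cosine similarity to per-class centroids, softmax over classes), not FCCL's exact formulation; the temperature `tau` and the numpy implementation are illustrative choices.

```python
import numpy as np


def centroid_contrastive_loss(feats, labels, centroids, tau=0.1):
    """Contrastive loss between instance features and class centroids.

    Each instance is pulled toward its own class centroid and pushed
    away from the other centroids: a softmax over cosine similarities
    to all centroids, with the instance's class as the positive.
    A simplified sketch of centroid-level contrastive supervision.
    """
    # Normalize so dot products are cosine similarities.
    f = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    c = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)

    logits = f @ c.T / tau                       # (N, num_classes)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Cross-entropy against each instance's own class centroid.
    return -log_probs[np.arange(len(labels)), labels].mean()
```

Because the loss only needs source-domain labels and features, it adds supervision without any target-domain data, consistent with the single-source setting the abstract describes.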
