Seyedeh Sahar Taheri Otaghsara

Multi-encoder nnU-Net outperforms Transformer models with self-supervised pretraining

Apr 04, 2025