Language models are good pathologists: using attention-based sequence reduction and text-pretrained transformers for efficient WSI classification

Nov 14, 2022
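The title summarizes the approach: an attention-based module first reduces the very long sequence of patch embeddings extracted from a whole-slide image (WSI) to a short, fixed-length sequence, which is then classified by a text-pretrained transformer. Below is a minimal, hypothetical sketch of such an attention-based sequence reduction step, written in PyTorch; the class name, embedding dimension, and number of query tokens are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class AttentionSequenceReduction(nn.Module):
    """Hypothetical sketch: reduce a variable-length bag of patch
    embeddings to K tokens via cross-attention from K learnable queries."""

    def __init__(self, dim: int = 768, num_queries: int = 128, num_heads: int = 8):
        super().__init__()
        # K learnable query vectors; choosing K << N is where the
        # efficiency comes from, since the downstream transformer
        # only ever sees K tokens instead of N.
        self.queries = nn.Parameter(torch.randn(num_queries, dim) * 0.02)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, patch_embeddings: torch.Tensor) -> torch.Tensor:
        # patch_embeddings: (batch, N, dim); N can be tens of thousands
        # of patches for a single whole-slide image.
        b = patch_embeddings.size(0)
        q = self.queries.unsqueeze(0).expand(b, -1, -1)   # (batch, K, dim)
        reduced, _ = self.attn(q, patch_embeddings, patch_embeddings)
        return reduced                                    # (batch, K, dim)

# Usage: 10,000 patch embeddings shrink to 128 tokens.
reducer = AttentionSequenceReduction(dim=768, num_queries=128)
x = torch.randn(1, 10_000, 768)   # features from an upstream patch encoder
print(reducer(x).shape)           # torch.Size([1, 128, 768])
```

The reduced tokens would then be passed to a text-pretrained transformer for slide-level classification; in a setup like the one the title describes, only the reduction module and a classification head would need training on pathology data.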