Melanie Boxberg

B-Cos Aligned Transformers Learn Human-Interpretable Features

Jan 18, 2024
Manuel Tran, Amal Lahiani, Yashin Dicente Cid, Melanie Boxberg, Peter Lienemann, Christian Matek, Sophia J. Wagner, Fabian J. Theis, Eldad Klaiman, Tingying Peng

Fully transformer-based biomarker prediction from colorectal cancer histology: a large-scale multicentric study

Jan 23, 2023
Sophia J. Wagner, Daniel Reisenbüchler, Nicholas P. West, Jan Moritz Niehues, Gregory Patrick Veldhuizen, Philip Quirke, Heike I. Grabsch, Piet A. van den Brandt, Gordon G. A. Hutchins, Susan D. Richman, Tanwei Yuan, Rupert Langer, Josien Christina Anna Jenniskens, Kelly Offermans, Wolfram Mueller, Richard Gray, Stephen B. Gruber, Joel K. Greenson, Gad Rennert, Joseph D. Bonner, Daniel Schmolze, Jacqueline A. James, Maurice B. Loughrey, Manuel Salto-Tellez, Hermann Brenner, Michael Hoffmeister, Daniel Truhn, Julia A. Schnabel, Melanie Boxberg, Tingying Peng, Jakob Nikolas Kather

Local Attention Graph-based Transformer for Multi-target Genetic Alteration Prediction

May 13, 2022
Daniel Reisenbüchler, Sophia J. Wagner, Melanie Boxberg, Tingying Peng

S5CL: Unifying Fully-Supervised, Self-Supervised, and Semi-Supervised Learning Through Hierarchical Contrastive Learning

Mar 14, 2022
Manuel Tran, Sophia J. Wagner, Melanie Boxberg, Tingying Peng

Structure-Preserving Multi-Domain Stain Color Augmentation using Style-Transfer with Disentangled Representations

Jul 26, 2021
Sophia J. Wagner, Nadieh Khalili, Raghav Sharma, Melanie Boxberg, Carsten Marr, Walter de Back, Tingying Peng
