Jonas Fischer

Max Planck Institute for Informatics

Temporal Concept Dynamics in Diffusion Models via Prompt-Conditioned Interventions

Dec 09, 2025

Disentangling Polysemantic Channels in Convolutional Neural Networks

Apr 17, 2025

VITAL: More Understandable Feature Visualization through Distribution Alignment and Relevant Information Flow

Mar 28, 2025

Escaping Plato's Cave: Robust Conceptual Reasoning through Interpretable 3D Neural Object Volumes

Mar 17, 2025

Unlocking Open-Set Language Accessibility in Vision Models

Mar 14, 2025

Now you see me! A framework for obtaining class-relevant saliency maps

Mar 10, 2025

Sailing in high-dimensional spaces: Low-dimensional embeddings through angle preservation

Jun 14, 2024

Not all tickets are equal and we know it: Guiding pruning with domain-specific knowledge

Mar 05, 2024

Finding Interpretable Class-Specific Patterns through Efficient Neural Search

Dec 07, 2023

Understanding and Mitigating Classification Errors Through Interpretable Token Patterns

Nov 18, 2023