
Mathilde Guillemot

Breaking Batch Normalization for better explainability of Deep Neural Networks through Layer-wise Relevance Propagation

Feb 24, 2020
[Figures 1–4]