
Hosein Mohebbi

Homophone Disambiguation Reveals Patterns of Context Mixing in Speech Transformers

Oct 15, 2023

DecoderLens: Layerwise Interpretation of Encoder-Decoder Transformers

Oct 05, 2023

Quantifying Context Mixing in Transformers

Feb 08, 2023

AdapLeR: Speeding up Inference by Adaptive Length Reduction

Mar 16, 2022

Not All Models Localize Linguistic Knowledge in the Same Place: A Layer-wise Probing on BERToids' Representations

Sep 15, 2021

Exploring the Role of BERT Token Representations to Explain Sentence Probing Results

Apr 03, 2021