Shan X. Wang

Hamming Attention Distillation: Binarizing Keys and Queries for Efficient Long-Context Transformers

Feb 03, 2025