Qixin Chang

FastAttention: Extend FlashAttention2 to NPUs and Low-resource GPUs

Oct 22, 2024