Timo Lohrenz
Relaxed Attention for Transformer Models

Sep 20, 2022
Timo Lohrenz, Björn Möller, Zhengyang Li, Tim Fingscheidt


Relaxed Attention: A Simple Method to Boost Performance of End-to-End Automatic Speech Recognition

Jul 02, 2021
Timo Lohrenz, Patrick Schwarz, Zhengyang Li, Tim Fingscheidt


Multi-Encoder Learning and Stream Fusion for Transformer-Based End-to-End Automatic Speech Recognition

Mar 31, 2021
Timo Lohrenz, Zhengyang Li, Tim Fingscheidt
