Benjamin Heinzerling

On the Importance of Subword Information for Morphological Tasks in Truly Low-Resource Languages

Sep 26, 2019
Yi Zhu, Benjamin Heinzerling, Ivan Vulić, Michael Strube, Roi Reichart, Anna Korhonen

Fine-Grained Entity Typing in Hyperbolic Space

Jun 06, 2019
Federico López, Benjamin Heinzerling, Michael Strube

Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation

Jun 04, 2019
Benjamin Heinzerling, Michael Strube

BPEmb: Tokenization-free Pre-trained Subword Embeddings in 275 Languages

Oct 05, 2017
Benjamin Heinzerling, Michael Strube

Revisiting Selectional Preferences for Coreference Resolution

Jul 20, 2017
Benjamin Heinzerling, Nafise Sadat Moosavi, Michael Strube
