Haejun Lee

Transformers Get Stable: An End-to-End Signal Propagation Theory for Language Models

Mar 14, 2024
Akhil Kedia, Mohd Abbas Zaidi, Sushil Khyalia, Jungho Jung, Harshith Goka, Haejun Lee

Span-Selective Linear Attention Transformers for Effective and Robust Schema-Guided Dialogue State Tracking

Jun 15, 2023
Björn Bebensee, Haejun Lee

FiE: Building a Global Probability Space by Leveraging Early Fusion in Encoder for Open-Domain Question Answering

Nov 18, 2022
Akhil Kedia, Mohd Abbas Zaidi, Haejun Lee

You Only Need One Model for Open-domain Question Answering

Dec 14, 2021
Haejun Lee, Akhil Kedia, Jongwon Lee, Ashwin Paranjape, Christopher D. Manning, Kyoung-Gu Woo

SLM: Learning a Discourse Language Representation with Sentence Unshuffling

Oct 30, 2020
Haejun Lee, Drew A. Hudson, Kangwook Lee, Christopher D. Manning

Retrieve, Rerank, Read, then Iterate: Answering Open-Domain Questions of Arbitrary Complexity from Text

Oct 23, 2020
Peng Qi, Haejun Lee, Oghenetegiri "TG" Sido, Christopher D. Manning

Syllable-level Neural Language Model for Agglutinative Language

Aug 18, 2017
Seunghak Yu, Nilesh Kulkarni, Haejun Lee, Jihie Kim

An Embedded Deep Learning based Word Prediction

Jul 06, 2017
Seunghak Yu, Nilesh Kulkarni, Haejun Lee, Jihie Kim
