Jong-Hyeok Lee
POSTECH, Korea
Bring More Attention to Syntactic Symmetry for Automatic Postediting of High-Quality Machine Translations

May 17, 2023
Baikjin Jung, Myungji Lee, Jong-Hyeok Lee, Yunsu Kim


Towards Semi-Supervised Learning of Automatic Post-Editing: Data-Synthesis by Infilling Mask with Erroneous Tokens

Apr 08, 2022
WonKee Lee, Seong-Hwan Heo, Baikjin Jung, Jong-Hyeok Lee


mcBERT: Momentum Contrastive Learning with BERT for Zero-Shot Slot Filling

Mar 24, 2022
Seong-Hwan Heo, WonKee Lee, Jong-Hyeok Lee


Modeling Inter-Speaker Relationship in XLNet for Contextual Spoken Language Understanding

Oct 28, 2019
Jonggu Kim, Jong-Hyeok Lee


Transformer-based Automatic Post-Editing with a Context-Aware Encoding Approach for Multi-Source Inputs

Aug 15, 2019
WonKee Lee, Junsu Park, Byung-Hyun Go, Jong-Hyeok Lee


Decay-Function-Free Time-Aware Attention to Context and Speaker Indicator for Spoken Language Understanding

Mar 29, 2019
Jonggu Kim, Jong-Hyeok Lee


Self-Attention-Based Message-Relevant Response Generation for Neural Conversation Model

May 23, 2018
Jonggu Kim, Doyeon Kong, Jong-Hyeok Lee


Multiple Range-Restricted Bidirectional Gated Recurrent Units with Attention for Relation Classification

Nov 01, 2017
Jonggu Kim, Jong-Hyeok Lee


Improving Term Frequency Normalization for Multi-topical Documents, and Application to Language Modeling Approaches

Feb 08, 2015
Seung-Hoon Na, In-Su Kang, Jong-Hyeok Lee


Unlimited Vocabulary Grapheme to Phoneme Conversion for Korean TTS

Jun 10, 1998
Byeongchang Kim, WonIl Lee, Geunbae Lee, Jong-Hyeok Lee
