Gichang Lee

On the Effect of Pretraining Corpora on In-context Learning by a Large-scale Language Model

Apr 28, 2022
Seongjin Shin, Sang-Woo Lee, Hwijeen Ahn, Sungdong Kim, HyoungSeok Kim, Boseop Kim, Kyunghyun Cho, Gichang Lee, Woomyoung Park, Jung-Woo Ha, Nako Sung

What Changes Can Large-scale Language Models Bring? Intensive Study on HyperCLOVA: Billions-scale Korean Generative Pretrained Transformers

Sep 10, 2021
Boseop Kim, HyoungSeok Kim, Sang-Woo Lee, Gichang Lee, Donghyun Kwak, Dong Hyeon Jeon, Sunghyun Park, Sungju Kim, Seonhoon Kim, Dongpil Seo, Heungsub Lee, Minyoung Jeong, Sungjae Lee, Minsub Kim, Suk Hyun Ko, Seokhun Kim, Taeyong Park, Jinuk Kim, Soyoung Kang, Na-Hyeon Ryu, Kang Min Yoo, Minsuk Chang, Soobin Suh, Sookyo In, Jinseong Park, Kyungduk Kim, Hiun Kim, Jisu Jeong, Yong Goo Yeo, Donghoon Ham, Dongju Park, Min Young Lee, Jaewook Kang, Inho Kang, Jung-Woo Ha, Woomyoung Park, Nako Sung

Sentiment Classification with Word Attention based on Weakly Supervised Learning with a Convolutional Neural Network

Sep 29, 2017
Gichang Lee, Jaeyun Jeong, Seungwan Seo, CzangYeob Kim, Pilsung Kang
