Shuangzhi Wu

Modeling Paragraph-Level Vision-Language Semantic Alignment for Multi-Modal Summarization

Aug 24, 2022
Xinnian Liang, Chenhao Cui, Shuangzhi Wu, Jiali Zeng, Yufan Jiang, Zhoujun Li


An Efficient Coarse-to-Fine Facet-Aware Unsupervised Summarization Framework based on Semantic Blocks

Aug 17, 2022
Xinnian Liang, Jing Li, Shuangzhi Wu, Jiali Zeng, Yufan Jiang, Mu Li, Zhoujun Li


UM4: Unified Multilingual Multiple Teacher-Student Model for Zero-Resource Neural Machine Translation

Jul 11, 2022
Jian Yang, Yuwei Yin, Shuming Ma, Dongdong Zhang, Shuangzhi Wu, Hongcheng Guo, Zhoujun Li, Furu Wei


Modeling Multi-Granularity Hierarchical Features for Relation Extraction

Apr 09, 2022
Xinnian Liang, Shuangzhi Wu, Mu Li, Zhoujun Li


Task-guided Disentangled Tuning for Pretrained Language Models

Mar 22, 2022
Jiali Zeng, Yufan Jiang, Shuangzhi Wu, Yongjing Yin, Mu Li


Learning Confidence for Transformer-based Neural Machine Translation

Mar 22, 2022
Yu Lu, Jiali Zeng, Jiajun Zhang, Shuangzhi Wu, Mu Li


One Model, Multiple Tasks: Pathways for Natural Language Understanding

Mar 07, 2022
Duyu Tang, Fan Zhang, Yong Dai, Cong Zhou, Shuangzhi Wu, Shuming Shi


Pretraining without Wordpieces: Learning Over a Vocabulary of Millions of Words

Feb 24, 2022
Zhangyin Feng, Duyu Tang, Cong Zhou, Junwei Liao, Shuangzhi Wu, Xiaocheng Feng, Bing Qin, Yunbo Cao, Shuming Shi


Unsupervised Keyphrase Extraction by Jointly Modeling Local and Global Context

Sep 15, 2021
Xinnian Liang, Shuangzhi Wu, Mu Li, Zhoujun Li


Improving Machine Reading Comprehension with Single-choice Decision and Transfer Learning

Nov 06, 2020
Yufan Jiang, Shuangzhi Wu, Jing Gong, Yahui Cheng, Peng Meng, Weiliang Lin, Zhibo Chen, Mu Li
