Yilin Niu

ChatGLM: A Family of Large Language Models from GLM-130B to GLM-4 All Tools
Jun 18, 2024

ChatGLM-RLHF: Practices of Aligning Large Language Models with Human Feedback
Apr 03, 2024

Towards Efficient and Exact Optimization of Language Model Alignment
Feb 02, 2024

A Semantic-based Method for Unsupervised Commonsense Question Answering
May 31, 2021

REPT: Bridging Language Models and Machine Reading Comprehension via Retrieval-Based Pre-training
May 18, 2021

A Self-Training Method for Machine Reading Comprehension with Soft Evidence Extraction
May 11, 2020

CoTK: An Open-Source Toolkit for Fast Development and Fair Evaluation of Text Generation
Feb 03, 2020

Word Embedding based Edit Distance
Oct 25, 2018