
Jinjie Ni

MixEval: Deriving Wisdom of the Crowd from LLM Benchmark Mixtures (Jun 03, 2024)

OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models (Jan 29, 2024)

A Survey on Semantic Processing Techniques (Oct 22, 2023)

Finding the Pillars of Strength for Multi-Head Attention (May 22, 2023)

Logical Reasoning over Natural Language as Knowledge Representation: A Survey (Mar 21, 2023)

Adaptive Knowledge Distillation between Text and Speech Pre-trained Models (Mar 07, 2023)

deHuBERT: Disentangling Noise in a Self-supervised Model for Robust Speech Recognition (Feb 28, 2023)

A Class-Aware Representation Refinement Framework for Graph Classification (Sep 02, 2022)

Fusing task-oriented and open-domain dialogues in conversational agents (Sep 09, 2021)

Recent Advances in Deep Learning Based Dialogue Systems: A Systematic Survey (Jun 01, 2021)