Pingyi Zhou

XL3M: A Training-free Framework for LLM Length Extension Based on Segment-wise Inference

May 28, 2024

Extending Context Window of Large Language Models via Semantic Compression

Dec 15, 2023

PanGu-Σ: Towards Trillion Parameter Language Model with Sparse Heterogeneous Computing

Mar 20, 2023

MultiCoder: Multi-Programming-Lingual Pre-Training for Low-Resource Code Completion

Dec 19, 2022

PanGu-Coder: Program Synthesis with Function-Level Language Modeling

Jul 22, 2022

CODE-MVP: Learning to Represent Source Code from Multiple Views with Contrastive Pre-Training

May 04, 2022

Compilable Neural Code Generation with Compiler Feedback

Mar 10, 2022

Pan More Gold from the Sand: Refining Open-domain Dialogue Training with Noisy Self-Retrieval Generation

Jan 27, 2022

SynCoBERT: Syntax-Guided Multi-Modal Contrastive Pre-Training for Code Representation

Sep 09, 2021