Martin Weyssow

CodeUltraFeedback: An LLM-as-a-Judge Dataset for Aligning Large Language Models to Coding Preferences

Mar 14, 2024
Martin Weyssow, Aton Kamanda, Houari Sahraoui

CodeLL: A Lifelong Learning Dataset to Support the Co-Evolution of Data and Language Models of Code

Dec 20, 2023
Martin Weyssow, Claudio Di Sipio, Davide Di Ruscio, Houari Sahraoui

Exploring Parameter-Efficient Fine-Tuning Techniques for Code Generation with Large Language Models

Aug 21, 2023
Martin Weyssow, Xin Zhou, Kisub Kim, David Lo, Houari Sahraoui

On the Usage of Continual Learning for Out-of-Distribution Generalization in Pre-trained Language Models of Code

May 06, 2023
Martin Weyssow, Xin Zhou, Kisub Kim, David Lo, Houari Sahraoui

AST-Probe: Recovering abstract syntax trees from hidden representations of pre-trained language models

Jun 23, 2022
José Antonio Hernández López, Martin Weyssow, Jesús Sánchez Cuadrado, Houari Sahraoui

Better Modeling the Programming World with Code Concept Graphs-augmented Multi-modal Learning

Jan 10, 2022
Martin Weyssow, Houari Sahraoui, Bang Liu

Recommending Metamodel Concepts during Modeling Activities with Pre-Trained Language Models

Apr 04, 2021
Martin Weyssow, Houari Sahraoui, Eugene Syriani
