Martin Weyssow

CodeLL: A Lifelong Learning Dataset to Support the Co-Evolution of Data and Language Models of Code

Dec 20, 2023
Martin Weyssow, Claudio Di Sipio, Davide Di Ruscio, Houari Sahraoui

Exploring Parameter-Efficient Fine-Tuning Techniques for Code Generation with Large Language Models

Aug 21, 2023
Martin Weyssow, Xin Zhou, Kisub Kim, David Lo, Houari Sahraoui

On the Usage of Continual Learning for Out-of-Distribution Generalization in Pre-trained Language Models of Code

May 06, 2023
Martin Weyssow, Xin Zhou, Kisub Kim, David Lo, Houari Sahraoui

AST-Probe: Recovering abstract syntax trees from hidden representations of pre-trained language models

Jun 23, 2022
José Antonio Hernández López, Martin Weyssow, Jesús Sánchez Cuadrado, Houari Sahraoui

Better Modeling the Programming World with Code Concept Graphs-augmented Multi-modal Learning

Jan 10, 2022
Martin Weyssow, Houari Sahraoui, Bang Liu

Recommending Metamodel Concepts during Modeling Activities with Pre-Trained Language Models

Apr 04, 2021
Martin Weyssow, Houari Sahraoui, Eugene Syriani
