Chuan Shi

Data-centric Prompt Tuning for Dynamic Graphs

Jan 17, 2026

Bridging Code Graphs and Large Language Models for Better Code Understanding

Dec 08, 2025

Unifying and Enhancing Graph Transformers via a Hierarchical Mask Framework

Oct 21, 2025

Transferable Parasitic Estimation via Graph Contrastive Learning and Label Rebalancing in AMS Circuits

Jul 09, 2025

Masked Language Models are Good Heterogeneous Graph Generalizers

Jun 06, 2025

Graph Positional Autoencoders as Self-supervised Learners

May 29, 2025

Data-centric Federated Graph Learning with Large Language Models

Mar 25, 2025

Blend the Separated: Mixture of Synergistic Experts for Data-Scarcity Drug-Target Interaction Prediction

Mar 20, 2025

HeTGB: A Comprehensive Benchmark for Heterophilic Text-Attributed Graphs

Mar 05, 2025

Between Circuits and Chomsky: Pre-pretraining on Formal Languages Imparts Linguistic Biases

Feb 26, 2025