
Yoshitomo Matsubara

A Transformer Model for Symbolic Regression towards Scientific Discovery
Dec 13, 2023

torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP
Oct 26, 2023

SplitBeam: Effective and Efficient Beamforming in Wi-Fi Networks Through Split Computing
Oct 12, 2023

Cross-Lingual Knowledge Distillation for Answer Sentence Selection in Low-Resource Languages
May 25, 2023

Rethinking Symbolic Regression Datasets and Benchmarks for Scientific Discovery
Jun 21, 2022

SC2: Supervised Compression for Split Computing
Mar 16, 2022

Ensemble Transformer for Efficient and Accurate Ranking Tasks: an Application to Question Answering Systems
Jan 15, 2022

BottleFit: Learning Compressed Representations in Deep Neural Networks for Effective and Efficient Split Computing
Jan 07, 2022

Supervised Compression for Resource-constrained Edge Computing Systems
Aug 21, 2021

Split Computing and Early Exiting for Deep Learning Applications: Survey and Research Challenges
Mar 08, 2021