Sijia Peng

A 2D Semantic-Aware Position Encoding for Vision Transformers

May 14, 2025
Rethinking Time Encoding via Learnable Transformation Functions

May 01, 2025
Mamba or Transformer for Time Series Forecasting? Mixture of Universals (MoU) Is All You Need

Aug 28, 2024