Feiyi Wang

HPC Digital Twins for Evaluating Scheduling Policies, Incentive Structures and their Impact on Power and Cooling

Aug 28, 2025

Intelligent Sampling of Extreme-Scale Turbulence Datasets for Accurate and Efficient Spatiotemporal Model Training

Aug 05, 2025

Pixel-Resolved Long-Context Learning for Turbulence at Exascale: Resolving Small-scale Eddies Toward the Viscous Limit

Jul 22, 2025

Distributed Cross-Channel Hierarchical Aggregation for Foundation Models

Jun 26, 2025

Analyzing 16,193 LLM Papers for Fun and Profits

Apr 15, 2025

ProTransformer: Robustify Transformers via Plug-and-Play Paradigm

Oct 30, 2024

A Digital Twin Framework for Liquid-cooled Supercomputers as Demonstrated at Exascale

Oct 07, 2024

Scalable Artificial Intelligence for Science: Perspectives, Methods and Exemplars

Jun 24, 2024

Pretraining Billion-scale Geospatial Foundational Models on Frontier

Apr 17, 2024

Optimizing Distributed Training on Frontier for Large Language Models

Dec 21, 2023