Stanley J. Osher

Wasserstein proximal operators describe score-based generative models and resolve memorization
Feb 09, 2024
Benjamin J. Zhang, Siting Liu, Wuchen Li, Markos A. Katsoulakis, Stanley J. Osher

PDE Generalization of In-Context Operator Networks: A Study on 1D Scalar Nonlinear Conservation Laws
Jan 21, 2024
Liu Yang, Stanley J. Osher

Prompting In-Context Operator Learning with Sensor Data, Equations, and Natural Language
Aug 09, 2023
Liu Yang, Tingwei Meng, Siting Liu, Stanley J. Osher

In-Context Operator Learning for Differential Equation Problems
Apr 17, 2023
Liu Yang, Siting Liu, Tingwei Meng, Stanley J. Osher

Momentum Transformer: Closing the Performance Gap Between Self-attention and Its Linearization
Aug 01, 2022
Tan Nguyen, Richard G. Baraniuk, Robert M. Kirby, Stanley J. Osher, Bao Wang

Multi-Agent Shape Control with Optimal Transport
Jun 30, 2022
Alex Tong Lin, Stanley J. Osher

Transformer with Fourier Integral Attentions
Jun 01, 2022
Tan Nguyen, Minh Pham, Tam Nguyen, Khai Nguyen, Stanley J. Osher, Nhat Ho

Proximal Implicit ODE Solvers for Accelerating Learning Neural ODEs
Apr 19, 2022
Justin Baker, Hedi Xia, Yiwei Wang, Elena Cherkaev, Akil Narayan, Long Chen, Jack Xin, Andrea L. Bertozzi, Stanley J. Osher, Bao Wang

Transformer with a Mixture of Gaussian Keys
Oct 16, 2021
Tam Nguyen, Tan M. Nguyen, Dung Le, Khuong Nguyen, Anh Tran, Richard G. Baraniuk, Nhat Ho, Stanley J. Osher