
Takashi Furuya

Approximation Theory for Lipschitz Continuous Transformers

Feb 17, 2026

Approximation theory for 1-Lipschitz ResNets

May 17, 2025

Kolmogorov-Arnold Networks: Approximation and Learning Guarantees for Functions and their Derivatives

Apr 21, 2025

Is In-Context Universality Enough? MLPs are Also Universal In-Context

Feb 05, 2025

Can neural operators always be continuously discretized?

Dec 04, 2024

Simultaneously Solving FBSDEs with Neural Operators of Logarithmic Depth, Constant Width, and Sub-Linear Rank

Oct 18, 2024

Quantitative Approximation for Neural Operators in Nonlinear Parabolic Equations

Oct 03, 2024

Transformers are Universal In-context Learners

Aug 02, 2024

Mixture of Experts Soften the Curse of Dimensionality in Operator Learning

Apr 13, 2024

Breaking the Curse of Dimensionality with Distributed Neural Computation

Feb 05, 2024