Enhancing training of physics-informed neural networks using domain-decomposition based preconditioning strategies

Jun 30, 2023
Alena Kopaničáková, Hardik Kothari, George Em Karniadakis, Rolf Krause


We propose to enhance the training of physics-informed neural networks (PINNs). To this end, we introduce nonlinear additive and multiplicative preconditioning strategies for the widely used L-BFGS optimizer. The nonlinear preconditioners are constructed using the Schwarz domain-decomposition framework, where the parameters of the network are decomposed in a layer-wise manner. Through a series of numerical experiments, we demonstrate that both additive and multiplicative preconditioners significantly improve the convergence of the standard L-BFGS optimizer while providing more accurate solutions of the underlying partial differential equations. Moreover, the additive preconditioner is inherently parallel, giving rise to a novel approach to model parallelism.
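The abstract describes the construction only at a high level. Below is a minimal sketch of one nonlinear additive Schwarz pass with a layer-wise decomposition, assuming a PyTorch model net whose immediate children are its layers and a scalar loss function pinn_loss(net); these names, the damping factor alpha, and the local iteration count are illustrative assumptions, not details from the paper. Each layer is treated as a subdomain: it is optimized locally from a common starting point, and the resulting local corrections are summed into one global update.

import torch

def additive_schwarz_step(net, pinn_loss, local_iters=5, alpha=0.5):
    # One nonlinear additive Schwarz pass. Subdomains are the layers of
    # `net`; each is optimized independently from the same starting point,
    # and the local corrections are combined additively. All names and
    # default values here are illustrative, not taken from the paper.
    theta0 = [p.detach().clone() for p in net.parameters()]
    corrections = [torch.zeros_like(p) for p in net.parameters()]

    for layer in net.children():
        local_params = list(layer.parameters())
        if not local_params:
            continue  # skip parameter-free modules, e.g. activations

        # Reset to the common starting point before each local solve.
        with torch.no_grad():
            for p, p0 in zip(net.parameters(), theta0):
                p.copy_(p0)

        # Local solve: L-BFGS over this layer's parameters only.
        opt = torch.optim.LBFGS(local_params, max_iter=local_iters)

        def closure():
            net.zero_grad()
            loss = pinn_loss(net)
            loss.backward()
            return loss

        opt.step(closure)

        # Only this layer's parameters moved; record its correction.
        with torch.no_grad():
            for c, p, p0 in zip(corrections, net.parameters(), theta0):
                c += p - p0

    # Additive combination of all local corrections, damped by alpha.
    with torch.no_grad():
        for p, p0, c in zip(net.parameters(), theta0, corrections):
            p.copy_(p0 + alpha * c)

Because the local solves start from the same point, they are independent and could run in parallel, which is the model-parallel aspect the abstract mentions. A multiplicative variant would instead apply the local solves sequentially, each starting from the previous subdomain's updated parameters. The fixed damping factor alpha stands in for the globalization (e.g., a line search) a practical implementation would require; this is a sketch of the general Schwarz idea, not the authors' implementation.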

Comments: 22 pages, 7 figures