Graph neural networks (GNNs) are a promising approach to learning and predicting physical phenomena described by boundary value problems, such as partial differential equations (PDEs) with boundary conditions. However, existing models treat boundary conditions inadequately, even though they are essential for reliable prediction of such problems. In addition, because of the locally connected nature of GNNs, it is difficult to accurately predict the state after a long time, when interaction between vertices tends to be global. We present an approach, termed physics-embedded neural networks, that considers boundary conditions and predicts the state after a long time using an implicit method. It is built on an $\mathrm{E}(n)$-equivariant GNN, resulting in high generalization performance across various shapes. We demonstrate that our model learns flow phenomena in complex shapes and outperforms a well-optimized classical solver and a state-of-the-art machine learning model in the speed-accuracy trade-off. Our model can therefore serve as a useful standard for realizing reliable, fast, and accurate GNN-based PDE solvers.
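To make the two key ingredients concrete, the following minimal sketch (not the paper's model) shows an implicit time step for diffusion on a small path graph with Dirichlet boundary conditions enforced at the endpoints. The graph, step size, and boundary-handling scheme here are illustrative assumptions: an implicit step solves a linear system per step and stays stable even for step sizes that would destabilize an explicit update.

```python
import numpy as np

# Graph Laplacian of a 5-vertex path graph (toy stand-in for a mesh).
n = 5
L = np.zeros((n, n))
for i in range(n - 1):
    L[i, i] += 1.0
    L[i + 1, i + 1] += 1.0
    L[i, i + 1] -= 1.0
    L[i + 1, i] -= 1.0

dt = 10.0  # deliberately large; explicit Euler would diverge at this step size
u = np.zeros(n)
u[0], u[-1] = 1.0, 0.0  # Dirichlet values at the two boundary vertices

# Implicit Euler: solve (I + dt * L) u_new = u_old at every step.
A = np.eye(n) + dt * L
# Overwrite boundary rows so the solve keeps the prescribed boundary values.
for b in (0, n - 1):
    A[b, :] = 0.0
    A[b, b] = 1.0

for _ in range(100):
    rhs = u.copy()
    rhs[0], rhs[-1] = 1.0, 0.0
    u = np.linalg.solve(A, rhs)

print(u)  # approaches the linear steady state [1, 0.75, 0.5, 0.25, 0]
```

The boundary rows of the system matrix are replaced by identity rows, so the prescribed values propagate into the interior through the solve itself rather than being patched in afterwards.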
Graphs are one of the most important data structures for representing pairwise relations between objects. In particular, graphs embedded in Euclidean space are essential for solving real-world problems such as object detection, structural chemistry analysis, and physical simulation. A crucial requirement for employing graphs in Euclidean space is learning features that are invariant and equivariant under isometric transformations. In the present paper, we propose a set of such invariant and equivariant models, called IsoGCNs, based on graph convolutional networks. We discuss an example of IsoGCNs that corresponds to differential equations. We also demonstrate that the proposed model achieves high prediction performance on the considered finite element analysis dataset and can scale to graphs with 1M vertices.
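The isometry-equivariance requirement can be illustrated with a toy convolution (not the paper's actual IsoGCN layer; the graph and weighting below are assumptions). Because the layer aggregates relative-position vectors, its output rotates with the input and ignores translations:

```python
import numpy as np

rng = np.random.default_rng(0)
pos = rng.normal(size=(4, 3))          # vertex coordinates in 3D
adj = np.ones((4, 4)) - np.eye(4)      # fully connected toy graph
scalar = rng.normal(size=(4, 1))       # rotation-invariant vertex features

def conv(pos, scalar):
    # out[i] = sum_j adj[i, j] * scalar[j] * (pos[j] - pos[i])
    rel = pos[None, :, :] - pos[:, None, :]   # relative positions (i, j, 3)
    w = adj * scalar[:, 0][None, :]           # neighbor weights (i, j)
    return (w[:, :, None] * rel).sum(axis=1)  # vector output per vertex (i, 3)

# Rotation equivariance: rotating the input rotates the output identically.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
assert np.allclose(conv(pos @ Q, scalar), conv(pos, scalar) @ Q)

# Translation invariance: shifting all positions leaves the output unchanged.
t = rng.normal(size=(1, 3))
assert np.allclose(conv(pos + t, scalar), conv(pos, scalar))
```

Relative vectors transform as $(p_j - p_i)Q$ under a rotation $Q$ and cancel any common translation, which is why both checks pass by construction.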