Abstract: Local gauge structures play a central role in a wide range of condensed matter systems and synthetic quantum platforms, where they emerge as effective descriptions of strongly correlated phases and engineered dynamics. We introduce a gauge-invariant graph neural network (GNN) architecture for Abelian lattice gauge models, in which symmetry is enforced explicitly through local gauge-invariant inputs, such as Wilson loops, and preserved throughout message passing, eliminating redundant gauge degrees of freedom while retaining expressive power. We benchmark the approach on both $\mathbb{Z}_2$ and $\mathrm{U}(1)$ lattice gauge models, achieving accurate predictions of global observables and spatially resolved quantities despite the nonlocal correlations induced by gauge-matter coupling. We further demonstrate that the learned model serves as an efficient surrogate for semiclassical dynamics in $\mathrm{U}(1)$ quantum link models, enabling stable and scalable time evolution without repeated fermionic diagonalization, while faithfully reproducing both local dynamics and statistical correlations. These results establish gauge-invariant message passing as a compact and physically grounded framework for learning and simulating Abelian lattice gauge systems.
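A minimal sketch of the kind of gauge-invariant input described above, in plain NumPy (all variable names and lattice sizes are illustrative, not taken from the paper): plaquette Wilson loops for a $\mathbb{Z}_2$ gauge field on a periodic square lattice, checked numerically to be unchanged by an arbitrary local gauge transformation.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 6
# Z2 link variables sigma = ±1 on the two bonds (d=0: x-direction, d=1: y-direction)
links = rng.choice([-1, 1], size=(2, L, L))

def plaquettes(links):
    """Plaquette Wilson loops W_p = s_x(r) s_y(r+x) s_x(r+y) s_y(r), periodic lattice."""
    sx, sy = links
    return sx * np.roll(sy, -1, axis=0) * np.roll(sx, -1, axis=1) * sy

def gauge_transform(links, g):
    """Local Z2 gauge transformation s_ij -> g_i s_ij g_j with site factors g = ±1."""
    sx, sy = links
    return np.stack([g * sx * np.roll(g, -1, axis=0),
                     g * sy * np.roll(g, -1, axis=1)])

g = rng.choice([-1, 1], size=(L, L))
# plaquette inputs are unchanged by any local gauge transformation
assert np.array_equal(plaquettes(links), plaquettes(gauge_transform(links, g)))
```

Feeding only such invariants into the network removes the redundant gauge orbit from the input space by construction.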
Abstract: Local gauge symmetry underlies fundamental interactions and strongly correlated quantum matter, yet existing machine-learning approaches lack a general, principled framework for learning under site-dependent symmetries, particularly for intrinsically nonlocal observables. Here we introduce a gauge-equivariant graph neural network that embeds non-Abelian symmetry directly into message passing via matrix-valued, gauge-covariant features and symmetry-compatible updates, extending equivariant learning from global to fully local symmetries. In this formulation, message passing implements gauge-covariant transport across the lattice, allowing nonlocal correlations and loop-like structures to emerge naturally from local operations. We validate the approach across pure gauge, gauge-matter, and dynamical regimes, establishing gauge-equivariant message passing as a general paradigm for learning in systems governed by local symmetry.
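The notion of message passing as gauge-covariant transport can be sketched as follows (a toy construction under our own conventions, not the paper's architecture): on a periodic chain, neighbor features are parallel-transported through link unitaries before aggregation, and the update commutes with arbitrary site-dependent unitary transformations.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_unitary(n):
    """Haar-like random unitary via QR with phase fixing."""
    q, r = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
    return q * (np.diag(r) / np.abs(np.diag(r)))

N, d = 4, 2                                          # chain sites, feature dimension
U = np.stack([random_unitary(d) for _ in range(N)])  # U[i] lives on link i -> i+1
h = rng.normal(size=(N, d)) + 1j * rng.normal(size=(N, d))  # covariant site features

def message_pass(U, h, alpha=0.5):
    """Covariant update h_i <- h_i + alpha (U_{i,i+1} h_{i+1} + U_{i-1,i}^† h_{i-1}):
    neighbor features are parallel-transported before aggregation."""
    fwd = np.einsum('iab,ib->ia', U, np.roll(h, -1, axis=0))
    bwd = np.einsum('iab,ib->ia',
                    np.roll(U.conj().transpose(0, 2, 1), 1, axis=0),
                    np.roll(h, 1, axis=0))
    return h + alpha * (fwd + bwd)

def gauge_transform(U, h, Om):
    """h_i -> Om_i h_i and U_{i,i+1} -> Om_i U_{i,i+1} Om_{i+1}^†."""
    Ut = np.einsum('iab,ibc,icd->iad', Om, U,
                   np.roll(Om.conj().transpose(0, 2, 1), -1, axis=0))
    return Ut, np.einsum('iab,ib->ia', Om, h)

Om = np.stack([random_unitary(d) for _ in range(N)])
Ut, ht = gauge_transform(U, h, Om)
# equivariance: transforming then updating equals updating then transforming
assert np.allclose(message_pass(Ut, ht),
                   np.einsum('iab,ib->ia', Om, message_pass(U, h)))
```

Composing such transported messages over many layers builds up exactly the loop-like, gauge-covariant structures the abstract refers to.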
Abstract: We review recent advances in machine-learning (ML) force-field methods for large-scale Landau-Lifshitz-Gilbert (LLG) simulations of metallic spin systems. We generalize the Behler-Parrinello (BP) ML architecture -- originally developed for quantum molecular dynamics -- to construct scalable and transferable ML models capable of capturing the intricate dependence of electron-mediated exchange fields on the local magnetic environment characteristic of itinerant magnets. A central ingredient of this framework is the implementation of symmetry-aware magnetic descriptors based on group-theoretical bispectrum formalisms. Leveraging these ML force fields, our LLG simulations faithfully reproduce hallmark non-collinear magnetic orders -- such as the $120^\circ$ and tetrahedral states -- on the triangular lattice, and successfully capture the complex spin textures emerging in the mixed-phase states of a square-lattice double-exchange model under thermal quench. We further discuss a generalized potential theory that extends the BP formalism to incorporate both conservative and nonconservative electronic torques, thereby enabling ML models to learn nonequilibrium exchange fields from computationally demanding microscopic approaches such as nonequilibrium Green's-function techniques. This extension yields quantitatively accurate predictions of voltage-driven domain-wall motion and establishes a foundation for quantum-accurate, multiscale modeling of nonequilibrium spin dynamics and spintronic functionalities.
Abstract: Scalable and symmetry-consistent force-field models are essential for extending quantum-accurate simulations to large spatiotemporal scales. While descriptor-based neural networks can incorporate lattice symmetries through carefully engineered features, we show that graph neural networks (GNNs) provide a conceptually simpler and more unified alternative in which discrete lattice translation and point-group symmetries are enforced directly through local message passing and weight sharing. We develop a GNN-based force-field framework for the adiabatic dynamics of lattice Hamiltonians and demonstrate it for the semiclassical Holstein model. Trained on exact-diagonalization data, the GNN achieves high force accuracy, strict linear scaling with system size, and direct transferability to large lattices. Enabled by this scalability, we perform large-scale Langevin simulations of charge-density-wave ordering following thermal quenches, revealing dynamical scaling and anomalously slow sub--Allen--Cahn coarsening. These results establish GNNs as an elegant and efficient architecture for symmetry-aware, large-scale dynamical simulations of correlated lattice systems.
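How weight sharing enforces lattice symmetries can be seen in a one-layer toy model (our own minimal construction, with arbitrary weight values): because every site applies the identical local update, the layer automatically commutes with lattice translations and, for an isotropic neighbor sum, with the $C_4$ point group.

```python
import numpy as np

rng = np.random.default_rng(2)
L = 8
x = rng.normal(size=(L, L))          # scalar lattice distortions (Holstein-like)
w_self, w_nbr = rng.normal(size=2)   # shared weights (toy values)

def gnn_layer(x):
    """One shared-weight message-passing step on the periodic square lattice:
    every site applies the identical update to itself and its four neighbors."""
    nbr = sum(np.roll(x, s, axis=a) for a in (0, 1) for s in (1, -1))
    return np.tanh(w_self * x + w_nbr * nbr)

shift = (3, 5)
# translation equivariance follows automatically from weight sharing
assert np.allclose(np.roll(gnn_layer(x), shift, axis=(0, 1)),
                   gnn_layer(np.roll(x, shift, axis=(0, 1))))
# the isotropic neighbor sum also respects the C4 point group
assert np.allclose(np.rot90(gnn_layer(x)), gnn_layer(np.rot90(x)))
```

Transferability to larger lattices follows from the same property: the layer is defined per site, so the trained weights apply unchanged at any $L$.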
Abstract: Learning reduced descriptions of chaotic many-body dynamics is fundamentally challenging: although microscopic equations are Markovian, collective observables exhibit strong memory and exponential sensitivity to initial conditions and prediction errors. We show that a self-attention-based transformer framework provides an effective approach for modeling such chaotic collective dynamics directly from time-series data. By selectively reweighting long-range temporal correlations, the transformer learns a non-Markovian reduced description that overcomes intrinsic limitations of conventional recurrent architectures. As a concrete demonstration, we study the one-dimensional semiclassical Holstein model, where interaction quenches induce strongly nonlinear and chaotic dynamics of the charge-density-wave order parameter. While pointwise predictions inevitably diverge at long times, the transformer faithfully reproduces the statistical "climate" of the chaos, including temporal correlations and characteristic decay scales. Our results establish self-attention as a powerful mechanism for learning effective reduced dynamics in chaotic many-body systems.
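The reweighting mechanism at the heart of this approach can be sketched as a single-head causal self-attention layer in NumPy (a generic textbook construction with random weights, not the paper's trained model): each time step forms a probability distribution over its own past, which is what lets the model pick out long-range temporal correlations.

```python
import numpy as np

rng = np.random.default_rng(3)
T, d = 16, 8                      # sequence length (time steps), feature dimension
X = rng.normal(size=(T, d))       # time series of order-parameter features
Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))

def causal_self_attention(X):
    """Single-head causal self-attention: each time step attends only to its past,
    producing a learned, non-Markovian reweighting of the history."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf        # no attention to the future
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)
    return A @ V, A

out, A = causal_self_attention(X)
assert np.allclose(A.sum(axis=1), 1.0)          # rows are probability weights
assert np.all(A[np.triu_indices(T, k=1)] == 0)  # causality preserved
```

Unlike a recurrent state that compresses history into a fixed vector, the attention weights `A` give every past step a direct, data-dependent route into the update.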
Abstract: Nonequilibrium electronic forces play a central role in voltage-driven phase transitions but are notoriously expensive to evaluate in dynamical simulations. Here we develop a machine-learning framework for adiabatic lattice dynamics coupled to nonequilibrium electrons, and demonstrate it for a gating-induced insulator-to-metal transition out of a charge-density-wave state in the Holstein model. Although exact electronic forces can be obtained from nonequilibrium Green's function (NEGF) calculations, their high computational cost renders long-time dynamical simulations prohibitively expensive. By exploiting the locality of the electronic response, we train a neural network to directly predict instantaneous local electronic forces from the lattice configuration, thereby bypassing repeated NEGF calculations during time evolution. When combined with Brownian dynamics, the resulting machine-learning force field quantitatively reproduces domain-wall motion and nonequilibrium phase-transition dynamics obtained from full NEGF simulations, while achieving orders-of-magnitude gains in computational efficiency. Our results establish direct force learning as an efficient and accurate approach for simulating nonequilibrium lattice dynamics in driven quantum materials.
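The simulation loop this enables can be sketched in a few lines (the learned force is replaced here by a toy local double-well force of our own choosing, standing in for the trained network; it is not the NEGF-derived force of the paper): once forces are a cheap local function of the configuration, overdamped Brownian time stepping becomes trivial.

```python
import numpy as np

rng = np.random.default_rng(4)

def learned_force(x):
    """Stand-in for a trained network mapping the local lattice configuration to
    electronic forces; here a toy double-well force with nearest-neighbor coupling
    (purely illustrative)."""
    nbr = np.roll(x, 1) + np.roll(x, -1)
    return -(x**3 - x) - 0.1 * (2 * x - nbr)

def brownian_step(x, dt=1e-3, gamma=1.0, temp=0.05):
    """Overdamped (Brownian) update driven by the predicted force."""
    noise = np.sqrt(2.0 * temp * dt / gamma) * rng.normal(size=x.shape)
    return x + (dt / gamma) * learned_force(x) + noise

x = rng.normal(scale=0.1, size=64)   # weakly disordered initial chain
for _ in range(5000):
    x = brownian_step(x)
assert np.all(np.abs(x) < 2.0)       # dynamics stays stable
assert np.abs(x).mean() > 0.3        # sites have ordered toward the wells
```

Each step costs only a local force evaluation, which is the source of the claimed orders-of-magnitude speedup over repeated NEGF solves.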
Abstract: Machine-learning (ML) force fields enable large-scale simulations with near-first-principles accuracy at substantially reduced computational cost. Recent work has extended ML force-field approaches to adiabatic dynamical simulations of condensed-matter lattice models with coupled electronic and structural or magnetic degrees of freedom. However, most existing formulations rely on hand-crafted, symmetry-aware descriptors, whose construction is often system-specific and can hinder generality and transferability across different lattice Hamiltonians. Here we introduce a symmetry-preserving framework based on equivariant neural networks (ENNs) that provides a general, data-driven mapping from local configurations of dynamical variables to the associated on-site forces in a lattice Hamiltonian. In contrast to ENN architectures developed for molecular systems -- where continuous Euclidean symmetries dominate -- our approach aims to embed the discrete point-group and internal symmetries intrinsic to lattice models directly into the neural-network representation of the force field. As a proof of principle, we construct an ENN-based force-field model for the adiabatic dynamics of the Holstein Hamiltonian on a square lattice, a canonical system for electron-lattice physics. The resulting ML-enabled large-scale dynamical simulations faithfully capture the mesoscale evolution of the symmetry-breaking phase, illustrating the utility of lattice-equivariant architectures for linking microscopic electronic processes to emergent dynamical behavior in condensed-matter lattice systems.




Abstract: We review the recent development of machine-learning (ML) force-field frameworks for Landau-Lifshitz-Gilbert (LLG) dynamics simulations of itinerant electron magnets, focusing on the general theory and implementations of symmetry-invariant representations of spin configurations. The crucial properties that such magnetic descriptors must satisfy are differentiability with respect to spin rotations and invariance to both lattice point-group symmetry and internal spin rotation symmetry. We propose an efficient implementation based on the concept of reference irreducible representations, modified from the group-theoretical power-spectrum and bispectrum methods. The ML framework is demonstrated using the s-d models, which are widely applied in spintronics research. We show that LLG simulations based on local fields predicted by the trained ML models successfully reproduce representative non-collinear spin structures, including 120$^\circ$, tetrahedral, and skyrmion crystal orders of the triangular-lattice s-d models. Large-scale thermal quench simulations enabled by ML models further reveal intriguing freezing dynamics and glassy stripe states consisting of skyrmions and bi-merons. Our work highlights the utility of the ML force-field approach to dynamical modeling of complex spin orders in itinerant electron magnets.
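The invariance requirement on magnetic descriptors can be illustrated with features much simpler than the power-spectrum and bispectrum constructions discussed above (this sketch uses elementary bond and chirality invariants of our own choosing): any admissible descriptor must be unchanged under a global spin rotation of the neighborhood, as the toy features below are.

```python
import numpy as np

rng = np.random.default_rng(5)

def random_rotation():
    """Random proper rotation in SO(3)."""
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.linalg.det(q))

S = rng.normal(size=(4, 3))                  # a small spin neighborhood
S /= np.linalg.norm(S, axis=1, keepdims=True)

def descriptors(S):
    """Rotation-invariant features of a spin neighborhood: all pair products
    S_i·S_j plus the scalar chirality S_0·(S_1×S_2)."""
    return S @ S.T, np.dot(S[0], np.cross(S[1], S[2]))

R = random_rotation()
d1, c1 = descriptors(S)
d2, c2 = descriptors(S @ R.T)
assert np.allclose(d1, d2) and np.isclose(c1, c2)  # invariant under spin rotation
```

The bispectrum-based descriptors of the review serve the same role while additionally organizing the invariants by lattice point-group representations.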




Abstract: The phase-ordering kinetics of emergent orders in correlated electron systems is a fundamental topic in non-equilibrium physics, yet it remains largely unexplored. The intricate interplay between quasiparticles and emergent order-parameter fields could lead to unusual coarsening dynamics that lies beyond standard theories. However, accurate treatment of both quasiparticles and collective degrees of freedom is a multi-scale challenge in dynamical simulations of correlated electrons. Here we leverage modern machine learning (ML) methods to achieve a linear-scaling algorithm for simulating the coarsening of charge density waves (CDWs), one of the fundamental symmetry-breaking phases in functional electron materials. We demonstrate our approach on the square-lattice Hubbard-Holstein model and uncover an intriguing enhancement of CDW coarsening that is related to the screening of on-site potential by electron-electron interactions. Our study provides fresh insights into the role of electron correlations in non-equilibrium dynamics and underscores the promise of ML force-field approaches for advancing multi-scale dynamical modeling of correlated electron systems.




Abstract: An echo state network (ESN) is a type of reservoir computer that uses a recurrent neural network with a sparsely connected hidden layer. Compared with other recurrent neural networks, a great advantage of the ESN is the simplicity of its training process. Yet, despite the seemingly restricted learnable parameters, ESNs have been shown to successfully capture the spatial-temporal dynamics of complex patterns. Here we build an ESN to model the coarsening dynamics of charge-density waves (CDW) in a semi-classical Holstein model, which exhibits a checkerboard electron density modulation at half-filling stabilized by a commensurate lattice distortion. The inputs to the ESN are local CDW order parameters in a finite neighborhood centered around a given site, while the output is the predicted CDW order of the center site at the next time step. Special care is taken in the design of couplings between the hidden-layer and input nodes to ensure lattice symmetries are properly incorporated into the ESN model. Since the model predictions depend only on CDW configurations of a finite domain, the ESN is scalable and transferable in the sense that a model trained on data from a small system can be directly applied to dynamical simulations on larger lattices. Our work opens a new avenue for efficient dynamical modeling of pattern formation in functional electron materials.
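The training simplicity referred to above comes from the fact that only a linear readout is fit, while the sparse recurrent reservoir stays fixed. A generic one-dimensional ESN sketch (standard textbook construction with arbitrary hyperparameters, predicting a sine wave one step ahead rather than the CDW field of the paper):

```python
import numpy as np

rng = np.random.default_rng(6)
n_res = 200
W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))
W = rng.normal(size=(n_res, n_res)) * (rng.random((n_res, n_res)) < 0.05)  # sparse
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state)

def run_reservoir(u_seq):
    """Drive the fixed random reservoir; only the linear readout is trained."""
    r = np.zeros(n_res)
    states = []
    for u in u_seq:
        r = np.tanh(W @ r + W_in @ np.atleast_1d(u))
        states.append(r.copy())
    return np.array(states)

t = np.arange(400)
u = np.sin(0.1 * t)
states = run_reservoir(u[:-1])
target = u[1:]                       # one-step-ahead prediction
# ridge-regression readout: the only trained parameters
W_out = np.linalg.solve(states.T @ states + 1e-6 * np.eye(n_res),
                        states.T @ target)
pred = states @ W_out
assert np.mean((pred[100:] - target[100:])**2) < 1e-2  # small error after washout
```

In the CDW setting, `u` would instead be the vector of order parameters in a site's finite neighborhood, which is what makes the trained model directly transferable to larger lattices.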