
Yang-Hui He


Machine Learning Clifford invariants of ADE Coxeter elements

Sep 29, 2023
Siqi Chen, Pierre-Philippe Dechant, Yang-Hui He, Elli Heyes, Edward Hirst, Dmitrii Riabchenko

There has been recent interest in novel Clifford geometric invariants of linear transformations. This motivates the investigation of such invariants for a certain type of geometric transformation of interest in the context of root systems, reflection groups, Lie groups and Lie algebras: the Coxeter transformations. We perform exhaustive calculations of all Coxeter transformations for $A_8$, $D_8$ and $E_8$ for a choice of basis of simple roots and compute their invariants, using high-performance computing. This computational algebra paradigm generates a dataset that can then be mined using techniques from data science such as supervised and unsupervised machine learning. In this paper we focus on neural network classification and principal component analysis. Since the output -- the invariants -- is fully determined by the choice of simple roots and the permutation order of the corresponding reflections in the Coxeter element, we expect huge degeneracy in the mapping. This provides the perfect setup for machine learning, and indeed we see that the datasets can be machine learned to very high accuracy. This paper is a pump-priming study in experimental mathematics using Clifford algebras: it shows that such Clifford algebraic datasets are amenable to machine learning, sheds light on relationships between these novel invariants and other well-known geometric invariants, and gives rise to analytic results.
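The supervised part of such a pipeline can be sketched in a few lines. The snippet below is a hedged toy illustration assuming scikit-learn: it uses random Gaussian stand-ins for the feature vectors (the real inputs are the Clifford invariants of Coxeter elements), trains a small feed-forward classifier, and projects the data onto its two leading principal components.

```python
# Toy sketch of the neural-network classification + PCA pipeline; the
# "invariant" vectors here are synthetic stand-ins, not actual Clifford
# invariants of Coxeter elements.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Two synthetic classes of 8-dimensional "invariant" vectors.
X = np.vstack([rng.normal(0, 1, (200, 8)), rng.normal(3, 1, (200, 8))])
y = np.array([0] * 200 + [1] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)

# Unsupervised view: project onto the two leading principal components.
X2 = PCA(n_components=2).fit_transform(X)
print(f"test accuracy: {acc:.2f}")
```

With well-separated synthetic classes the classifier reaches near-perfect test accuracy, mirroring the high accuracies reported for the actual invariant datasets.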

* 34 pages, 16 Figures, 12 Tables 

Towards Quantum Advantage on Noisy Quantum Computers

Sep 27, 2022
Ismail Yunus Akhalwaya, Shashanka Ubaru, Kenneth L. Clarkson, Mark S. Squillante, Vishnu Jejjala, Yang-Hui He, Kugendran Naidoo, Vasileios Kalantzis, Lior Horesh

Topological data analysis (TDA) is a powerful technique for extracting complex and valuable shape-related summaries of high-dimensional data. However, the computational demands of classical TDA algorithms are exorbitant, and quickly become impractical for high-order characteristics. Quantum computing promises exponential speedup for certain problems. Yet, many existing quantum algorithms with notable asymptotic speedups require a degree of fault tolerance that is currently unavailable. In this paper, we present NISQ-TDA, the first fully implemented end-to-end quantum machine learning algorithm requiring only linear circuit depth that is applicable to non-handcrafted high-dimensional classical data, with potential speedup under stringent conditions. The algorithm neither suffers from the data-loading problem nor does it need to store the input data on the quantum computer explicitly. Our approach includes three key innovations: (a) an efficient realization of the full boundary operator as a sum of Pauli operators; (b) a quantum rejection sampling and projection approach to restrict a uniform superposition to the simplices of the desired order in the complex; and (c) a stochastic rank estimation method to estimate the topological features in the form of approximate Betti numbers. We present theoretical results that establish additive error guarantees for NISQ-TDA, along with the circuit depth and computational time complexities for exponentially scaled output estimates, up to the error tolerance. The algorithm was successfully executed on quantum computing devices, as well as on noisy quantum simulators, applied to small datasets. Preliminary empirical results suggest that the algorithm is robust to noise.
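The quantities NISQ-TDA estimates, the Betti numbers, can be illustrated classically on a tiny simplicial complex via the standard rank formula $b_k = \dim\ker\partial_k - \operatorname{rank}\partial_{k+1}$; the sketch below is a plain linear-algebra stand-in for the paper's quantum stochastic rank estimation.

```python
# Classical toy analogue of what NISQ-TDA estimates: Betti numbers from
# ranks of boundary matrices. For the hollow triangle (three vertices,
# three edges, no filled face) we expect b0 = 1 and b1 = 1.
import numpy as np

# Boundary operator from edges to vertices: one column per edge,
# edges (0,1), (1,2), (0,2); entry -1 at the tail vertex, +1 at the head.
d1 = np.array([[-1,  0, -1],
               [ 1, -1,  0],
               [ 0,  1,  1]])

n_vertices, n_edges = d1.shape
rank_d1 = np.linalg.matrix_rank(d1)
b0 = n_vertices - rank_d1        # dim ker(d0) - rank(d1), with d0 the zero map
b1 = (n_edges - rank_d1) - 0     # dim ker(d1) - rank(d2); no 2-simplices here
print(b0, b1)  # 1 1: one connected component, one 1-dimensional hole
```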

* This paper is a follow up to arXiv:2108.02811 with additional results 

Exponential advantage on noisy quantum computers

Sep 19, 2022
Ismail Yunus Akhalwaya, Shashanka Ubaru, Kenneth L. Clarkson, Mark S. Squillante, Vishnu Jejjala, Yang-Hui He, Kugendran Naidoo, Vasileios Kalantzis, Lior Horesh

Quantum computing offers the potential of exponential speedup over classical computation for certain problems. However, many of the existing algorithms with provable speedups require currently unavailable fault-tolerant quantum computers. We present NISQ-TDA, the first fully implemented quantum machine learning algorithm with provable exponential speedup on arbitrary classical (non-handcrafted) data and needing only a linear circuit depth. We report the successful execution of our NISQ-TDA algorithm, applied to small datasets run on quantum computing devices, as well as on noisy quantum simulators. We empirically confirm that the algorithm is robust to noise, and provide target depths and noise levels to realize near-term, non-fault-tolerant quantum advantage on real-world problems. Our unique data-loading projection method is the main source of noise robustness, introducing a new self-correcting data-loading approach.

* arXiv admin note: substantial text overlap with arXiv:2108.02811 

Machine Learning Class Numbers of Real Quadratic Fields

Sep 19, 2022
Malik Amir, Yang-Hui He, Kyu-Hwan Lee, Thomas Oliver, Eldar Sultanow

We implement and interpret various supervised learning experiments involving real quadratic fields with class numbers 1, 2 and 3. We quantify the relative difficulties in separating class numbers of matching/different parity from a data-scientific perspective, apply the methodology of feature analysis and principal component analysis, and use symbolic classification to develop machine-learned formulas for class numbers 1, 2 and 3 that apply to our dataset.

* 26 pages, 20 figures 

Machine Learning Algebraic Geometry for Physics

Apr 21, 2022
Jiakang Bao, Yang-Hui He, Elli Heyes, Edward Hirst

We review some recent applications of machine learning to algebraic geometry and physics. Since problems in algebraic geometry can typically be reformulated as mappings between tensors, this makes them particularly amenable to supervised learning. Additionally, unsupervised methods can provide insight into the structure of such geometrical data. At the heart of this programme is the question of how geometry can be machine learned, and indeed how AI helps one to do mathematics. This is a chapter contribution to the book Machine learning and Algebraic Geometry, edited by A. Kasprzyk et al.

* 32 pages, 25 figures. Contribution to Machine learning and Algebraic Geometry, edited by A. Kasprzyk et al 

Murmurations of elliptic curves

Apr 21, 2022
Yang-Hui He, Kyu-Hwan Lee, Thomas Oliver, Alexey Pozdnyakov

We investigate the average value of the $p$th Dirichlet coefficient of elliptic curves of given rank in a fixed conductor range, as the prime $p$ varies. Plotting this average yields a striking oscillating pattern, the details of which vary with the rank. Based on this observation, we perform various data-scientific experiments with the goal of classifying elliptic curves according to their ranks.
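The basic ingredient of these averages, the Dirichlet coefficient $a_p = p + 1 - \#E(\mathbb{F}_p)$, can be computed by naive point counting for small primes. The sketch below is a toy illustration; the curve $y^2 = x^3 - x$ is an arbitrary choice, not a curve singled out by the paper.

```python
# Compute a_p = p + 1 - #E(F_p) for y^2 = x^3 + a*x + b by counting
# points over F_p directly (fine for small primes only).
def a_p(a, b, p):
    """Return a_p, counting affine points plus the point at infinity."""
    # Tally how many y in F_p square to each residue.
    sq_count = {}
    for y in range(p):
        sq_count[y * y % p] = sq_count.get(y * y % p, 0) + 1
    points = 1  # the point at infinity
    for x in range(p):
        points += sq_count.get((x ** 3 + a * x + b) % p, 0)
    return p + 1 - points

primes = [3, 5, 7, 11, 13, 17, 19, 23, 29]
coeffs = [a_p(-1, 0, p) for p in primes]  # y^2 = x^3 - x, a CM curve
print(coeffs)  # a_p vanishes for p ≡ 3 (mod 4) on this curve
```

Averaging such coefficients over many curves of fixed rank, rather than inspecting a single curve, is what produces the murmuration patterns.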

* 25 pages, 16 figures, 2 tables 

Cluster Algebras: Network Science and Machine Learning

Mar 25, 2022
Pierre-Philippe Dechant, Yang-Hui He, Elli Heyes, Edward Hirst

Cluster algebras have recently become an important player in mathematics and physics. In this work, we investigate them through the lens of modern data science, specifically with techniques from network science and machine learning. Network analysis methods are applied to the exchange graphs for cluster algebras of varying mutation types. The analysis indicates that when the graphs are represented without identifying clusters related by permutation equivalence, an elegant symmetry emerges in the quiver exchange graph embedding. The ratio between the number of seeds and the number of quivers associated with this symmetry is computed for finite Dynkin type algebras up to rank 5, and conjectured for higher ranks. Simple machine learning techniques successfully learn to differentiate cluster algebras from their seeds. The learning performance exceeds 0.9 accuracy between algebras of the same mutation type and between types, as well as relative to artificially generated data.
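The finite-type behaviour behind these exchange graphs can be seen in the simplest case: for type $A_2$, mutation of cluster variables reduces to the rank-2 recurrence $x_{n+1} = (1 + x_n)/x_{n-1}$, which is periodic with period 5, so the exchange graph is a pentagon. A minimal check in exact rational arithmetic (the initial seed values below are arbitrary):

```python
# Verify the period-5 mutation recurrence of the type A2 cluster algebra
# with exact arithmetic; the pentagon exchange graph reflects this period.
from fractions import Fraction

seq = [Fraction(2), Fraction(3)]  # arbitrary nonzero initial cluster (x1, x2)
for _ in range(8):
    seq.append((1 + seq[-1]) / seq[-2])

print(seq[:7])  # the pattern repeats: seq[5] == seq[0], seq[6] == seq[1]
```

The same periodicity holds for any valid initial seed, which is a small instance of the Zamolodchikov-type periodicity in finite Dynkin types.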

* 38 pages, 27 figures 

From the String Landscape to the Mathematical Landscape: a Machine-Learning Outlook

Feb 12, 2022
Yang-Hui He

We review the recent programme of using machine-learning to explore the landscape of mathematical problems. With this paradigm as a model for human intuition - complementary to and in contrast with the more formalistic approach of automated theorem proving - we highlight some experiments on how AI helps with conjecture formulation, pattern recognition and computation.

* 10 pages, 2 figures. Based on various talks in 2021-22, this is an invited contribution to the Proceedings of "The 14th International Workshop on Lie theory and its applications in physics", to be published by Springer-Nature 

Machine-Learning the Classification of Spacetimes

Jan 05, 2022
Yang-Hui He, Juan Manuel Pérez Ipiña

We take a novel perspective on the long-established classification problems of general relativity by adopting fruitful techniques from machine learning and modern data science. In particular, we model Petrov's classification of spacetimes and show that a feed-forward neural network can achieve a high degree of success. We also show how data-visualization techniques with dimensionality reduction can help analyze the underlying patterns in the structure of the different types of spacetimes.

* 6 pages, 5 figures 

Calabi-Yau Metrics, Energy Functionals and Machine-Learning

Dec 20, 2021
Anthony Ashmore, Lucille Calmon, Yang-Hui He, Burt A. Ovrut

We apply machine learning to the problem of finding numerical Calabi-Yau metrics. We extend previous work on learning approximate Ricci-flat metrics calculated using Donaldson's algorithm to the much more accurate "optimal" metrics of Headrick and Nassar. We show that machine learning is able to predict the Kähler potential of a Calabi-Yau metric having seen only a small sample of training data.
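As a hedged toy analogue of this regression task (assuming scikit-learn; the target function below is an arbitrary smooth stand-in, not an actual Kähler potential), one can fit a scalar "potential" on a small training sample and predict it on held-out points:

```python
# Fit a smooth scalar function from a small sample and evaluate prediction
# error elsewhere, mimicking the small-training-set regression setting.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
Z = rng.uniform(-1, 1, (300, 2))             # sample points on a patch
K = np.log1p(Z[:, 0] ** 2 + Z[:, 1] ** 2)    # smooth stand-in "potential"

# Train on only 50 points, predict on the remaining 250.
reg = MLPRegressor(hidden_layer_sizes=(64, 64), solver="lbfgs",
                   max_iter=3000, random_state=0)
reg.fit(Z[:50], K[:50])
err = np.mean(np.abs(reg.predict(Z[50:]) - K[50:]))
print(f"mean absolute prediction error: {err:.3f}")
```

A small held-out error from such a modest training sample is the qualitative behaviour the paper reports for the learned Kähler potentials.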

* 7 pages, 5 figures 