Abstract: Tree tensor networks (TTNs) offer powerful models for image classification. While these TTN image classifiers already show excellent performance on classical hardware, embedding them into quantum neural networks (QNNs) may further improve performance by leveraging quantum resources. However, embedding TTN classifiers into QNNs for multiclass classification remains challenging. Key obstacles are the high-order gate operations required for large bond dimensions and the mid-circuit postselection, with exponentially low success rates, necessary for exact embedding. In this work, to address these challenges, we propose forest tensor network (FTN)-classifiers, which aggregate multiple small-bond-dimension TTNs. This allows us to handle multiclass classification without requiring large gates in the embedded circuits. We then remove the overhead of mid-circuit postselection by extending the adiabatic encoding framework to our setting and smoothly encoding the FTN-classifiers into quantum forest tensor network (qFTN)-classifiers. Numerical experiments on MNIST and CIFAR-10 demonstrate that we can successfully train FTN-classifiers and encode them into qFTN-classifiers while maintaining, or even improving, the performance of the pre-trained FTN-classifiers. These results suggest that the synergy between TTN classification models and QNNs can provide a robust and scalable framework for multiclass quantum-enhanced image classification.
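To make the aggregation idea concrete, the following is a minimal numerical sketch, not the authors' implementation: a "forest" of small-bond-dimension TTNs in which each tree contracts local feature vectors up a balanced binary tree with bond dimension 2, and the per-tree class scores are summed into multiclass logits. The local feature map, the random tensors, and summation as the aggregation rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_map(x):
    # hypothetical local feature map: each pixel value -> 2-dim vector
    return np.stack([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)], axis=-1)

def contract_ttn(leaves, nodes, top):
    # contract a balanced binary TTN: at each layer, adjacent vectors are merged
    # by a (chi, d, d) tensor; the top tensor maps the root to class scores
    vecs = list(leaves)
    layer = 0
    while len(vecs) > 1:
        merged = []
        for i in range(0, len(vecs), 2):
            A = nodes[layer][i // 2]                      # shape (chi, d, d)
            merged.append(np.einsum('cab,a,b->c', A, vecs[i], vecs[i + 1]))
        vecs = merged
        layer += 1
    return top @ vecs[0]                                  # shape (n_classes,)

# toy setup: 8 "pixels" per tree, bond dimension chi = 2, 10 classes, 4 trees
n_pixels, chi, n_classes, n_trees = 8, 2, 10, 4
n_layers = int(np.log2(n_pixels))
forest = []
for _ in range(n_trees):
    nodes = [[rng.normal(size=(chi, chi, chi)) / chi
              for _ in range(n_pixels // 2 ** (l + 1))] for l in range(n_layers)]
    top = rng.normal(size=(n_classes, chi))
    forest.append((nodes, top))

x = rng.random(n_pixels)                                  # toy input "image patch"
leaves = feature_map(x)                                   # (n_pixels, 2) local vectors
logits = sum(contract_ttn(leaves, nodes, top) for nodes, top in forest)
print("class logits:", logits)
```

Because every tree keeps its bond dimension small, each contraction involves only low-order tensors, which is the property the abstract attributes to FTN-classifiers when they are later embedded into circuits.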
Abstract: Although Quantum Neural Networks (QNNs) offer powerful methods for classification tasks, training QNNs faces two major obstacles: barren plateaus and local minima. A promising solution is to first train a tensor-network (TN) model classically and then embed it into a QNN. However, embedding TN-classifiers into quantum circuits generally requires postselection, whose success probability may decay exponentially with the system size. We propose an \emph{adiabatic encoding} framework that encodes pre-trained MPS-classifiers into quantum MPS (qMPS) circuits with postselection and gradually removes the postselection while retaining performance. We prove that training qMPS-classifiers from scratch on a certain artificial dataset is exponentially hard due to barren plateaus, whereas our adiabatic encoding circumvents this issue. Numerical experiments on binary MNIST further confirm its robustness.
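At a schematic level, the gradual removal of postselection can be viewed as a homotopy between two objectives: the loss of the postselected (TN-initialised) circuit and the loss of the postselection-free unitary circuit, with an interpolation parameter swept slowly while the parameters are re-optimised from a warm start. The sketch below uses toy quadratic losses as hypothetical stand-ins for the actual circuit simulations; only the sweep-and-warm-start structure reflects the framework described in the abstract.

```python
import numpy as np

# hypothetical stand-ins for the two objectives; in practice these would be
# evaluated by simulating the postselected and the fully unitary circuits
def loss_postselected(theta):
    return np.sum((theta - 1.0) ** 2)

def loss_unitary(theta):
    return np.sum((theta + 0.5) ** 2) + 0.1 * np.sum(np.sin(3 * theta) ** 2)

def interpolated_loss(theta, s):
    # s = 0: purely postselected objective, s = 1: postselection fully removed
    return (1 - s) * loss_postselected(theta) + s * loss_unitary(theta)

def grad(f, theta, eps=1e-5):
    # finite-difference gradient, sufficient for this toy illustration
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return g

theta = np.ones(4)                       # pretend this comes from the pre-trained MPS
for s in np.linspace(0.0, 1.0, 21):      # slow "adiabatic" sweep of the schedule
    for _ in range(200):                 # re-optimise with a warm start at each s
        theta -= 0.05 * grad(lambda t: interpolated_loss(t, s), theta)
print("final parameters:", theta, "final postselection-free loss:", loss_unitary(theta))
```

The point of the warm start is that the optimum tracked at each small increment of s stays close to the previous one, so the final postselection-free parameters inherit the quality of the classically pre-trained solution instead of being found from a random initialisation.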