Abstract: In recent years, formulating various combinatorial optimization problems as Quadratic Unconstrained Binary Optimization (QUBO) has gained significant attention as a promising approach for efficiently obtaining optimal or near-optimal solutions. While QUBO offers a general-purpose framework, existing solvers often exhibit performance that varies considerably across problems. This paper (i) theoretically analyzes Mean Field Annealing (MFA) and its variants, which are representative QUBO solvers, and reveals that their underlying self-consistent equations do not necessarily represent the minimum condition of the Kullback-Leibler (KL) divergence between the mean-field approximated distribution and the exact distribution, and (ii) proposes a novel method, Annealed Mean Field Descent (AMFD), which addresses this limitation by directly minimizing the divergence. Through extensive experiments on five benchmark combinatorial optimization problems (Maximum Cut Problem, Maximum Independent Set Problem, Traveling Salesman Problem, Quadratic Assignment Problem, and Graph Coloring Problem), we demonstrate that AMFD exhibits superior performance in many cases and is less problem-dependent than state-of-the-art QUBO solvers and Gurobi, a state-of-the-art general-purpose mathematical optimization solver not limited to QUBO.
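The core idea of minimizing the KL divergence between a factorized mean-field distribution and the Boltzmann distribution of a QUBO energy can be illustrated with a short sketch. The snippet below assumes an energy E(x) = x^T Q x over x in {0, 1}^n, a fully factorized Bernoulli mean-field distribution with means m, plain gradient descent on the variational objective, and a geometric temperature schedule; the step size, schedule, and rounding rule are illustrative choices, not the paper's exact AMFD formulation.

```python
# Sketch: annealed gradient descent on the mean-field KL objective for a QUBO.
import numpy as np

def mean_field_descent(Q, n_steps=200, t_start=10.0, t_end=0.05, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n = Q.shape[0]
    m = rng.uniform(0.4, 0.6, size=n)          # Bernoulli means, initialized near 0.5
    eps = 1e-6
    temps = np.geomspace(t_start, t_end, n_steps)
    S = Q + Q.T                                 # symmetrized off-diagonal couplings
    diag = np.diag(Q).copy()
    np.fill_diagonal(S, 0.0)
    for T in temps:
        # Gradient of the variational free energy F(m) = E_q[E(x)] - T * H(q),
        # which equals T * KL(q || p) up to the constant T * log Z.
        grad_energy = diag + S @ m              # d E_q[E] / d m_i, using x_i^2 = x_i
        grad_entropy = np.log(m / (1.0 - m))    # -dH/dm_i
        grad = grad_energy / T + grad_entropy   # gradient of KL(q || p) w.r.t. m
        m = np.clip(m - lr * grad, eps, 1.0 - eps)
    x = (m > 0.5).astype(int)                   # round the means to a binary solution
    return x, x @ Q @ x

# Usage on a tiny random QUBO instance.
Q = np.random.default_rng(1).normal(size=(20, 20))
x, energy = mean_field_descent(Q)
print(energy)
```

In contrast, MFA-style solvers iterate self-consistent fixed-point updates of the means; descending the KL objective directly, as sketched here, is the distinguishing design choice the abstract attributes to AMFD.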
Abstract: This paper presents a local-energy-distribution-based method for determining the hyperparameters of stochastic simulated annealing (SSA). SSA can solve combinatorial optimization problems faster than typical simulated annealing (SA) but requires a time-consuming hyperparameter search. The proposed method determines the hyperparameters from the local energy distributions of the spins (probabilistic bits). A spin is the basic computing element of SSA and is connected to other spins in a graph through weights. The distribution of the local energy can be estimated using the central limit theorem (CLT), and the resulting normal distribution is used to determine the hyperparameters, reducing the time complexity of the hyperparameter search from O(n^3) in the conventional method to O(1). The performance of SSA with the determined hyperparameters is evaluated on the Gset and K2000 benchmarks for maximum-cut problems. The results show that the proposed method achieves mean cut values of approximately 98% of the best-known cut values.
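The CLT-based estimate of a spin's local energy distribution can be sketched as follows. The snippet assumes bipolar spins s_j in {-1, +1} treated as independent and uniform (so E[s_j] = 0 and Var[s_j] = 1), a symmetric coupling matrix W with zero diagonal, and a simple "noise scale proportional to the mean local-energy standard deviation" rule; the actual mapping from the estimated distribution to SSA hyperparameters in the paper may differ, and the proportionality factor here is a placeholder.

```python
# Sketch: per-spin normal approximation of the local energy h_i = sum_j W_ij s_j + b_i.
import numpy as np

def clt_local_energy_stats(W, bias=None):
    """Per-spin mean and standard deviation of the local energy under the CLT."""
    n = W.shape[0]
    b = np.zeros(n) if bias is None else bias
    mean = b                                   # E[s_j] = 0 removes the coupling term
    std = np.sqrt((W ** 2).sum(axis=1))        # Var[h_i] = sum_j W_ij^2 * Var[s_j]
    return mean, std

def suggest_noise_scale(W, bias=None, factor=1.0):
    """Illustrative rule: set the injected noise magnitude from the mean local-energy std."""
    _, std = clt_local_energy_stats(W, bias)
    return factor * std.mean()

# Usage on random sparse couplings for 100 spins.
rng = np.random.default_rng(0)
W = rng.normal(size=(100, 100)) * (rng.random((100, 100)) < 0.1)
W = np.triu(W, 1)
W = W + W.T                                    # symmetric couplings, zero diagonal
print(suggest_noise_scale(W))
```

Because the normal approximation follows directly from the weights, the hyperparameters can be read off without sweeping candidate values, which is what replaces the conventional O(n^3) search.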