Abstract: We propose a data-driven heuristic for NP-hard Ising and Max-Cut optimization that learns the update rule of an iterative dynamical system. The method learns a single node-wise update rule, shared across all spins, that maps local interaction fields to spin updates and is parameterized by a compact multilayer perceptron. Training uses a zeroth-order optimizer, since backpropagating through long, recurrent Ising-machine dynamics yields unstable and poorly informative gradients. We call this approach a neural-network-parameterized Ising machine (NPIM). Despite its low parameter count, the learned dynamics recover effective algorithmic structure, including momentum-like behavior and time-varying schedules, enabling efficient search in highly non-convex energy landscapes. Across standard Ising and neural combinatorial optimization benchmarks, NPIM achieves competitive solution quality and time-to-solution relative to recent learning-based methods and strong classical Ising-machine heuristics.
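To make the abstract's node-wise update rule concrete, the NumPy sketch below iterates a tiny shared MLP over local interaction fields. It is a minimal illustration, not the paper's architecture: the feature set (field, current spin, normalized time), network width, soft-spin relaxation, and random weights are all illustrative assumptions, and in the proposed method the weights would be trained with a zeroth-order optimizer rather than drawn at random.

import numpy as np

def mlp(x, W1, b1, W2, b2):
    # Tiny shared MLP applied node-wise: maps local features to a soft spin update.
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)

def npim_step(s, J, params, t):
    # Local interaction field for each node: field_i = sum_j J_ij * s_j.
    field = J @ s
    # Node-wise features: field, current spin, and a time feature, which lets
    # a learned rule express momentum-like behavior and time-varying schedules.
    feats = np.stack([field, s, np.full_like(s, t)], axis=1)
    return mlp(feats, *params).ravel()

# Usage: random symmetric couplings, relax soft spins, then round to +/-1.
rng = np.random.default_rng(0)
n = 64
J = rng.standard_normal((n, n)); J = (J + J.T) / 2; np.fill_diagonal(J, 0)
params = (0.5 * rng.standard_normal((3, 8)), np.zeros(8),   # untrained weights,
          0.5 * rng.standard_normal((8, 1)), np.zeros(1))   # for illustration only
s = rng.uniform(-1.0, 1.0, n)
for t in range(200):
    s = npim_step(s, J, params, t / 200)
spins = np.sign(s)
energy = -0.5 * spins @ J @ spins
print(f"Ising energy of rounded spins: {energy:.2f}")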
Abstract: We propose a novel algorithm that extends ball smoothing and Gaussian smoothing for noisy derivative-free optimization by accounting for the heterogeneous curvature of the objective function. The algorithm dynamically adapts the shape of the smoothing kernel to approximate the Hessian of the objective around a local optimum, which significantly reduces the error of sampling-based gradient estimates computed from noisy evaluations. We demonstrate the efficacy of the method through numerical experiments on synthetic problems, and we show improved performance over state-of-the-art heuristic derivative-free and Bayesian optimization methods when tuning solvers for NP-hard combinatorial optimization problems.
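As a rough sketch of the kernel-shaping idea, the NumPy code below implements an anisotropic Gaussian-smoothing gradient estimator and applies it to a noisy quadratic with strongly heterogeneous curvature. The covariance here is shaped by a known Hessian purely for illustration, whereas the proposed algorithm adapts the kernel shape online; the test function, sample count m, and step size are likewise illustrative assumptions.

import numpy as np

def smoothed_grad(f, x, cov, m=64, rng=None):
    # Anisotropic Gaussian-smoothing gradient estimate:
    #   grad_x E_{u ~ N(0, cov)} f(x + u) = E[(f(x + u) - f(x)) cov^{-1} u],
    # approximated by Monte Carlo with f(x) as a control variate.
    rng = np.random.default_rng(rng)
    L = np.linalg.cholesky(cov)
    U = rng.standard_normal((m, x.size)) @ L.T     # u_k ~ N(0, cov)
    df = np.array([f(x + u) for u in U]) - f(x)
    return np.linalg.solve(cov, (df[:, None] * U).mean(axis=0))

# Usage: noisy quadratic whose curvature differs by 100x across directions.
rng = np.random.default_rng(1)
H = np.diag([100.0, 1.0])                          # heterogeneous curvature
f = lambda z: 0.5 * z @ H @ z + 0.01 * rng.standard_normal()
x = np.array([1.0, 1.0])
cov = 0.01 * np.linalg.inv(H)                      # kernel shaped to the (known) curvature
for _ in range(500):
    x -= 0.005 * smoothed_grad(f, x, cov, rng=rng)
print("final iterate:", x)

Matching the kernel covariance to the inverse Hessian keeps the perturbations small along stiff directions and large along flat ones, which is the intuition behind adapting the kernel shape to reduce gradient-estimation error under noise.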