Abstract: Machine learning is a powerful method of extracting meaning from data; unfortunately, current digital hardware is extremely energy-intensive. There is interest in an alternative analog computing implementation that could match the performance of traditional machine learning while being significantly more energy-efficient. However, it remains unclear how to train such analog computing systems while adhering to the locality constraints imposed by their physical (as opposed to digital) nature. Local learning algorithms such as Equilibrium Propagation and Coupled Learning have been proposed to address this issue. In this paper, we develop an algorithm that exactly calculates gradients using a graph-theoretic, analytical framework for Kirchhoff's laws. We also introduce Generalized Equilibrium Propagation, a framework encompassing a broad class of Hebbian learning algorithms, including Coupled Learning and Equilibrium Propagation, and show how our algorithm compares. We demonstrate our algorithm in numerical simulations and show that we can train resistor networks without the need for a replica or for readout over all resistors; readout is needed only at the output layer. We also show that, under the analytical gradient approach, it is possible to update only a subset of the resistance values without strongly degrading performance.
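
For intuition, the following is a minimal sketch, not the paper's algorithm, of how exact gradients can be computed analytically for a resistor network: Kirchhoff's current law is written as a weighted graph-Laplacian system, node voltages are solved with the input nodes clamped, and an adjoint solve driven only by the output-layer error yields the derivative of the loss with respect to every edge conductance. The toy topology, node roles, target, and learning rate are illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's exact algorithm): analytical gradients of a
# resistor network's output error with respect to edge conductances, obtained
# from Kirchhoff's current law (a weighted graph-Laplacian system) plus an
# adjoint solve driven only by the output-layer error. The toy topology, node
# roles, target, and learning rate below are illustrative assumptions.

edges = [(0, 2), (1, 2), (2, 3), (0, 3), (1, 3)]   # resistors as node pairs
n_nodes = 4
clamped = np.array([0, 1])         # input nodes held at fixed voltages
free = np.array([2, 3])            # voltages determined by Kirchhoff's current law
output_node = 3
g = np.ones(len(edges))            # edge conductances (the trainable parameters)

# Signed incidence matrix A (edges x nodes): A @ V gives per-edge voltage drops.
A = np.zeros((len(edges), n_nodes))
for k, (i, j) in enumerate(edges):
    A[k, i], A[k, j] = 1.0, -1.0

def loss_and_grad(g, v_in, target):
    L = A.T @ np.diag(g) @ A                          # weighted graph Laplacian
    V = np.zeros(n_nodes)
    V[clamped] = v_in
    L_ff = L[np.ix_(free, free)]
    # KCL at the free nodes: L_ff V_f = -L_fc V_c
    V[free] = np.linalg.solve(L_ff, -L[np.ix_(free, clamped)] @ v_in)
    err = V[output_node] - target
    # Adjoint solve: only the output error is needed, L_ff lam_f = d(loss)/dV_f
    rhs = np.zeros(len(free))
    rhs[list(free).index(output_node)] = err
    lam = np.zeros(n_nodes)
    lam[free] = np.linalg.solve(L_ff, rhs)
    # d(loss)/d(g_e) = -(A lam)_e * (A V)_e, a per-edge product of "drops"
    return 0.5 * err**2, -(A @ lam) * (A @ V)

v_in, target = np.array([1.0, 0.0]), 0.25
for _ in range(200):                                  # gradient descent on g
    loss, grad = loss_and_grad(g, v_in, target)
    g = np.clip(g - 0.5 * grad, 1e-3, None)           # keep conductances positive
print(f"final loss: {loss:.6f}")
```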




Abstract: We present an oscillatory neuromorphic primitive implemented with networks of coupled Wien bridge oscillators and tunable resistive couplings. Phase relationships between oscillators encode patterns, and a local Hebbian learning rule continuously adapts the couplings, so that learning and recall emerge from the same ongoing analog dynamics rather than from separate training and inference phases. Using a Kuramoto-style phase model with an effective energy function, we show that learned phase patterns form attractor states and validate this behavior in simulation and in hardware. We further realize a 2-4-2 architecture with a hidden layer of oscillators, whose bipartite visible-hidden coupling allows multiple internal configurations to produce the same visible phase states. When inputs are switched, transient spikes in energy followed by relaxation show how the network reduces surprise by reshaping its energy landscape. These results support coupled oscillator circuits as a hardware platform for energy-based neuromorphic computing with autonomous, continuous learning.
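
As an illustration of the modeling approach described above, the following is a minimal sketch (not the paper's Wien bridge circuit model; the network size, learning rate, and noise level are assumptions) of a Kuramoto-style phase network with an effective energy E = -1/2 Σ_ij K_ij cos(θ_j - θ_i), whose couplings adapt through a local Hebbian rule. A phase pattern imprinted by the rule becomes an attractor that is recovered from a noisy initial state.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's Wien bridge circuit model):
# Kuramoto-style phase network with effective energy
#   E = -1/2 * sum_ij K_ij cos(theta_j - theta_i),
# and a local Hebbian rule adapting the couplings K. Network size, learning
# rate, and noise level are assumptions chosen for illustration.

rng = np.random.default_rng(0)
N = 8
K = np.zeros((N, N))                        # tunable couplings between oscillators
pattern = rng.choice([0.0, np.pi], size=N)  # binary phase pattern to imprint

def energy(theta, K):
    return -0.5 * np.sum(K * np.cos(theta[None, :] - theta[:, None]))

def simulate(theta, K, steps, dt=0.01, eta=0.0):
    """Phase dynamics d(theta_i)/dt = sum_j K_ij sin(theta_j - theta_i);
    if eta > 0, K also follows the Hebbian rule dK_ij/dt = eta (cos(theta_j - theta_i) - K_ij)."""
    for _ in range(steps):
        diff = theta[None, :] - theta[:, None]          # theta_j - theta_i
        theta = theta + dt * np.sum(K * np.sin(diff), axis=1)
        if eta > 0.0:
            K = K + dt * eta * (np.cos(diff) - K)       # local Hebbian adaptation
            np.fill_diagonal(K, 0.0)
    return theta, K

# Learning: hold the network at the pattern (a fixed point here) while K adapts.
_, K = simulate(pattern.copy(), K, steps=500, eta=1.0)

# Recall: start from a noisy version of the pattern with the couplings frozen.
noisy = pattern + 0.4 * rng.standard_normal(N)
theta, _ = simulate(noisy, K, steps=4000)

# Phase differences (relative to oscillator 0) relax back to the stored pattern,
# i.e. the imprinted pattern is an attractor of the learned energy landscape.
residual = np.angle(np.exp(1j * (theta - theta[0] - (pattern - pattern[0]))))
print("energy after recall:", energy(theta, K))
print("max residual phase error (rad):", np.max(np.abs(residual)))
```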