We use methods from the Fock space and Segal-Bargmann theories to prove several results on the Gaussian RBF kernel in complex analysis. The Gaussian RBF kernel is one of the most widely used kernels in modern machine learning kernel methods and in support vector machine (SVM) classification algorithms. Complex analysis techniques allow us to study several notions attached to RBF kernels, such as the feature space and the feature map, by means of the so-called Segal-Bargmann transform. We also show how RBF kernels can be related to some of the most commonly used operators in quantum mechanics and time-frequency analysis; specifically, we prove connections of such kernels with the creation, annihilation, Fourier, translation, modulation, and Weyl operators. For the Weyl operators, we also study a semigroup property.
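For orientation, here is a minimal sketch in LaTeX of the objects involved, under one common choice of normalizations (the width parameter gamma and the one-dimensional Fock space convention below are illustrative, not necessarily those adopted in the paper):

\[
  K_{\mathrm{RBF}}(x,y) \;=\; e^{-\gamma\,\|x-y\|^{2}}, \qquad x,y\in\mathbb{R}^{n},\ \gamma>0,
\]
\[
  K_{\mathcal{F}}(z,w) \;=\; e^{z\overline{w}}, \qquad z,w\in\mathbb{C},
\]
where the second formula is the reproducing kernel of the Fock (Segal-Bargmann) space of entire functions, on which the creation and annihilation operators act as
\[
  (a^{\dagger}F)(z) \;=\; z\,F(z), \qquad (aF)(z) \;=\; F'(z).
\]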
We introduce and study new gradient operators in the complex and bicomplex settings, inspired by the well-known Least Mean Square (LMS) algorithm invented in 1960 by Widrow and Hoff for the Adaptive Linear Neuron (ADALINE). These gradient operators are then used to formulate new learning rules for the Bicomplex Least Mean Square (BLMS) algorithm. This approach extends both the classical real and complex LMS algorithms.
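As a point of reference, the following Python sketch shows the classical complex LMS (Widrow-Hoff) update that the bicomplex rules extend; the function name, step size, and toy data are illustrative assumptions, and the bicomplex (BLMS) rule itself is not reproduced here.

import numpy as np

def complex_lms_step(w, x, d, mu=0.05):
    """One step of the classical complex LMS (Widrow-Hoff) update.

    w  : complex weight vector, shape (n,)
    x  : complex input vector, shape (n,)
    d  : desired complex output for this sample
    mu : step size (learning rate)
    """
    y = np.vdot(w, x)                    # filter output w^H x (vdot conjugates w)
    e = d - y                            # a-priori error
    w_next = w + mu * np.conj(e) * x     # standard complex LMS weight update
    return w_next, e

# Toy usage: identify a fixed complex filter from random inputs.
rng = np.random.default_rng(0)
w_true = np.array([0.5 - 0.2j, 0.1 + 0.3j])
w = np.zeros(2, dtype=complex)
for _ in range(2000):
    x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
    d = np.vdot(w_true, x)               # noiseless desired response
    w, _ = complex_lms_step(w, x, d)
# After training, w approximates w_true.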
In this paper, we first give a proof of the perceptron convergence algorithm for complex multivalued neural networks (CMVNNs). Our primary goal is to formulate and prove the perceptron convergence algorithm for bicomplex multivalued neural networks (BMVNNs), together with other important results in the theory of neural networks based on a bicomplex algebra.
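For background, the sketch below recalls the classical real-valued perceptron learning rule whose convergence argument is being generalized to the complex and bicomplex multivalued settings; the function name and stopping criterion are illustrative, and the multivalued (CMVNN/BMVNN) activation and update are not shown.

import numpy as np

def perceptron_train(X, y, max_epochs=100):
    """Classical real-valued perceptron learning rule.

    X : array of shape (m, n), training inputs (a bias can be absorbed as a constant feature)
    y : array of shape (m,), labels in {-1, +1}
    """
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:   # sample misclassified (or on the boundary)
                w = w + yi * xi           # perceptron update: move w toward the correct side
                mistakes += 1
        if mistakes == 0:                 # linearly separable data: convergence reached
            break
    return w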