Abstract: It has been demonstrated earlier that universal computation is 'almost surely' chaotic. Machine learning can be viewed as a form of computational fixed-point iteration over the space of computable functions. We showcase some properties of this iteration and establish that, in general, it is 'almost surely' chaotic in nature. This theory explains the counterintuitive properties observed in deep learning methods. The paper demonstrates that these properties will be universal to any learning method.
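The chaotic character of a fixed-point iteration can be seen in a minimal sketch. The logistic map below is a standard illustrative example of a chaotic iterated map, not the paper's construction: two starting points separated by 1e-9 diverge to order one within a few dozen steps, the sensitive dependence on initial conditions that the abstract attributes to learning iterations in general.

```python
# Minimal sketch (illustrative, not the paper's construction):
# the fixed-point iteration x_{n+1} = f(x_n) with the logistic map
# f(x) = r*x*(1-x) at r = 4.0, a classic chaotic regime.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def iterate(x0, steps):
    """Apply the map `steps` times starting from x0."""
    x = x0
    for _ in range(steps):
        x = logistic(x)
    return x

# Two initial conditions differing by one part in a billion.
a, b = 0.3, 0.3 + 1e-9
for n in (10, 30, 50):
    print(n, abs(iterate(a, n) - iterate(b, n)))
# The separation grows from ~1e-9 to order 1 within a few dozen
# iterations: sensitive dependence on initial conditions.
```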
Abstract: There is an enormous number of examples of computation in nature, exemplified across multiple species in biology. One crucial aim of these computations, across all life forms, is the ability to learn and thereby increase the chance of survival. In the current paper a formal definition of autonomous learning is proposed. From that definition we establish a Turing machine model for learning, in which rule tables can be added or deleted but cannot be modified. Sequential and parallel implementations of this model are discussed. It is found that, for general-purpose learning based on this model, implementations capable of parallel execution would be evolutionarily stable. This is proposed to be one of the reasons why parallelism in computation is found in abundance in Nature.
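A minimal sketch of the add/delete-only rule table may help fix ideas; the class and method names below are illustrative assumptions, not the paper's notation. The invariant is that an existing (state, symbol) entry can never be overwritten in place; changing behaviour requires an explicit delete followed by an add.

```python
# Hypothetical sketch of a rule table supporting only add and delete,
# as the abstract's Turing machine model requires. Names are assumed.

class AddDeleteRuleTable:
    def __init__(self):
        # (state, symbol) -> (new_state, write_symbol, move)
        self._rules = {}

    def add(self, state, symbol, action):
        key = (state, symbol)
        if key in self._rules:
            # Modification in place is forbidden by the model.
            raise ValueError("existing rules cannot be modified")
        self._rules[key] = action

    def delete(self, state, symbol):
        self._rules.pop((state, symbol), None)

    def lookup(self, state, symbol):
        return self._rules.get((state, symbol))

# To 'change' a rule the learner must delete and then re-add it,
# a two-step operation rather than a single overwrite.
table = AddDeleteRuleTable()
table.add("q0", "1", ("q1", "0", "R"))
table.delete("q0", "1")
table.add("q0", "1", ("q1", "1", "L"))
```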