
Yutaka Yamaguti


Evaluating generation of chaotic time series by convolutional generative adversarial networks

May 26, 2023
Yuki Tanaka, Yutaka Yamaguti


To understand the ability and limitations of convolutional neural networks to generate time series that mimic complex temporal signals, we trained a generative adversarial network (GAN) consisting of deep convolutional networks to generate chaotic time series, and we used nonlinear time series analysis to evaluate the generated output. A numerical measure of determinism and the Lyapunov exponent, a measure of trajectory instability, showed that the generated time series reproduce the chaotic properties of the original time series well. However, analysis of the error distribution revealed that large errors appeared at a low but non-negligible rate; errors of such magnitude would not be expected if the distribution were exponential.
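The Lyapunov-exponent evaluation mentioned in the abstract can be illustrated on a toy chaotic system. The sketch below is not the authors' code: the logistic map, the parameter values, and all function names are my own illustrative choices. It estimates the largest Lyapunov exponent of a logistic-map orbit by averaging log|f'(x)| along the trajectory:

```python
import math

def logistic_series(r=4.0, x0=0.2, n=10000, discard=100):
    """Generate a logistic-map orbit x_{t+1} = r x_t (1 - x_t),
    discarding an initial transient."""
    xs = []
    x = x0
    for i in range(n + discard):
        x = r * x * (1.0 - x)
        if i >= discard:
            xs.append(x)
    return xs

def lyapunov_logistic(xs, r=4.0):
    """Largest Lyapunov exponent as the orbit average of
    log|f'(x)| = log|r (1 - 2x)| for the logistic map."""
    return sum(math.log(abs(r * (1.0 - 2.0 * x))) for x in xs) / len(xs)
```

For r = 4 the exact value is ln 2 ≈ 0.693, so a positive estimate near that value confirms chaotic trajectory instability; the same orbit-averaging idea, applied via delay embedding, underlies Lyapunov estimation for measured or GAN-generated time series.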


Functional differentiations in evolutionary reservoir computing networks

Jun 20, 2020
Yutaka Yamaguti, Ichiro Tsuda


We propose an extended reservoir computer that exhibits functional differentiation of neurons. The reservoir computer is designed so that its internal reservoir can change through evolutionary dynamics, and we therefore call it an evolutionary reservoir computer. For neuronal units to develop specificity depending on the input information, the internal dynamics should be controlled so that contracting dynamics follow expanding dynamics. Expanding dynamics magnify the differences between inputs, while contracting dynamics form clusters of inputs, thereby producing multiple attractors. The simultaneous appearance of both kinds of dynamics indicates the existence of chaos, whereas their sequential appearance over finite time intervals may induce functional differentiation. In this paper, we show how specialized neuronal units emerge in the evolutionary reservoir computer.
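The contracting regime described above can be sketched with a generic echo-state-style reservoir. This is not the authors' evolutionary model; the network size, the gain parameter, and all names are illustrative assumptions. Scaling the recurrent weights below unit gain makes the tanh update a contraction, so states driven by the same input converge, which is the clustering mechanism the abstract refers to:

```python
import math
import random

def make_reservoir(n=20, gain=0.9, seed=0):
    """Random recurrent weights, normalized so the maximum row sum
    (an upper bound on the spectral radius) equals `gain` < 1."""
    rng = random.Random(seed)
    W = [[rng.uniform(-1.0, 1.0) for _ in range(n)] for _ in range(n)]
    norm = max(sum(abs(w) for w in row) for row in W)
    W = [[gain * w / norm for w in row] for row in W]
    w_in = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    return W, w_in

def step(W, w_in, x, u):
    """x(t+1) = tanh(W x(t) + w_in u(t)); tanh keeps states in (-1, 1)."""
    n = len(x)
    return [math.tanh(sum(W[i][j] * x[j] for j in range(n)) + w_in[i] * u)
            for i in range(n)]
```

With gain < 1, differences between reservoir states shrink by at least that factor per step (tanh is 1-Lipschitz), giving contracting dynamics; raising the gain past unit scale pushes the reservoir toward expanding, potentially chaotic dynamics, which is the trade-off the evolutionary search would navigate.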

* This article has been submitted to Chaos. After it is published, it will be found at https://aip.scitation.org/journal/cha 