Neuronal architecture extracts statistical temporal patterns

Neuronal systems need to process temporal signals. Here we show how
higher-order temporal (co-)fluctuations can be employed to represent and
process information. Concretely, we demonstrate that a simple biologically
inspired feedforward neuronal model is able to extract information from up to
the third order cumulant to perform time series classification. This model
relies on a weighted linear summation of synaptic inputs followed by a
nonlinear gain function. Training both the synaptic weights and the nonlinear
gain function exposes how the nonlinearity transfers higher-order
correlations to the mean, which in turn enables the synergistic use of
information encoded in multiple cumulants to maximize the classification
accuracy. The approach is demonstrated on both synthetic and real-world
datasets of multivariate time series. Moreover, we show that the biologically
inspired architecture uses its trainable parameters more efficiently than a
classical machine-learning scheme. Our findings emphasize the
benefit of biological neuronal architectures, paired with dedicated learning
algorithms, for the processing of information embedded in higher-order
statistical cumulants of temporal (co-)fluctuations.
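
The core mechanism described above, that a nonlinear gain function can transfer information from higher-order input correlations into the output mean, can be illustrated with a minimal sketch. The snippet below is not the paper's model: it uses a hypothetical quadratic gain and hand-picked weights to show the simplest case, where second-order co-fluctuations between two input channels shift the mean of the neuron's output even though both inputs have zero mean.

```python
import numpy as np

rng = np.random.default_rng(0)

def output_mean(corr, n=200_000):
    """Mean output of a toy neuron: weighted sum of two correlated,
    zero-mean inputs passed through a quadratic gain function."""
    # Two zero-mean, unit-variance inputs with correlation `corr`
    cov = np.array([[1.0, corr], [corr, 1.0]])
    x = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    w = np.array([1.0, 1.0])   # illustrative synaptic weights
    u = x @ w                   # weighted linear summation of inputs
    return np.mean(u**2)        # quadratic gain; E[u^2] = 2 + 2*corr

m_uncorr = output_mean(0.0)  # inputs independent
m_corr = output_mean(0.8)    # inputs co-fluctuate
```

Because the gain is nonlinear, the output mean depends on the input covariance (here `E[u^2] = 2 + 2*corr`), so a downstream readout of the mean alone can discriminate signals that differ only in their co-fluctuation structure.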