What is it about?
This article presents a mathematical framework for implementing dynamical systems (i.e., differential equations) in recurrent spiking neural networks with detailed synapse models. The framework is used to train a network that represents the history of a continuously changing signal by delaying it in memory, optimally compressing that history into a scale-invariant, low-dimensional state space. This "temporal code" is analyzed systematically and shown to produce neural responses that are qualitatively and quantitatively similar to the "time cells" recently discovered in rodents during delay tasks.
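The core idea of compressing a rolling window of history into a low-dimensional linear state space can be illustrated with a standard control-theoretic construction: a Padé approximation of a pure delay e^(-θs), realized as a small state-space system and simulated numerically. This is a minimal sketch of that general technique, not the paper's exact derivation; the delay length, approximation order, and test signal below are illustrative choices.

```python
import math
import numpy as np
from scipy.interpolate import pade
from scipy.signal import lsim, tf2ss

theta = 0.5   # delay length in seconds (illustrative choice)
order = 5     # dimensionality of the compressed state (illustrative choice)

# Taylor coefficients of exp(-theta*s) around s = 0: (-theta)^k / k!
taylor = [(-theta) ** k / math.factorial(k) for k in range(2 * order + 1)]

# [order/order] Pade approximant: a rational function p(s)/q(s) that
# approximates the delay's transfer function near s = 0.
p, q = pade(taylor, order)

# Realize p(s)/q(s) as a state-space system x' = Ax + Bu, y = Cx + Du,
# so the entire delayed history is carried by only `order` state variables.
A, B, C, D = tf2ss(p.coeffs, q.coeffs)

# Drive the system with a sine wave; the output should be (approximately)
# the input delayed by theta, read out from the compressed state.
t = np.linspace(0, 3, 3001)
u = np.sin(2 * np.pi * t)
_, y, _ = lsim((A, B, C, D), U=u, T=t)

# Compare against the ideal delayed signal after the initial transient.
steady = t > 2 * theta
error = np.max(np.abs(y[steady] - np.sin(2 * np.pi * (t[steady] - theta))))
```

A five-dimensional state is enough here to track the half-second delay of a 1 Hz signal to within a few percent; the paper's contribution is, in part, showing how such a linear system can be mapped onto recurrently connected spiking neurons with realistic synapses.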
Why is it important?
Brains must constantly deal with time-varying information in a dynamic environment. Meaningful actions depend not only on the current state of the world, but also on how that state is changing over time. The architecture presented in this article provides a mechanism that allows spiking neurons, coupled by detailed synapse models, to optimally represent the history of time-varying information. Moreover, the framework enables the computation of nonlinear functions across this rolling window of history. This establishes a bridge for understanding how a wide class of dynamic computations might relate to neural activity.
Read the Original
This page is a summary of: Improving Spiking Dynamical Networks: Accurate Delays, Higher-Order Synapses, and Time Cells, Neural Computation, March 2018, The MIT Press, DOI: 10.1162/neco_a_01046.