In Patterns (New York, N.Y.)

Predictive coding is a promising framework for understanding brain function. It postulates that the brain continuously inhibits predictable sensory input, ensuring preferential processing of surprising elements. A central aspect of this view is its hierarchical connectivity, involving recurrent message passing between excitatory bottom-up signals and inhibitory top-down feedback. Here we use computational modeling to demonstrate that such architectural hardwiring is not necessary. Rather, predictive coding is shown to emerge as a consequence of energy efficiency. When training recurrent neural networks to minimize their energy consumption while operating in predictive environments, the networks self-organize into prediction and error units with appropriate inhibitory and excitatory interconnections and learn to inhibit predictable sensory input. Moving beyond the view of purely top-down-driven predictions, we demonstrate, via virtual lesioning experiments, that networks perform predictions on two timescales: fast lateral predictions among sensory units and slower prediction cycles that integrate evidence over time.
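The training principle described above can be sketched as a recurrent network objective that combines a prediction-error term with an activity penalty standing in for energy consumption. This is a minimal illustrative sketch, not the paper's implementation: the function names (`rnn_step`, `energy_loss`), the tanh units, and the choice of summed squared activity as the energy proxy are all assumptions made for illustration.

```python
# Hedged sketch of an energy-penalized RNN objective. The specific penalty
# (sum of squared unit activations as a proxy for metabolic cost) and all
# names here are illustrative assumptions, not the authors' exact setup.
import numpy as np

def rnn_step(x, h, W_in, W_rec):
    """One recurrent update with tanh units."""
    return np.tanh(x @ W_in + h @ W_rec)

def energy_loss(xs, targets, W_in, W_rec, W_out, lam=0.1):
    """Task loss (prediction error) plus an activity ("energy") penalty.

    xs: sequence of inputs, shape (T, n_in)
    targets: sequence of prediction targets, shape (T, n_out)
    lam: weight of the energy term; lam=0 recovers the plain task loss.
    """
    h = np.zeros(W_rec.shape[0])
    err, energy = 0.0, 0.0
    for x, t in zip(xs, targets):
        h = rnn_step(x, h, W_in, W_rec)
        y = h @ W_out
        err += np.sum((y - t) ** 2)   # task term: prediction error
        energy += np.sum(h ** 2)      # energy term: total unit activity
    return err + lam * energy
```

Minimizing such a combined objective in a predictable environment is the kind of setup under which, per the abstract, the network can reduce cost by suppressing predictable input, so that prediction and error units emerge without architectural hardwiring.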

Abdullahi Ali, Nasir Ahmad, Elgar de Groot, Marcel A. J. van Gerven, Tim C. Kietzmann

2022-Dec-09

brain-inspired machine learning, energy efficiency, predictive coding, recurrent neural networks