Karl Friston - The free-energy principle: a unified brain theory? (2010)
Created: February 16, 2018 / Updated: November 2, 2024 / Status: finished / 3 min read (~480 words)
- The free-energy principle says that any self-organizing system that is at equilibrium with its environment must minimize its free energy (an information-theoretic quantity that upper-bounds the surprise on sampling some data, given a generative model of that data)
- The principle is essentially a mathematical formulation of how adaptive systems resist a natural tendency to disorder
- Biological agents must minimize the long-term average of surprise to ensure that their sensory entropy remains low
- Free energy is an upper bound on surprise, which means that if agents minimize free energy, they implicitly minimize surprise
- Free energy can be evaluated because it is a function of two things to which the agent has access:
- its sensory states
- a recognition density that is encoded by its internal states
- Agents can suppress free energy by changing the two things it depends on:
- they can change sensory input by acting on the world
- they can change their recognition density by changing their internal states
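These two moves can be made explicit via the standard variational identity (a sketch of the usual decomposition; $s$ denotes sensory states, $\vartheta$ hidden causes, and $q(\vartheta)$ the recognition density):

```latex
F \;=\; \underbrace{-\ln p(s)}_{\text{surprise}}
      \;+\; \underbrace{D_{\mathrm{KL}}\!\left[\,q(\vartheta)\,\|\,p(\vartheta \mid s)\,\right]}_{\ge\, 0}
      \;\ge\; -\ln p(s)
```

Acting on the world changes $s$, and hence the surprise term itself; changing internal states moves $q(\vartheta)$ toward the true posterior $p(\vartheta \mid s)$, shrinking the KL term and tightening the bound.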
- The Bayesian brain hypothesis uses Bayesian probability theory to formulate perception as a constructive process based on internal or generative models. The underlying idea is that the brain has a model of the world that it tries to optimize using sensory inputs
- In this view, the brain is an inference machine that actively predicts and explains its sensations
- The theme underlying the Bayesian brain and predictive coding is that the brain is an inference engine that is trying to optimize probabilistic representations of what caused its sensory input
- This optimization can be finessed using a (variational free-energy) bound on surprise
- The principle of efficient coding suggests that the brain optimizes the mutual information (that is, the mutual predictability) between the sensorium and its internal representation, under constraints on the efficiency of those representations
- At its simplest, the infomax principle says that neuronal activity should encode sensory information in an efficient and parsimonious fashion
- The infomax principle can be understood in terms of the decomposition of free energy into complexity and accuracy: mutual information is optimized when conditional expectations maximize accuracy (or minimize prediction error), and efficiency is assured by minimizing complexity
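The complexity/accuracy decomposition can be checked numerically. Below is a minimal sketch using a hypothetical two-state generative model (the model and its numbers are illustrative, not from the paper): free energy computed as complexity minus accuracy always upper-bounds the surprise, and the bound becomes exact when the recognition density equals the true posterior.

```python
import math

# Hypothetical toy generative model: two hidden causes (theta in {0, 1})
# and one observed sensory sample s.
prior = {0: 0.5, 1: 0.5}        # p(theta)
likelihood = {0: 0.9, 1: 0.2}   # p(s | theta), evaluated at the observed s

# Model evidence p(s) and surprise -log p(s)
evidence = sum(prior[t] * likelihood[t] for t in prior)
surprise = -math.log(evidence)

def free_energy(q):
    """F = complexity - accuracy
         = KL[q(theta) || p(theta)] - E_q[log p(s | theta)]"""
    complexity = sum(q[t] * math.log(q[t] / prior[t]) for t in q if q[t] > 0)
    accuracy = sum(q[t] * math.log(likelihood[t]) for t in q if q[t] > 0)
    return complexity - accuracy

# Any recognition density gives F >= surprise ...
q_guess = {0: 0.5, 1: 0.5}
assert free_energy(q_guess) >= surprise

# ... and the bound is tight when q is the true posterior p(theta | s)
posterior = {t: prior[t] * likelihood[t] / evidence for t in prior}
assert abs(free_energy(posterior) - surprise) < 1e-12
```

Minimizing the complexity term penalizes recognition densities that stray from the prior, which is what makes the resulting code parsimonious in the infomax sense.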
- The cell assembly theory posits that groups of interconnected neurons are formed through a strengthening of synaptic connections that depends on correlated pre- and postsynaptic activity; that is, "cells that fire together wire together"
- Value is central to theories of brain function that are based on reinforcement learning and optimum control. The basic notion that underpins these treatments is that the brain optimizes value, which is expected reward or utility (or its complement - expected loss or cost)
- Friston, Karl. "The free-energy principle: a unified brain theory?" Nature Reviews Neuroscience 11.2 (2010): 127.