William Bialek
NEC Research Institute
Nearly 50 years ago, Attneave and Barlow
suggested that one goal of computation in
the brain might be to generate an ``efficient''
representation of the sensory world, where
efficiency could be quantified in terms of
information and entropy. Most generally,
this idea implies that the strategies used
by our brains should be matched, quantitatively,
to the statistical structure of the physical
world around us; this has an obvious appeal
for physicists! I will review recent experiments
that provide direct and dramatic evidence
for the efficiency of the neural code in a
simple system. In particular, it has been
possible to show that coding strategies adapt
to changes in the statistics of sensory inputs,
that this adaptation serves to optimize information
transmission, and that the dynamics of the
adaptation itself approaches the limiting
speed set by statistical considerations.
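(A minimal sketch of the matching principle, assuming a
noiseless response $r = g(s)$ at some fixed response resolution,
added as illustration rather than as a claim about the
experiments themselves: information transmission is maximized
when the input/output relation matches the cumulative
distribution of the stimulus ensemble,
\[
g(s) \propto \int_{-\infty}^{s} P(s')\, ds' ,
\]
so that all response levels are used with equal frequency, and
the efficiency of the code can be quantified as the ratio
$I(s;r)/H(r)$ of transmitted information to response entropy.)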
On the more theoretical side, I will discuss
progress toward a universal notion of efficiency
in learning, and the surprising connection
of this concept to the problem of characterizing
complexity in dynamical systems.
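The matching principle sketched above can be checked
numerically; the following short Python illustration is my own
sketch, assuming a Gaussian stimulus ensemble and a 16-level
response, and is not code from the experiments discussed.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def response_entropy(stimuli, nonlinearity, n_levels=16):
    """Entropy (bits) of the discretized response r = nonlinearity(s).

    For a deterministic response, I(s; r) = H(r), so output
    entropy is the information transmitted at this resolution.
    """
    r = nonlinearity(stimuli)
    counts, _ = np.histogram(r, bins=n_levels, range=(0.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# Gaussian stimulus ensemble with standard deviation sigma = 2.
sigma = 2.0
s = rng.normal(0.0, sigma, size=200_000)

matched = lambda x: norm.cdf(x, scale=sigma)    # nonlinearity = stimulus CDF
mismatched = lambda x: norm.cdf(x, scale=0.5)   # tuned to a narrower ensemble

print(f"matched:    {response_entropy(s, matched):.3f} bits")
print(f"mismatched: {response_entropy(s, mismatched):.3f} bits")
# The matched nonlinearity approaches the log2(16) = 4 bit ceiling;
# the mismatched one saturates and wastes most response levels.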