- Molecular Biology
- Computational Neuroscience
While the brain is totally unlike current digital computers, much of what it does can be described as computation. Associative memory, logic and inference, recognizing an odor or a chess position, parsing the world into objects, and generating appropriate sequences of locomotor muscle commands are all describable as computation. My current research
focuses on the theory of how the neural circuits of the brain produce such powerful and complex computations. Olfaction is one of the oldest and simplest senses, and much of my recent work derives from considerations of the olfactory system. One tends to think of olfaction as "identifying a known odor," but in highly olfactory animals, the problems solved are much
more complicated. Many animals use olfaction as a remote sense, to understand the environment around them in terms of both what objects are present at a distance and where they are located. At a minimum, this involves coordinating the moment something is smelled with the wind direction, and untangling the weak signal of an odor object from the background of other odors present at the same time. The homing of pigeons or the ability of a slug to find favorite foods are examples of such remote sensing.
Any computer does its computation through changes in its internal state. In neurobiology, the computation is performed by the changes over time in the potentials of neurons (and in the strengths of their synapses). Systems of differential equations can represent these aspects of neurobiology, and we seek to understand neurobiological computation by studying the behavior of equations modeling the time evolution of neural activity. The biophysics of neurons is immensely complicated. A model eliminates much of this detailed description, trying to preserve the essence of what is useful in computation and to suppress less important details. In general, a piece of computer hardware is very effective when the device physics (or biophysics) can be used directly in a
computational algorithm. Different aspects of the device properties will be useful to different computations, and the effectiveness of the computations done by biology lies in the selection and enhancement of particular biophysical features for appropriate biological computations.
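As a concrete sketch of this style of modeling, the time evolution of a small network's firing rates can be written as a system of differential equations and integrated numerically. The particular equation, the tanh nonlinearity, and all parameters below are illustrative assumptions, not a specific published model:

```python
import numpy as np

# Illustrative rate equations for N coupled neurons:
#   tau * dr/dt = -r + tanh(W @ r + I)
# where r is the vector of firing rates, W a synaptic weight matrix,
# and I a constant external input.
def simulate(W, I, tau=10.0, dt=0.1, steps=2000):
    """Integrate the rate equations with the forward Euler method."""
    r = np.zeros(W.shape[0])
    for _ in range(steps):
        r += (dt / tau) * (-r + np.tanh(W @ r + I))
    return r

rng = np.random.default_rng(0)
N = 5
W = 0.1 * rng.standard_normal((N, N))   # weak random synaptic coupling
I = rng.standard_normal(N)              # constant external drive
r_final = simulate(W, I)
print(r_final)  # steady-state rates, each bounded in magnitude by 1
```

With weak coupling the network relaxes to a fixed point; richer choices of W give oscillations or multiple attractors, which is where such equations become interesting as models of computation.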
How is information represented in a brain? Light and chemo-reception generate currents across a cell membrane by means of a cascade of chemical events within a receptor cell. By
the time information has left the retina or the nose, it has been re-represented as a sequence of action potentials. For much of neurobiology, the current paradigm describes information as represented by "firing rates." In this paradigm, information is represented by the rate of generation of action potential spikes, and the exact timing of these spikes is unimportant.
There are a few places, for example, the binaural auditory determination of the location of a sound source, where information is encoded in the timing of action potentials. Since action potentials last only about a millisecond, their precise timing is a potentially powerful means of neural computation. Much of my research in the last few years has focused on the use of action potential timing as a carrier of information and as a means of neurobiological computation.
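The binaural example can be made concrete: in the spirit of coincidence-detection models of sound localization, an array of detectors, each tuned to a different interaural delay, can read out timing information that a pure firing-rate code would discard. The spike trains, jitter, and delay values below are invented for this sketch:

```python
import numpy as np

# Hypothetical illustration: the same spike train reaches the two ears
# with a small interaural delay, and the delay is recovered by finding
# the shift that maximizes spike coincidences.
rng = np.random.default_rng(1)
left = np.sort(rng.uniform(0, 100, 40))          # spike times in ms
true_delay = 0.3                                 # ms; right ear hears later
right = left + true_delay + rng.normal(0, 0.05, left.size)  # jittered copy

def coincidences(delay, window=0.1):
    """Count left spikes matched by a right spike within `window` ms
    after compensating the right train by `delay`."""
    shifted = right - delay
    return sum(np.any(np.abs(shifted - t) < window) for t in left)

candidate_delays = np.arange(0.0, 1.0, 0.05)     # one "detector" per delay
best = candidate_delays[np.argmax([coincidences(d) for d in candidate_delays])]
print(best)  # close to the true 0.3 ms delay
```

Note that the relevant timescale here is a fraction of a millisecond, far finer than any firing rate averaged over tens of milliseconds could resolve.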
Ideas about how neural computations can be done can also be examined in an applied context, by seeing how well putative "neural" algorithms perform on problems that biological systems solve well. Associative memory is one such problem. Another is identifying words in natural speech, a difficult computational task in sequence recognition that brains do easily, in this and other sensory modalities. We use speech as a test-bed task for thinking about the computational abilities of neural networks and neuromorphic ideas.
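One classic toy version of the associative-memory problem is a binary attractor network with Hebbian outer-product weights, which recalls a stored pattern from a corrupted cue. The network size, number of patterns, and noise level below are arbitrary choices for illustration:

```python
import numpy as np

# Toy associative memory: store P random binary patterns in an N-neuron
# network via the Hebbian outer-product rule, then recall one of them
# from a noisy cue.
rng = np.random.default_rng(2)
N, P = 100, 3
patterns = rng.choice([-1, 1], size=(P, N))      # stored memories

W = (patterns.T @ patterns) / N                  # Hebbian weights
np.fill_diagonal(W, 0)                           # no self-connections

def recall(cue, steps=20):
    """Iterate the network dynamics until it settles on an attractor."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1                            # break ties deterministically
    return s

# Corrupt a stored pattern by flipping 10% of its bits, then recall it.
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1
out = recall(cue)
print(np.array_equal(out, patterns[0]))          # recall succeeds
```

At this low memory load (3 patterns in 100 neurons), the corrupted cue falls within the basin of attraction of the stored pattern and the dynamics clean it up; sequence-recognition tasks such as speech require extending such static attractor ideas to patterns that unfold in time.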