**Biophysics: Searching for Principles**

**William Bialek**

For several years I have been teaching PHY 562 at Princeton, which is a biophysics course for PhD students in physics, and I have the ambition of turning my lecture notes into a book, to be published by Princeton University Press. This site has the current draft of the book, downloadable all at once or chapter by chapter. I hope this is useful.

I should emphasize that things are still a bit rough. The course changes every time I teach it, and whole sections have never been written up. I try, in spots, to give a hint (in red) about what is missing. I have given the current draft to my editor, but this is just the start of a process, so there is plenty of time for input, and I would appreciate any help you are willing to offer! Please don't hesitate to drop me a note. If you'd like to cite any of the things you find here, I think you can use:

W Bialek, *Biophysics: Searching for Principles*. http://www.princeton.edu/~wbialek/PHY562.html (2011).

For a complete draft, click on the title at the top of the page. For the individual chapters, click on the items below.

Data you will need for the problems can be found here.

**For Spring 2012:** I will teach the course once again. We have twelve weeks of lectures, and so the plan is to spend three weeks on each of the main topics, as indicated by the dates next to the chapter titles. We will choose problems from the text, with one substantial assignment each week. Grades will be based 50% on homework and 50% on the final exam.

**Introduction** (Lecture M 6 Feb)

A. About our subject

B. About this book

C. About this draft

Acknowledgments

**1. Photon counting in vision** (Lectures W 8 Feb through W 22 Feb 2012)


In this Chapter, we will see that humans (and other animals) can detect the arrival of individual photons at the retina. Tracing through the many steps from photon arrival to perception, we will see a sampling of the physics problems posed by biological systems, ranging from the dynamics of single molecules, through amplification and adaptation in biochemical reaction networks, and coding and computation in neural networks, all the way to learning and cognition. For photon counting, some of these problems are solved, but even in this well studied case many problems are open and ripe for new theoretical and experimental work. The problem of photon counting also introduces us to methods and concepts of much broader applicability. We begin by exploring the phenomenology, aiming at the formulation of the key physics problems. By the end of the Chapter I hope to have formulated an approach to the exploration of biological systems more generally, and to have identified some of the larger questions that will occupy us in Chapters to come.
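To make the photon counting claim concrete: if photon absorptions at the retina are Poisson distributed, the probability of "seeing" a dim flash is the probability that the count reaches some threshold. A minimal sketch, not from the text; the threshold K = 6 is illustrative (classic frequency-of-seeing experiments suggest thresholds of roughly five to seven photons):

```python
import math

def p_see(mean_photons, threshold):
    """Probability that a Poisson count with the given mean reaches the
    threshold: P(n >= K) = 1 - sum_{n < K} e^{-m} m^n / n!"""
    return 1.0 - sum(
        math.exp(-mean_photons) * mean_photons**n / math.factorial(n)
        for n in range(threshold)
    )

# Frequency-of-seeing curve: detection probability vs. mean photon count,
# for an illustrative threshold of K = 6 absorbed photons.
for mean in (1, 3, 6, 10, 20):
    print(f"mean = {mean:2d}   P(see) = {p_see(mean, 6):.3f}")
```

Fitting curves of this shape to the observed fraction of "seen" flashes as a function of intensity is what lets one infer the threshold without ever counting photons directly.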

A. Posing the problem

B. Single molecule dynamics

C. Dynamics of biochemical networks

D. The first synapse, and beyond

E. Perspectives

**2. Noise isn't negligible** (Lectures M 27 Feb through W 14 Mar 2012)

In this Chapter, we will take a tour of various problems involving noise in biological systems. Interactions between molecules involve energies of just a few times the thermal energy. Biological motors, including the molecular components of our muscles, move in elementary steps on the nanometer scale, driven forward by energies that are larger than the thermal energies of Brownian motion, but not much larger. Crucial signals inside cells often are carried by just a handful of molecules, and these molecules inevitably arrive randomly at their targets. Human perception can be limited by noise in the detector elements of our sensory systems, and individual elements in the brain, such as the synapses that pass signals from one neuron to the next, are surprisingly noisy. How do the obviously reliable functions of life emerge from under this cloud of noise? Are there principles at work that select, out of all possible mechanisms, the ones that maximize reliability and precision in the presence of noise? I should admit up front that this is a topic that has always fascinated me, and I firmly believe that there is something deep to be found in the exploration of these issues. We will see the problems of noise in systems ranging from the behavior of individual molecules to our subjective, conscious experience of the world. In order to address these questions, we will need a fair bit of mathematical apparatus, rooted in the ideas of statistical physics. I hope that, armed with this apparatus, you will have a deeper view of many beautiful phenomena, and a deeper appreciation for the problems that organisms have to solve.
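The point about signals carried by a handful of molecules can be illustrated with a simulation. If molecules arrive independently at their target, the count in a fixed window is Poisson distributed, and its relative fluctuation is 1/√N; a minimal sketch, assuming only this independent-arrival model:

```python
import math
import random
import statistics

def poisson_count(rng, mean_n):
    """Count independent arrivals in a window: with unit-rate exponential
    inter-arrival times, the number arriving before time `mean_n` is
    Poisson distributed with that mean."""
    n, t = 0, rng.expovariate(1.0)
    while t < mean_n:
        n += 1
        t += rng.expovariate(1.0)
    return n

rng = random.Random(0)
for mean_n in (10, 100, 1000):
    counts = [poisson_count(rng, mean_n) for _ in range(3000)]
    rel = statistics.stdev(counts) / statistics.mean(counts)
    print(f"N ~ {mean_n:4d}   relative fluctuation = {rel:.3f}"
          f"   1/sqrt(N) = {1 / math.sqrt(mean_n):.3f}")
```

With ten molecules the signal fluctuates by about 30% from trial to trial; even a thousand molecules leave a few percent of irreducible noise.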

A. Molecular fluctuations and chemical reactions

B. Molecule counting

C. More about noise in perception

D. Proofreading and active noise reduction

E. Perspectives

**3. No fine tuning** (Lectures M 26 Mar through W 11 Apr 2012)

Imagine making a model of all the chemical reactions that occur inside a cell. Surely this model will have many thousands of variables, described by thousands of differential equations. If we write down this many differential equations with the right general form but choose the parameters at random, presumably the resulting dynamics will be chaotic. Although there are periodic spurts of interest in the possibility of chaos in biological systems, it seems clear that this sort of "generic" behavior of large dynamical systems is not what characterizes life. On the other hand, it is not acceptable to claim that everything works because every parameter has been set to just the right value; in particular, these parameters depend on details that might not be under the cell's control, such as the temperature or the concentration of nutrients in the environment. More specifically, the dynamics of a cell depend on how many copies of each protein the cell makes, and one either has to believe that everything works no matter how many copies are made (within reason), or that the cell has ways of exerting precise control over this number; either answer would be interesting. This problem, the balance between robustness and fine tuning, arises at many different levels of biological organization. In this Chapter we will look at several examples of the fine tuning problem, starting at the level of single molecules and then moving "up" to the dynamics of single neurons, the internal states of single cells more generally, and networks of neurons. As noted at the outset, these different biological systems are the subjects of non-overlapping literatures, and so part of what I hope to accomplish in this Chapter is to highlight the commonality of the physics questions that have been raised in these very different biological contexts.
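The claim that randomly chosen parameters typically give chaos can be checked in a toy model. A minimal sketch, not from the text: a discrete-time network of n units with random Gaussian couplings, x(t+1) = tanh(g·Jx(t)). Tracking two nearby trajectories, the perturbation between them shrinks when the gain g is small but grows exponentially when g is large, the signature of chaos:

```python
import math
import random

def divergence_rate(g, n=60, steps=150, eps=1e-8, seed=1):
    """Average log growth rate of a small perturbation in the random network
    x(t+1) = tanh(g * J x(t)), couplings J_ij ~ N(0, 1/n).  A positive rate
    means nearby trajectories separate exponentially (chaos)."""
    rng = random.Random(seed)
    J = [[rng.gauss(0.0, 1.0 / math.sqrt(n)) for _ in range(n)]
         for _ in range(n)]

    def step(x):
        return [math.tanh(g * sum(Jij * xj for Jij, xj in zip(row, x)))
                for row in J]

    x = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    y = list(x)
    y[0] += eps  # a nearby trajectory
    total = 0.0
    for _ in range(steps):
        x, y = step(x), step(y)
        d = math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y))) or 1e-300
        total += math.log(d / eps)
        # renormalize so the perturbation stays infinitesimal
        y = [xi + eps * (yi - xi) / d for xi, yi in zip(x, y)]
    return total / steps

for g in (0.5, 2.0):
    print(f"g = {g}: divergence rate = {divergence_rate(g):+.3f}")
```

The gain g plays the role of the randomly chosen parameters: nothing about the network was tuned, yet the qualitative behavior flips from orderly to chaotic as g crosses a threshold.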

A. Sequence ensembles

B. Ion channels and neuronal dynamics

C. The states of cells

D. Long time scales in neural networks

E. Perspectives

**4. Efficient representation** (Lectures M 16 Apr through W 2 May 2012)

The generation of physicists who turned to biological phenomena in the wake of quantum mechanics noted that, to understand life, one has to understand not just the flow of energy (as in inanimate systems) but also the flow of information. In 1948, Shannon proved a theorem stating that entropy, which we know and love from statistical physics, is the unique measure of available information consistent with certain simple and plausible requirements. Further, entropy also answers the practical question of how much space we need to use in writing down a description of the signals or states that we observe. This leads to a notion of efficient representation, and in this Chapter we'll explore the possibility that biological systems in fact form efficient representations, maximizing the amount of relevant information that they transmit and process, subject to fundamental physical constraints. The idea that a mathematically precise notion of "information" would be useful in thinking about the representation of information in the brain came very quickly after Shannon's original work. There is, therefore, a well developed set of ideas about how many bits are carried by the responses of neurons, in what sense the encoding of sensory signals into sequences of action potentials is efficient, and so on. More subtly, there is a body of work on the theory of learning that can be summarized by saying that the goal of learning is to build an efficient representation of what we have seen. In contrast, most discussions of signaling and control at the molecular level have left "information" as a colloquial concept. One of the goals of this Chapter, then, is to bridge this gap. Hopefully, in the physics tradition, it will be clear how the same concepts can be used in thinking about the broadest possible range of phenomena. We begin, however, with the foundations.
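Shannon's measure can be made concrete in a few lines. A minimal sketch: the entropy H = −Σ pᵢ log₂ pᵢ of a distribution, in bits, which is exactly the average space needed to write down which state occurred:

```python
import math

def entropy_bits(p):
    """Shannon entropy H = -sum_i p_i log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A uniform distribution over 8 states carries exactly 3 bits,
# while a biased distribution carries less:
print(entropy_bits([1 / 8] * 8))                 # 3.0
print(entropy_bits([0.5, 0.25, 0.125, 0.125]))   # 1.75
```

The second distribution also illustrates the coding interpretation: assigning the codewords 0, 10, 110, 111 to the four states gives an average length of 0.5·1 + 0.25·2 + 0.125·3 + 0.125·3 = 1.75 bits, matching the entropy exactly.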

A. Entropy and information

B. Does biology care about bits?

C. Optimizing information flow

D. Gathering information and making models

E. Perspectives

**5. Outlook: How far can we go?** (not much here)

1. Poisson process

2. Correlations, power spectra and all that

3. Electronic transitions in large molecules

4. Cooperativity

5. X-ray diffraction and biomolecular structure

6. Berg and Purcell, revisited

7. Dimensionality reduction

8. Maximum entropy

9. Measuring information transmission