
Uncertainty, Entropy, and Information

Suppose that a molecular machine has $\Omega$ possible microstates, each with a particular probability $P_i$:

 \begin{displaymath}\sum_{i = 1}^\Omega P_i = 1
\;\;\;\mbox{and}\;\;\;
P_i \geq 0 .
\end{displaymath} (2)

The set of all possible microstates forms a sphere in a high-dimensional space [Schneider, 1991], and $\Omega$ is proportional to the volume of the sphere [Callen, 1985]. Each ``microstate'' represents a particular machine configuration. We may write the uncertainty of the machine's microstates using Shannon's formula [Shannon, 1948; Shannon & Weaver, 1949; Pierce, 1980; Bharath, 1987]:

 \begin{displaymath}H \equiv - \sum_{i = 1}^\Omega P_i \log_2 P_i
\;\;\;\;\;\mbox{(bits per microstate)} .
\end{displaymath} (3)
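
As a concrete illustration (not part of the original derivation), equations (2) and (3) can be evaluated numerically. The Python sketch below defines a hypothetical helper, shannon_uncertainty, that checks the constraints of equation (2) and drops zero-probability terms, since $P_i \log_2 P_i \rightarrow 0$ as $P_i \rightarrow 0$:

\begin{verbatim}
import math

def shannon_uncertainty(probs):
    """Equation (3): H = -sum(P_i * log2(P_i)), in bits per microstate."""
    if any(p < 0 for p in probs):
        raise ValueError("probabilities must be non-negative (equation 2)")
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1 (equation 2)")
    # Terms with P_i == 0 contribute nothing in the limit.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely microstates: H = log2(4) = 2 bits per microstate.
print(shannon_uncertainty([0.25, 0.25, 0.25, 0.25]))  # 2.0
\end{verbatim}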

Likewise, the Boltzmann-Gibbs entropy of a physical system, such as a molecular machine, is

 \begin{displaymath}S \equiv - k_{\mbox{\scriptsize B}}\sum_{i = 1}^\Omega P_i \ln P_i
\;\;\;\;\;
\left( \frac{\mbox{joules}}
{\mbox{$\mbox{K}$ } \cdot \mbox{microstate}} \right)
\end{displaymath} (4)

where $k_{\mbox{\scriptsize B}}$ is Boltzmann's constant $(1.38 \times 10^{-23}$ joules / $\mbox{K}$) [Waldram, 1985; Weast et al., 1988]. Since $\log_2 (x) = \ln(x) / \ln(2)$,

 \begin{displaymath}S = k_{\mbox{\scriptsize B}}\ln(2)
\left[
- \sum_{i = 1}^\Omega P_i \log_2 P_i
\right] .
\end{displaymath} (5)

Substituting equation (3) into (5) gives

 \begin{displaymath}S = k_{\mbox{\scriptsize B}}\ln(2) H .
\end{displaymath} (6)
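
To make the unit conversion of equation (6) concrete, here is a minimal sketch (the helper name entropy_from_uncertainty is ours, not from the paper) that turns an uncertainty in bits into a Boltzmann-Gibbs entropy:

\begin{verbatim}
import math

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def entropy_from_uncertainty(h_bits):
    """Equation (6): S = k_B * ln(2) * H, in joules/(K * microstate)."""
    return K_B * math.log(2) * h_bits

# H = 1 bit corresponds to about 9.57e-24 joules/(K * microstate).
print(entropy_from_uncertainty(1.0))
\end{verbatim}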

The only difference between the uncertainty and the entropy of the microstates of a macromolecule is the unit of measure: bits versus joules per $\mbox{K}$, respectively [von Neumann, 1963; Brillouin, 1962; Rothstein, 1951].

The entropy of a molecular machine may decrease at the expense of an equal or larger increase of entropy in the surroundings. For a decrease in the entropy during a machine operation:

 \begin{displaymath}\Delta S = S_{after} - S_{before}
\;\;\;\;\;
\left( \frac{\mbox{joules}}
{\mbox{$\mbox{K}$ } \cdot \mbox{operation}} \right)
\end{displaymath} (7)

there is a corresponding decrease in the uncertainty of the machine:

 \begin{displaymath}\Delta H = H_{after} - H_{before}
\;\;\;\;\;\mbox{(bits per operation)} .
\end{displaymath} (8)

Using (6) we find:

 \begin{displaymath}\Delta S = k_{\mbox{\scriptsize B}}\ln(2) \Delta H .
\end{displaymath} (9)
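
Equations (7) through (9) can be checked numerically; the sketch below (hypothetical, reusing shannon_uncertainty and entropy_from_uncertainty from the earlier examples) computes $\Delta H$ from before and after distributions and converts it to $\Delta S$:

\begin{verbatim}
def delta_H_and_S(probs_before, probs_after):
    """Equations (8) and (9): Delta H in bits per operation and
    Delta S = k_B * ln(2) * Delta H in joules/(K * operation)."""
    dH = shannon_uncertainty(probs_after) - shannon_uncertainty(probs_before)
    dS = entropy_from_uncertainty(dH)
    return dH, dS

# A machine that starts in four equally likely microstates and ends in one:
dH, dS = delta_H_and_S([0.25] * 4, [1.0])
print(dH, dS)  # -2.0 bits/operation, about -1.91e-23 joules/(K * operation)
\end{verbatim}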

When the uncertainty of a machine decreases during an operation, it gains some information R [Shannon, 1948; Brillouin, 1951b; Tribus & McIrvine, 1971; Rothstein, 1951] defined by:

 \begin{displaymath}R \equiv - \Delta H
\;\;\;\;\; \mbox{(bits per operation)} .
\end{displaymath} (10)

This is the information discussed in [Schneider, 1991] and measured in [Schneider et al., 1986]. It is important to notice that $H_{after}$ is not always zero. For example, a DNA sequence recognizer may accept either purine at some position in a binding site, in which case $H_{after}$ is 1 bit. Thus we cannot equate the information gained (R) with the uncertainty before an operation takes place ($H_{before}$) nor with the uncertainty remaining after an operation has been completed ($H_{after}$). Use of definition (10) avoids a good deal of confusion found in the literature [Popper, 1967a; Popper, 1967b; Wilson, 1968; Ryan, 1972; Ryan, 1975].
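
The purine example can be worked through explicitly with the shannon_uncertainty helper sketched earlier (a hypothetical illustration, not a computation from the paper):

\begin{verbatim}
# Before binding: the four bases A, C, G, T equally likely at the position.
H_before = shannon_uncertainty([0.25, 0.25, 0.25, 0.25])  # 2.0 bits

# After binding: either purine (A or G) is accepted, equally likely.
H_after = shannon_uncertainty([0.5, 0.5])  # 1.0 bit, not zero

# Equation (10): information gained per operation.
R = -(H_after - H_before)
print(R)  # 1.0 bit per operation: the uncertainty dropped from 2 bits to 1
\end{verbatim}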

In particular, the largest possible value of R is obtained when $H_{after}$ is as small as possible (perhaps close to zero) and $H_{before}$ is maximized. The latter occurs only when the symbols are equally likely, in which case equation (3) collapses to $H_{equal} = \log_2 \Omega$. In the same way, if there are $M_y$ symbols, the information required to choose one of them is $\log_2 M_y$. This form was used by Shannon [Shannon, 1949] and in the previous paper of this series to determine the capacity formulas.
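
A quick numerical check (again hypothetical, reusing shannon_uncertainty) that the uniform distribution maximizes equation (3) at $\log_2 \Omega$:

\begin{verbatim}
omega = 8
print(shannon_uncertainty([1.0 / omega] * omega))  # 3.0 bits = log2(8)

# Any unequal distribution over the same 8 microstates gives a smaller H:
skewed = [0.5] + [0.5 / 7] * 7
print(shannon_uncertainty(skewed))  # about 2.40 bits, less than 3.0
\end{verbatim}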

Substituting (10) into (9) gives:

 \begin{displaymath}\Delta S = -k_{\mbox{\scriptsize B}}\ln(2) R .
\end{displaymath} (11)

This equation gives a direct, quantitative relationship between the decrease in entropy of a molecular machine and the information that it gains during an operation [Rothstein, 1951]. We must carefully note that $\Delta S$ in (11) refers only to that part of the total entropy change that accounts for the selection of states made by the machine during an operation. Since R is positive for an operation, this $\Delta S$ is always negative. For the operation to proceed, the total entropy of the universe must increase or remain the same:

 \begin{displaymath}\Delta S_{universe} = \Delta S + \Delta S_{surround} \geq 0.
\end{displaymath} (12)

For example, the equality holds when a solution of EcoRI and DNA is at equilibrium in the absence of Mg++ (omitted to prevent cutting). Priming and machine operations occur, but the entropy of the universe does not increase. In other words, the entropy of the local surroundings must increase to compensate for a molecular machine's entropy decrease during an operation, $\Delta S_{surround} \geq -\Delta S$.
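
A final sketch (hypothetical) ties equations (11) and (12) together: the information gained fixes the machine's entropy decrease, and the surroundings must compensate for the operation to proceed:

\begin{verbatim}
import math

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

def operation_allowed(R_bits, dS_surround):
    """Equation (11): Delta S = -k_B * ln(2) * R for the machine;
    equation (12): Delta S + Delta S_surround >= 0 must hold."""
    dS_machine = -K_B * math.log(2) * R_bits
    return dS_machine + dS_surround >= 0

# Gaining 1 bit requires the surroundings to gain at least
# k_B * ln(2), about 9.57e-24 joules/K; equality holds at equilibrium.
print(operation_allowed(1.0, 9.6e-24))  # True
print(operation_allowed(1.0, 9.0e-24))  # False
\end{verbatim}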

