Next: Other Definitions of Information
Up: Theory of Molecular Machines.
Previous: Overview of the Derivations
Suppose that a molecular machine has $\Omega$ possible microstates,
each with a particular probability $P_i$:

$\sum_{i=1}^{\Omega} P_i = 1$   (2)
The set of all possible microstates forms a sphere in a high dimensional
space [Schneider, 1991], and the number of microstates $\Omega$
is proportional to the volume of the sphere [Callen, 1985].
Each ``microstate'' represents a particular machine configuration.
We may write the uncertainty of the machine's microstates using
Shannon's formula [Shannon, 1948, Shannon & Weaver, 1949, Pierce, 1980, Bharath, 1987]:

$H = -\sum_{i=1}^{\Omega} P_i \log_2 P_i$   (bits per microstate)   (3)
Likewise, the Boltzmann-Gibbs entropy of a physical system,
such as a molecular machine, is

$S = -k_B \sum_{i=1}^{\Omega} P_i \ln P_i$   (joules / K)   (4)

where $k_B$ is Boltzmann's constant
($k_B \approx 1.381 \times 10^{-23}$ joules / K)
[Waldram, 1985, Weast et al., 1988].
Since $\ln P_i = \ln(2) \log_2 P_i$,

$S = -k_B \ln(2) \sum_{i=1}^{\Omega} P_i \log_2 P_i .$   (5)
Substituting equation (3) into (5) gives

$S = k_B \ln(2) \, H$   (joules / K).   (6)
The only difference between uncertainty and entropy
for the microstates of a macromolecule is in the units of measure,
bits versus joules per kelvin, respectively
[von Neumann, 1963, Brillouin, 1962, Rothstein, 1951].
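As a numerical sanity check on equation (6), the uncertainty $H$ and the entropy $S$ can be computed side by side for a small distribution. This is only an illustrative sketch (the distribution is arbitrary, and the SI value of Boltzmann's constant is assumed):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin (assumed SI value)

def shannon_H(p):
    """Uncertainty in bits, equation (3): H = -sum p_i log2 p_i."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_S(p):
    """Boltzmann-Gibbs entropy in J/K, equation (4): S = -k_B sum p_i ln p_i."""
    return -K_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]      # an arbitrary 4-microstate distribution
H = shannon_H(p)                   # 1.75 bits
S = gibbs_S(p)                     # about 1.675e-23 J/K
# Equation (6): S = k_B ln(2) H, up to floating-point rounding.
assert abs(S - K_B * math.log(2) * H) < 1e-37
```

The two formulas differ only by the constant factor $k_B \ln 2$, which is exactly the conversion between bits and joules per kelvin described above.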
The entropy of a molecular machine may decrease
at the expense of a larger increase of entropy in the surroundings.
For a decrease in the entropy during a machine operation:

$\Delta S = S_{\text{after}} - S_{\text{before}} < 0 ,$   (7)
there is a corresponding decrease in the uncertainty of the machine:

$\Delta H = H_{\text{after}} - H_{\text{before}} < 0 .$   (8)
Using (6) we find:

$\Delta S = k_B \ln(2) \, \Delta H .$   (9)
When the uncertainty of a machine decreases
during an operation, it gains some information
$R$ [Shannon, 1948, Brillouin, 1951b, Tribus & McIrvine, 1971, Rothstein, 1951] defined by:

$R \equiv H_{\text{before}} - H_{\text{after}} = -\Delta H$   (bits per operation).   (10)
This is the information discussed in [Schneider, 1991]
and measured in [Schneider et al., 1986].
It is important to notice that $H_{\text{after}}$ is not always zero.
For example, a DNA sequence recognizer may accept either purine (A or G) at some
position in a binding site, in which case
$H_{\text{after}}$ is 1 bit.
Thus we cannot equate the information gained ($R$) with
the uncertainty before an operation takes place ($H_{\text{before}}$),
nor with the uncertainty remaining after an operation
has been completed ($H_{\text{after}}$).
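The purine example can be worked through numerically with equation (10), assuming the four bases are equally likely before the operation (a sketch; the function name is illustrative):

```python
import math

def H_bits(p):
    """Shannon uncertainty in bits, equation (3)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Before: any of the four bases A, C, G, T, assumed equally likely.
H_before = H_bits([0.25, 0.25, 0.25, 0.25])  # 2 bits
# After: the recognizer has accepted a purine, so A or G remain.
H_after = H_bits([0.5, 0.5])                 # 1 bit
# Equation (10): information gained during the operation.
R = H_before - H_after                       # 1 bit
```

Here $R = 1$ bit even though $H_{\text{after}}$ is not zero, which is exactly why $R$ cannot be identified with either $H_{\text{before}}$ or $H_{\text{after}}$ alone.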
Use of definition (10) avoids a good deal of confusion
found in the literature
[Popper, 1967a,Popper, 1967b,Wilson, 1968,Ryan, 1972,Ryan, 1975].
In particular, the largest possible value of $R$ is obtained when
$H_{\text{after}}$ is as small as possible (perhaps close to zero) and
$H_{\text{before}}$ is maximized.
The latter occurs only when the symbols are equally likely, in
which case equation (3) collapses to $H = \log_2 M$, where $M$ is
the number of symbols.
In the same way, if there are $M_y$ symbols, the
information required to choose one of them is $\log_2 M_y$.
This form was used by Shannon [Shannon, 1949]
and in the previous paper of this series to determine
the capacity formulas.
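The collapsed equally-likely form can be evaluated directly; a minimal sketch (the function name is illustrative, not from the paper):

```python
import math

def choice_information(M):
    """Bits required to select one of M equally likely symbols: log2(M)."""
    return math.log2(M)

# One of 4 equally likely DNA bases takes 2 bits;
# one of the 2 purines takes 1 bit.
assert choice_information(4) == 2.0
assert choice_information(2) == 1.0
```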
Substituting (10) into (9) gives:

$\Delta S = -k_B \ln(2) \, R$   (joules / K per operation).   (11)
This equation gives a direct, quantitative relationship between the decrease in
entropy of a molecular machine and the information that it gains
during an operation [Rothstein, 1951].
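For a sense of scale, equation (11) can be evaluated for a one-bit operation; a sketch, again assuming the SI value of $k_B$:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin (assumed SI value)

def machine_delta_S(R_bits):
    """Entropy change of the machine, equation (11): dS = -k_B ln(2) R."""
    return -K_B * math.log(2) * R_bits

# Gaining one bit of information corresponds to an entropy drop of
# about 9.57e-24 joules per kelvin in the machine.
print(machine_delta_S(1.0))
```

The sign is negative for any $R > 0$, in line with the observation that this part of the entropy change is always a decrease for the machine itself.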
We must carefully note that $\Delta S$ in (11)
refers only to that part of the total entropy change that accounts
for the selection of states made by the machine during an operation.
Since $R$ is positive for an operation, this $\Delta S$
is always negative.
For the operation to proceed, the total entropy of the universe
must increase or remain the same:

$\Delta S_{\text{universe}} \geq 0 .$   (12)
For example, the equality holds when a solution of EcoRI and DNA
is at equilibrium in the absence of Mg++, which prevents cutting.
Priming and machine operations occur, but the entropy of the universe
does not increase.
In other words, the entropy of the local surroundings must increase
in compensation for a molecular machine's entropy decrease during an
operation: $\Delta S_{\text{surroundings}} \geq -\Delta S = k_B \ln(2) \, R$.
Tom Schneider
1999-12-24