The statistical definition of entropy in thermodynamics is

S = −k_B ∑_i p_i ln p_i, [2.5]
where:
k_B is Boltzmann's constant
p_i is the probability of the i-th microstate, and the sum runs over all the microstates
If a system is in a macroscopic state that corresponds to Ω equally probable microstates, the
definition of statistical entropy becomes

S = k_B ln Ω. [2.6]
From equation [2.6], it is evident that an increase of entropy means a rise in the number of the acces-
sible microstates, Ω. A microstate is defined by specifying the distribution of structural units (1) in
the available space and (2) among the accessible energetic levels. Therefore, the total entropy is the
sum of two contributions (Denbigh 1981): configurational entropy associated with the structural
and spatial disorder, and thermal entropy associated with the distribution of the structural units
among the accessible energetic levels (see Figure 2.2). A spontaneous process does not necessar-
ily imply a growth of the structural and spatial disorder. For example, the crystallization of an
under-cooled liquid inside an adiabatic vessel is a spontaneous process. It gives rise to the solid
phase, which is spatially more ordered than the liquid one. At first sight, this seems to violate the
Second Law of Thermodynamics. However, the resolution of this apparent paradox comes from the consid-
eration that the crystallization is an exothermic process that releases heat. Therefore, the structural
elements of the crystals have more thermal energy available, and the number of accessible energetic
states increases. In other words, the decrease of configurational entropy is counterbalanced by the
larger increase of thermal entropy.
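A minimal Python sketch (not from the text; the numbers of units and sites are arbitrary illustrative choices) of the configurational contribution: it counts the microstates Ω obtained by placing N indistinguishable structural units on M available sites and evaluates S = k_B ln Ω, showing that the entropy grows as the accessible space grows:

import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def configurational_entropy(n_units, n_sites):
    # Omega = number of ways to place n_units indistinguishable units on n_sites sites
    omega = math.comb(n_sites, n_units)
    return K_B * math.log(omega)        # S = k_B ln Omega (equation [2.6])

# The same 10 structural units spread over progressively larger spaces:
for n_sites in (20, 50, 100):
    print(n_sites, configurational_entropy(10, n_sites))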
We can consider another example to understand that entropy is not merely a synonym for disorder,
but is related to all the degrees of freedom of the structural units. The second example is the
chromosome segregation phenomenon during bacterial cell replication (Jun and Wright 2010).
If we look at Figure 2.3, we may ask ourselves: how is it possible to have segregation of chromo-
somes if the driving force is the increase of disorder? In fact, from studying the behavior of gases,
we know that two species of particles initially separated by a wall mix completely when the
wall is removed (see Figure 2.3a and b). We might expect that when two chromosomes are intermingled (see Figure 2.3c), they maintain this state because the entropy is maximized. Actually, this is not true. The two chromosomes spontaneously segregate, and the final state is the one labeled as (d) in
Figure 2.3. This event can be understood if we imagine long polymer chains as random walks that
cannot cross their own path (the so-called "self-avoiding random walk"). Finding intermixed but
self-avoiding conformations is a tough task. Intermixed chains have fewer conformational degrees
of freedom than segregated ones. Therefore, the driving force of chromosome segregation is the
maximization of entropy, which measures the total degrees of freedom of a system.

FIGURE 2.2 The two contributions to the statistical entropy S = k_B ln Ω (Ω: number of microstates): configurational entropy, associated with the spatial distribution of the structural units, and thermal entropy, associated with the distribution of the structural units among the energetic levels.

FIGURE 2.3 Free mixing of two gases: (a) and (b) are the initial and the final states, respectively. Segregation of two polymers: (c) and (d) are the initial and final states, respectively.
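A minimal Python sketch (not from the text; the walk length is an arbitrary illustrative parameter) of how the self-avoidance constraint cuts down the number of accessible conformations: it enumerates short walks on a square lattice, with and without the requirement that the path never revisits a site:

def count_walks(n, path=((0, 0),), self_avoiding=True):
    # Recursively count n-step walks on the square lattice starting at the origin.
    if len(path) == n + 1:
        return 1
    x, y = path[-1]
    total = 0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        step = (x + dx, y + dy)
        if self_avoiding and step in path:
            continue  # discard conformations in which the chain crosses its own path
        total += count_walks(n, path + (step,), self_avoiding)
    return total

for n in range(1, 8):
    print(n, count_walks(n, self_avoiding=False), count_walks(n))
    # unrestricted walks grow as 4^n; self-avoiding walks grow much more slowly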
According to the Second Law of Thermodynamics, in an isolated system the value of Ω remains constant
during a reversible transformation and rises during an irreversible one. The system evolves
spontaneously towards the state comprising the largest number of microstates (Ω_f):

S_f − S_i = k_B ln (Ω_f / Ω_i). [2.7]
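As a simple illustration of equation [2.7] (a standard textbook example, not drawn from this section): in the free adiabatic expansion of N ideal-gas molecules into a container of twice the volume, each molecule has twice as many positions available, so Ω_f/Ω_i = 2^N and ΔS = k_B ln 2^N = N k_B ln 2 > 0, which is why the expansion is irreversible.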
When the number of possible microstates increases due to irreversible transformations, we lose
information about the state of the system at the molecular level. To avoid this loss of knowledge,
we should know the position and the momentum of every particle at a certain instant of time, and
then predict their evolution in time, knowing the laws that describe their dynamics. However, such
detailed knowledge is out of our reach, because when we try to retrieve information about the
microscopic world, we must account for the Indetermination Principle of Heisenberg. 2 From the
statistical definition of entropy, it is evident that the Second Principle of Thermodynamics stems
from our inability to know the exact mechanical state of our system at the microscopic level.
This unavoidable uncertainty prevents us from transforming heat entirely into work without causing
modifications in any other system. In fact, heat is a “disordered” form of energy, whereas work is a
more “ordered” form of energy.
Finally, the statistical interpretation of the Second Principle shows us that irreversibility is a
matter of probability. It is not impossible that our glass, fallen from our table and shattered in many
pieces on the floor, goes back intact to its initial place. It is only a matter of probability and hence
a matter of time. If we waited enough time, we could witness the unlikely event of fragments of
our glass recomposing themselves on the table. The instant in which this may happen is completely
unpredictable. In other words, the statistical interpretation of the Second Principle tells us that it
is not impossible that an isolated system moves away from its equilibrium state. It is a matter of
probability, and such probability largely decreases with the growth of the number of particles
constituting the system itself.
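To get a feeling for how strongly this probability shrinks with the number of particles (an illustrative calculation, not from the text): the probability that N gas molecules spontaneously gather in one half of their container is (1/2)^N, which is already about 8 × 10^−31 for only N = 100 molecules and, for a mole of gas, of the order of 10^(−10^23); this is why such fluctuations are never observed at the macroscopic scale.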
2 According to the Indetermination Principle of Heisenberg, it is impossible to determine accurately and simultaneously the position and momentum of a particle.
2.2.3 The Logical Definition of Entropy
In the twentieth century, with the revolution of wireless telecommunications and the appearance
of the first computers, a new era began: The Information Age. In fact, communication devices and
computers receive, store, process, and send information.
In 1948, the American engineer and mathematician Claude Shannon wrote a paper titled
“A Mathematical Theory of Communication,” which is a founding work for the field of Information
Theory. In this key paper, Shannon proposed concepts useful to theorize and devise communication
systems. The communication of information involves five essential elements (see Figure 2.4). The
primary component is the information source producing a message. The message is transformed into
a signal by the transmitter. The signal is a series of symbols and is sent by the transmitter through a
medium which is named as the communication channel. A receiver collects the signal crossing the
communication channel and reconstructs the message intended to be received by the destination.
The signal consists of a series of symbols, conveying a certain amount of information. If the
signal is encoded in binary digits, 0s and 1s, the basic unit amount of information becomes the bit.
To distinguish between two symbols, we need to receive one bit. To distinguish among 2^n symbols,
we need to retrieve n bits (see Table 2.2).
The quantity of information (I) collected by the receiver can be quantified by determining the
difference between the uncertainty before (H_bef) and after (H_aft) receiving the signal:

I = H_bef − H_aft. [2.8]
The uncertainty is given by equation [2.9]:

H = −∑_{i=1}^{M} p_i log2 p_i, [2.9]

where:
M is the number of symbols
p_i is the probability of the i-th symbol
FIGURE 2.4 Schematic representation of a communication system: information source → transmitter → communication channel → receiver → destination.
TABLE 2.2
Relation Between States and Bits

2 states (1 bit):  State A: 0;  State B: 1
4 states (2 bits): State A: 00; State B: 01; State C: 10; State D: 11
8 states (3 bits): State A: 000; State B: 001; State C: 010; State D: 100; State E: 110; State F: 101; State G: 011; State H: 111
If the M symbols are all equally likely before receiving the signal, it follows that

p_i = 1/M [2.10]

H_bef = log2 M. [2.11]

If, after receiving the signal, the transmitted symbols are perfectly known, then

H_aft = 0 [2.12]

I = log2 M. [2.13]
Equation [2.13] represents the maximum value of the information we can get in this case. 3 On the
other hand, if there is noise over the communication channel, perturbing the transmission of the
signal, H_aft ≠ 0 and I < log2 M.
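A minimal Python sketch (not part of the original text; the number of symbols and the residual probabilities are arbitrary illustrative values) of equations [2.9] through [2.13]: it computes the uncertainty H for M equally likely symbols and the information I received in a noiseless and in a noisy case:

import math

def shannon_uncertainty(probs):
    # H = -sum p_i log2 p_i, in bits (equation [2.9])
    return -sum(p * math.log2(p) for p in probs if p > 0)

M = 8
H_bef = shannon_uncertainty([1 / M] * M)        # log2 M = 3 bits (equation [2.11])
H_aft_noiseless = 0.0                           # the symbol is perfectly known (equation [2.12])
H_aft_noisy = shannon_uncertainty([0.9, 0.1])   # assumed residual doubt between two symbols

print(H_bef - H_aft_noiseless)                  # 3.0 bits = log2 M (equation [2.13])
print(H_bef - H_aft_noisy)                      # less than 3.0 bits because of the noise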
Once the information is received, it can be processed by using a computer. Current computers are
based on electronic circuits and the so-called Von Neumann architecture that was proposed for the
first time in the mid-forties of the twentieth century (Burks et al. 1963). Four main elements constitute
a computer (see Figure 2.5). First, a memory for storing information. Second, a Central Processing Unit (CPU) that processes the information. The CPU consists of two subunits: the Arithmetic Logic Unit
(ALU), which performs arithmetic and logical operations, and the Control Unit (CU), which extracts
instructions from memory, decoding and executing them. The third element of a computer is the
Information Exchanger (IE) allowing for the transfer of information in and out of the computer; it
consists of a screen, keyboards, et cetera. Finally, the fourth element is the data and instructions Bus,
working as a communication channel and binding the other three components of the computer by
conveying information among them. Electronic computers are general-purpose computing machines
because the instructions for computation are stored in the memory as sequences of bits, that is, as software.
FIGURE 2.5 Schematic structure of a computer having the Von Neumann architecture.
3 If the M symbols are not equally probable, I < log2 M.
The information inside the computer is encoded in electrical signals, and transistors act
as the basic switching elements of the CPU. One of the main tasks of Information Technology is
to devise ever more powerful computers, capable of processing larger amounts of information at
increasingly higher speeds but with lower power consumption, volume, and price.
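A minimal, purely illustrative Python sketch (the instruction set, the addresses, and the program are invented for this example and are not taken from the text) of the Von Neumann idea: a single memory holds both the instructions and the data, a control-unit loop fetches, decodes, and executes, and an accumulator plays the role of the ALU register:

def run(memory):
    acc, pc = 0, 0                       # accumulator (ALU register) and program counter (CU)
    while True:
        op, arg = memory[pc]             # fetch the next instruction from memory
        pc += 1
        if op == "LOAD":                 # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program and data share the same memory: addresses 0-3 hold the code, 4-6 hold the data.
memory = {0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
          4: 2, 5: 3, 6: 0}
print(run(memory)[6])                    # prints 5, the sum stored back into memory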
Computers are physicochemical systems. The laws of physics and chemistry dictate what they
can and cannot do. The amount of information that a physical system can store and process is
related to the number of distinct physical states that are accessible to the system. A system with Ω
accessible states can register log2Ω bits of information. If we keep in mind the statistical definition
of entropy (equation [2.6]), it is evident that information and entropy are intimately linked. The
laws of thermodynamics play key roles in computation. When the information of one bit goes into
observable degrees of freedom of the computer, such as another bit, then it has been moved and not
erased; but if it goes into unobservable degrees of freedom such as microscopic random motion of
molecules, it results in an increase of entropy of k_B ln 2 (Lloyd 2000). The rate at which a computer
can process information is limited by the maximum number of distinct states that the system can
pass through per unit of time. It has been demonstrated (Margolus and Levitin 1998) that an isolated quantum system with average energy E takes a time of at least ∆t = πℏ/(2E) to evolve to a distinct (orthogonal) state. From this relation, it follows that if E_l is the energy allocated to the l-th logic gate of a computer, the total number of logic operations performed per second is the sum over all logic gates of the operations per second per gate:

∑_l (1/∆t_l) ≤ ∑_l 2E_l/(πℏ) = 2E_tot/(πℏ). [2.14]
In other words, the rate at which a computer can compute is limited by the total amount of energy
available: E_tot.
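As a numerical illustration of this bound (a sketch, not from the text): the lines below evaluate 2E/(πℏ) for one joule of energy and for the rest energy of one kilogram of matter; the latter reproduces the often-quoted figure of roughly 5 × 10^50 logic operations per second for a one-kilogram computer (Lloyd 2000):

import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s

def max_ops_per_second(energy_joules):
    # Margolus-Levitin bound: at most 2E/(pi*hbar) distinct states per second
    return 2 * energy_joules / (math.pi * HBAR)

print(f"{max_ops_per_second(1.0):.2e}")         # ~6e33 operations per second for 1 J
print(f"{max_ops_per_second(1.0 * C**2):.2e}")  # ~5.4e50 operations per second for E = mc^2 of 1 kg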
The theory of information shows that information is strictly related to the statistical definition of
entropy (compare equation [2.9] with equation [2.5]). The analogy between H and S is not simply
formal, but also semantic. In fact, if a macroscopic system is in a certain thermodynamic state,
entailing Ω_i equally probable microstates, the uncertainty we have about it is proportional to the
logarithm in base two of Ω_i. If this system undergoes an irreversible transformation, it will end up
in a new state, entailing a larger number (Ω_f) of equally likely microstates. After the spontaneous
event, the information (equation [2.8]) we have about the system is

I = log2 (Ω_i / Ω_f). [2.15]
This amount of I is negative: after an irreversible transformation, we have lost information about the
microscopic state of the system. Note that if we multiply equation [2.15] by −k_B ln 2, we obtain equation [2.7], that is, the statistical definition of ΔS. Shannon confirmed, through his theory of communi-
cation, that an increase of entropy corresponds to a loss of information about the microscopic state of
the system. The strict link between the Second Principle of Thermodynamics and the Mathematical
Theory of Information is confirmed by the law of “Diminishing Information” (Kåhre 2002). This
law states that if we consider the transmission of information in a chain A → B → C, the information that C has about A, I(C@A), is less than or at most equal to the information B has about A, I(B@A):

I(C@A) ≤ I(B@A). [2.16]
Compared to direct reception, an intermediary can only decrease the amount of information. As in
any effort to produce work, and likewise in any attempt at sending information, there are unavoid-
able sources of loss.
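A minimal Python sketch (not from the text; the binary symmetric channel and its flip probabilities are assumptions chosen only for illustration) of the law of Diminishing Information: each stage A→B and B→C flips a transmitted bit with a given probability, and the information about A that survives the two stages is never larger than the information available after the first stage:

import math

def h2(p):
    # binary entropy function, in bits
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_information(p):
    # information a receiver has about a uniform binary source sent over a channel that flips bits with probability p
    return 1.0 - h2(p)

p_ab, p_bc = 0.1, 0.1                         # assumed flip probabilities for A->B and B->C
p_ac = p_ab * (1 - p_bc) + p_bc * (1 - p_ab)  # effective flip probability of the chain A->B->C

print(bsc_information(p_ab))   # I(B@A) ~ 0.53 bits
print(bsc_information(p_ac))   # I(C@A) ~ 0.32 bits, never larger than I(B@A)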
2.3 AN EXHAUSTING FIGHT AGAINST ENTROPY
So far, we have learned that spontaneous processes bring about an increase in the entropy in our
universe. The growth of entropy means dissipation of energy and/or loss of information. Now, we
ask ourselves:
“Is it possible to violate the Second Principle of Thermodynamics?”
This question was also raised by Isaac Asimov in his original story titled The Last Question pub-
lished in 1956. In this science-fiction short story, Asimov playfully predicts human history over
the next several trillion years. He imagines humans devising ever more powerful computing
machines. One of these, called Multivac, helps humankind to solve the energy issue by unveiling
how to exploit solar energy. Armed with this discovery, humans colonize the Universe in the
coming millennia, because they can harvest and exploit the energy of the stars. However, one great
challenge remains. How can the net amount of entropy of the Universe be massively
decreased? They pose this question to their sharpest computers, but the only answer they receive is:
“THERE IS INSUFFICIENT DATA FOR A MEANINGFUL ANSWER.”
Actually, Maxwell, Boltzmann, and Gibbs found an answer in their theory of statistical thermody-
namics. In fact, they told us that it is possible to witness a decrease in entropy. It is just a matter of
waiting for the right time; in other words, it is just a matter of probability. The more spontaneous a
process, the less probable its reverse, and the longer the time we need to wait, on average. Aware of
this, but not satisfied by this random way of violating the Second Principle of Thermodynamics, we
