FIGURE 1.6 Four epistemological pillars guiding and inspiring the scientific inquiry during the Experimental Period. (Figure labels: Reductionism, Universality, Determinism, Mechanism, Simplicity, Uniformity, Predictability, Reversibility.)

17 Pierre-Simon, marquis de Laplace (1749–1827) was a French mathematician and astronomer whose work was crucial for the development of mathematical astronomy and statistics. He formulated Laplace’s equation and the Laplace transform. The widely used Laplacian differential operator is also named after him.


of a clock or other automaton follow from the arrangement of its counter-weights and wheels,” as René Descartes stated in his Treatise of Man in 1664. All activities and qualities of bodies are reduced to quantitative realities, i.e., mass and motion. From this idea, a picture of a universe hosting just reversible transformations emerges. All motions are reversible: when a moving object has covered the distance from A to B, we at once imagine that it can go back over the path from B to A. If, therefore, everything that happens is motion, it is clear that any event in nature should occasionally retrace its march.

The picture of a reversible universe faded during the Industrial Revolution, in the nineteenth century AD. During this century, there was a change from an agrarian, handicraft economy to one dominated by industry and machine manufacture. The entrepreneurs of that time tried in every way to optimize the functioning of their thermally or electrically powered machines. They clashed with the impossibility of converting all the available heat into work, and they could not avoid the degradation of part of the mechanical or electrical energy into heat. Industrial processes, like most events in the Universe we stumble across, are irreversible: energy is continuously squandered into heat. In the original scientific method proposed by Galileo and Newton, consisting in a “sublimation” of empirical phenomena before their mathematical description and interpretation, the irreversible character of natural events was simply ignored. To interpret the empirical irreversibility, a new theory was formulated: Thermodynamics. Its Second Principle introduces a new state variable, Entropy,18 and asserts that the Entropy of the Universe increases relentlessly due to irreversible processes. After the formulation of Thermodynamics, the epistemological pillar of Mechanism was still maintained, but in a revised form. It remains valid that the Universe looks like a machine, but in most cases it works irreversibly, due to friction and other energy-dissipation processes.
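In compact form (the standard Clausius statement, added here for reference; the book’s own notation may differ): for a system exchanging an infinitesimal amount of heat δq at absolute temperature T,

\[ \mathrm{d}S \;\ge\; \frac{\delta q}{T}, \qquad\text{so that}\qquad \Delta S_{\mathrm{universe}} \;=\; \Delta S_{\mathrm{system}} + \Delta S_{\mathrm{surroundings}} \;\ge\; 0, \]

with equality holding only for ideal reversible transformations; every real, irreversible process makes the total entropy grow.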

The existence of an “Arrow of Time” in nature was corroborated by biology with the development of the Theory of Evolution. The theory is attributed to Charles Darwin (1809–1882), but it was “in the air” in the nineteenth century (Mitchell 2009). Most likely, life on earth comes from one form, called LUCA, i.e., the Last Universal Common Ancestor, which evolved into the myriad life forms we know nowadays. Since “there is a frequently recurring struggle for existence, it follows that any being, if it varies however slightly in any manner profitable to itself, under the complex and sometimes varying conditions of life, will have a better chance of surviving, and thus be naturally selected. From the strong principle of inheritance, any selected variety will tend to propagate its new and modified form” (as stated by Darwin in his famous book,19 On the Origin of Species, published in 1859). Evolutionary processes, governed by natural selection, have given rise to the diversity of species. Life on earth has its history: the simple early forms transformed into ever more various, beautiful, and wonderful ones (paraphrasing Darwin) over the millennia.
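Natural selection is, at bottom, an algorithm: heritable variation plus differential reproduction. A minimal Python sketch (purely illustrative, not from the book; here fitness is simply the number of 1-bits in a genome) shows Darwin’s “profitable” variations accumulating over generations:

import random

random.seed(1)
GENOME_LEN, POP_SIZE, MUTATION_RATE = 20, 30, 0.02

def fitness(genome):
    # A toy measure of how "profitable" a variety is: count of 1-bits.
    return sum(genome)

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(51):
    if generation % 10 == 0:
        avg = sum(fitness(g) for g in population) / POP_SIZE
        print(f"generation {generation:2d}: mean fitness = {avg:4.1f}")
    # Selection: fitter genomes are more likely to reproduce...
    parents = random.choices(population,
                             weights=[fitness(g) + 1 for g in population],
                             k=POP_SIZE)
    # ...with inheritance plus slight random variation (mutation).
    population = [[bit ^ (random.random() < MUTATION_RATE) for bit in p]
                  for p in parents]

Mean fitness climbs steadily toward the maximum, even though each individual variation is tiny and random.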

In the first half of the twentieth century AD, new scientific theories blossomed. They brought about a reworking of two of the pillars shown in Figure 1.6. The first new theory was formulated by Albert Einstein20 (1879–1955). The Theory of Relativity (Einstein 1916) describes the behavior of bodies moving at very high speed, close to that of light, which propagates in empty space at 3 × 10⁸ m/s (that is, at 1.1 × 10⁹ km/h).

18 The new state function Entropy was introduced by Rudolf Clausius (1822–1888), a German physicist and mathematician, in 1865. The word derives from the Greek “έν τροπή,” meaning “in transformation” or “in change.”

19 The full title of Darwin’s book was On the Origin of Species by Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life. In this book, Darwin expounded the ideas he had inferred during his long scientific expedition around the world on board the Beagle. The content of Darwin’s book was as revolutionary as the De Revolutionibus orbium coelestium by Copernicus. Copernicus reorganized the spatial order of things and assigned to the Earth, and hence to human beings, a position that is not the center of the Universe. On the other hand, Darwin reorganized the temporal order of life on earth and overthrew human beings from their supposed prominent position.

20 Albert Einstein may be considered one of the utmost Philo-physicists of the twentieth century. Time Magazine defined him as the “Person of the Century”: “unfathomably profound, the genius among the geniuses.”


According to the Theory of Relativity, the description of mechanical phenomena depends on the properties of the reference system. Although the natural laws are still universal, time is no longer uniform and absolute as Newton thought. Its flow depends on (1) the velocity of the system21 and (2) the strength of the gravitational field. Even the lengths of objects change in relation to their motion22 and have been predicted to contract in gravitational fields. As soon as the Theory of Relativity was experimentally confirmed, the pillar of Universality was permanently weakened with respect to the power it had before: the features of space and time are context dependent. A universal space and a universal time do not exist. What exists is a spacetime that can be curved by gravitational fields.
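Point (1) can be made quantitative with the standard special-relativistic time-dilation formula (added here for illustration; it is not printed in the text):

\[ \Delta t \;=\; \frac{\Delta t_0}{\sqrt{1 - v^2/c^2}}, \]

where Δt₀ is the duration of a phenomenon measured in its own rest frame and Δt is the duration measured by an observer who sees it move at speed v. At v = 0.99c, a process lasting one second in its rest frame appears to last about seven seconds.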

The second revolutionary scientific theory formulated in the first half of the twentieth century was Quantum Mechanics. It describes the behavior of microscopic bodies, such as molecules, atoms, and subatomic particles, moving at ordinary velocities. Its foundations were established by many scientists: Max Planck (1858–1947), Niels Bohr (1885–1962), Werner Heisenberg (1901–1976), Louis de Broglie (1892–1987), Erwin Schrödinger (1887–1961), Max Born (1882–1970), Albert Einstein, John von Neumann (1903–1957), Paul Dirac (1902–1984), Wolfgang Pauli (1900–1958), and David Hilbert (1862–1943), among others. Monumental discoveries in its early development were the concept of “Quantization” of physical variables and the “Uncertainty Principle.” The latter asserts a fundamental limit on the accuracy with which certain pairs of physical properties of a particle, such as position and momentum, can be simultaneously determined. It means that it is impossible to measure the present position and, at the same time, determine the future motion of a particle with unlimited accuracy. This Principle struck a hard blow to the epistemological pillars of Determinism and Predictability. Laplace’s dream of predicting the future through the laws of physics and the determination of the instantaneous state of all the particles in the Universe was shattered irreparably. In the natural world, particles normally follow an uncertain, non-deterministic, “blurred” path, and their evolution can be predicted only in probabilistic terms.
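In its standard modern form (the Kennard bound, stated here for reference; the book gives only the qualitative claim), the position-momentum uncertainty relation reads

\[ \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}, \]

where ħ is the reduced Planck constant: squeezing the uncertainty in position necessarily inflates the uncertainty in momentum, so a trajectory with both sharply defined at once is impossible.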

The dream of predicting natural phenomena had been shattered even earlier, at the end of the nineteenth century, and for macroscopic bodies, by Henri Poincaré (1854–1912).23 He found out that a system as simple as three orbiting bodies exhibits a dynamic that is aperiodic and extremely sensitive to the initial conditions. In essence, Poincaré was the first Philo-physicist to experience Deterministic Chaos. In fact, Deterministic Chaos appears when a deterministic system (whose dynamics can be described by differential equations) exhibits aperiodic behavior that depends sensitively on the initial conditions, thereby rendering long-term prediction impossible. “If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of that same universe at a succeeding moment. But even if it were the case that the natural laws had no longer any secret for us, we could still only know the initial situation approximately,” as Poincaré (1908) stated in his essay “Science and Method.” In fact, the instrumental determination of the value of any variable is affected by an unavoidable uncertainty. Therefore, as Poincaré continues, “it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon.” With the advent of electronic computers in the 1950s, the solutions of non-linear differential equations, even ones far more complex than those formulated by Poincaré, could be determined numerically and in a reasonable time. Therefore, many computational experiments were performed.

  21 The duration of a phenomenon is longer in a moving system than in one at rest; moreover, the simultaneity of two events depends on the motion of the observer.

22 At speeds close to that of light, objects appear shorter than they do when stationary or moving at ordinary speeds.

23 Henri Poincaré was a French mathematician, theoretical physicist, engineer, and philosopher of science. He was indeed a polymath, and he was the first person to discover a chaotic deterministic system, a discovery that laid the foundations of modern chaos theory.


One of these experiments was particularly relevant. It was carried out by the meteorologist Edward Lorenz (1917–2008). He formulated a simplified model of the dynamics of the terrestrial atmosphere to make weather forecasts. He described the dynamics of the terrestrial atmosphere by a system of three non-linear differential equations and re-discovered deterministic chaos. Lorenz found that if he started his simulations from two slightly different initial conditions, the resulting dynamical behaviors would soon become completely different. This result implies that the weather is intrinsically unpredictable: tiny uncertainties in defining the initial conditions of the atmosphere are amplified rapidly, eventually ruining the forecasts. The term “Butterfly Effect” was coined to refer to this sensitive dependence on the initial conditions in the dynamics of non-linear chaotic systems.
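In their standard form (with the classic parameter values σ = 10, ρ = 28, β = 8/3; the equations are the well-known ones, though the book does not print them on this page), Lorenz’s three equations read

\[ \dot{x} = \sigma\,(y - x), \qquad \dot{y} = x\,(\rho - z) - y, \qquad \dot{z} = x\,y - \beta\,z. \]

The following minimal Python sketch (illustrative only, not Lorenz’s original program) integrates the system from two initial states differing by one part per million in x and prints their growing separation:

import math

# Classic Lorenz (1963) parameters.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def lorenz(state):
    # Right-hand side of the Lorenz system: (dx/dt, dy/dt, dz/dt).
    x, y, z = state
    return (SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z)

def rk4_step(state, dt):
    # One classical fourth-order Runge-Kutta integration step.
    k1 = lorenz(state)
    k2 = lorenz(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = lorenz(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = lorenz(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Two initial conditions differing by one part per million in x.
a, b = (1.0, 1.0, 1.0), (1.000001, 1.0, 1.0)
dt = 0.01
for step in range(1, 3001):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    if step % 500 == 0:
        print(f"t = {step * dt:5.1f}   separation = {math.dist(a, b):12.6f}")

The separation grows roughly exponentially until it saturates near the diameter of the attractor, at which point the two simulated “weather histories” are completely unrelated.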

The first half of the twentieth century was dramatic not only for the natural sciences but also for mathematics. In the year 1900, at the International Congress of Mathematicians in Paris, the German mathematician David Hilbert (1862–1943) presented a list of unsolved problems in mathematics. The most important ones were those regarding mathematics itself and what can be proved by using mathematics. They can be summarized in three fundamental questions (Mitchell 2009): (1) Is mathematics complete? (2) Is mathematics consistent? (3) Is every statement in mathematics decidable?

In the nineteenth century, mathematics was dominated by the axiomatic methodology. According to the axiomatic approach, any branch of mathematics must start from the formulation of a series of fundamental assumptions, the axioms, and then generate all relevant statements by logical deduction. Therefore, the concept of “truth” is reduced to the concept of “provable from the axioms.” Of course, the success of this methodology depends on the ability to formulate “good axioms.” A mathematical statement is acceptable as an axiom if it is simple and elementary enough to be considered “obviously true” (Devlin 2002). Mathematics is complete if every mathematical statement can be proved or disproved from a given finite set of axioms. Mathematics is consistent if only true statements can be proved. Finally, every statement in mathematics is decidable if there is a definite procedure that tells us in finite time whether a statement is true or false. Until 1930, the three fundamental questions remained unanswered, but Hilbert was confident that the answers would be three “yes.” In fact, Hilbert and many others were convinced they were on the verge of discovering an automatic way to prove or disprove any mathematical statement. However, Hilbert, along with the entire community of mathematicians and philosophers, was astounded when a twenty-five-year-old mathematician named Kurt Gödel (1906–1978) presented a proof of the so-called Incompleteness Theorem.24 This theorem states that if mathematics is consistent, then it is incomplete: there are true statements that cannot be proved. If it were inconsistent, then there would be false statements that could be proved, and the entire building of mathematics would crash down.25 Therefore, the answer to question (1) is “no.” But the surprises had not ended there. In 1935, the twenty-three-year-old Alan Turing (1912–1954), a graduate student at Cambridge, demonstrated that the answer to question (3) was “no” as well. Turing conceived a computing machine that could solve any computational problem for which an algorithm could be devised. He found that there are mathematical problems that cannot be solved by his machine (known as the Turing Machine), and it follows that no other automatic computer can solve them either. They are problems for which algorithms cannot be written, even in principle. The example studied by Turing was the problem of predicting whether a Turing Machine, once it is set in motion, will ever finish its calculation and halt. By analyzing this problem, Turing was able to demonstrate that there can be no general procedure for telling whether mathematical propositions are true or false.

  24 Gödel used arithmetic to demonstrate his “Incompleteness Theorem.”

25 The proof given by Gödel is complicated, but it can be intuitively understood through the sentence (Mitchell 2009): “This statement is not provable.” Let us call this sentence “statement S.” Suppose that statement S could indeed be proved. Then it would be false, because it states that it cannot be proved. That would mean that a false statement could be proved, and mathematics would be inconsistent. If, instead, we assume that statement S cannot be proved, then statement S is true. In that case, mathematics is consistent, but it is incomplete, because there is a true statement that cannot be proved. Therefore, mathematics is either inconsistent or incomplete.


A result of Turing’s work was to partition all possible mathematical problems into two sets. One set contains all those problems for which algorithms can never be written; they are unsolvable. The other set includes all those problems that can be solved by algorithms (Lewis and Papadimitriou 1978). Some of the solvable problems are tractable, but others are intractable because they cannot be solved both accurately and in a reasonable time.
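The core of Turing’s argument can be sketched in a few lines of modern code (a standard rendering of the diagonal argument, not the book’s own presentation; the names halts and paradox are hypothetical, and halts is precisely the function shown not to exist):

# Suppose, for contradiction, that a total oracle existed:
# halts(f, x) returns True exactly when f(x) eventually halts.
def halts(f, x):
    ...  # hypothetical; Turing's argument shows it cannot be written

def paradox(f):
    # Do the opposite of whatever the oracle predicts about f run on itself.
    if halts(f, f):
        while True:   # the oracle said "halts", so loop forever
            pass
    else:
        return        # the oracle said "loops forever", so halt at once

# Now consider paradox(paradox):
# - if halts(paradox, paradox) is True, then paradox(paradox) loops forever;
# - if it is False, then paradox(paradox) halts immediately.
# Either answer makes the oracle wrong, so no such halts can exist.

Either answer the oracle could give about paradox(paradox) is wrong; hence no general halting decider, and therefore no general decision procedure for mathematics, can exist.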

Just as Quantum Mechanics and Chaos Theory shattered Laplace’s dream of the unlimited predictive power of science, Gödel’s and Turing’s results shattered Hilbert’s dream of the unlimited computing power of mathematics.

In the last decades of the twentieth century, among the epistemological pillars depicted in Figure 1.6, the only one that had not been debunked by the evolution of scientific theories was the one sustaining the Simplicity of nature, which inspired the reductionist approach. The reason lies in the kind of systems that had been investigated until then. In the seventeenth, eighteenth, and nineteenth centuries, scientists faced problems of “simplicity,” involving only two or a few variables. In the first half of the twentieth century, scientists also faced problems of so-called “disorganized complexity,” involving billions of variables (Weaver 1948). They developed powerful techniques of probability theory and statistical mechanics to rationalize the behavior of “disorganized complex” systems.26 In the second half of the twentieth century, scientists turned their attention to problems of “organized complexity.” Organized Complexity is a peculiarity of systems that are in out-of-equilibrium conditions and are constituted of many strongly interacting elements.27 Such systems exhibit emergent properties. Emergent properties are characteristics that are not directly traceable to the system’s components, but rather to how those parts interact within the system as a whole. For instance, a living organism can be dissected into its elementary constituents, biopolymers and other molecules, but in dissecting it, we kill it, and we cannot learn what gives it life. The out-

 
