A reading of Vlatko Vedral’s Decoding Reality, a treatise on the physical nature of information and its role in just about everything, gives a novel spin on an old question in evolution: is there a natural tendency for life to increase in complexity and biodiversity? There have been arguments for and against this idea.
Some scientific heavyweights, like the late Stephen Jay Gould, suggested an opposite trend. He viewed the fantastical array of morphology and extinct taxa in the Cambrian period (some 530 mya), as revealed by the Burgess Shale fossils, as evidence that diversity has in fact decreased. It seemed that most modern phyla appeared early in Earth's history, and we've since lost many of the most bizarre taxa. (The fossils were so weird-looking that early discoverers gave new taxa names such as Hallucigenia. Researcher Simon Conway Morris is said to have opened a box of fossils and exclaimed, “oh fuck, not another new phylum!”) The suggestion was that the greatest variety of life existed early on and has since gone through a winnowing process.
Hallucigenia sparsa, an extinct animal from the Cambrian Period
However, more recent assessments of the Burgess Shale, and literal rearrangements of body parts, suggest that the past wasn't so weird, and many of the Cambrian freaks are actually related to extant phyla. Furthermore, most quantifications of the complexity of life, such as the number of fossil species plotted over time, show an unambiguous upward trend (despite several mass-extinction events).
Philosophers have also weighed in on the question of life’s increasing complexity, and done so from some fairly basic principles, such as the Second Law of Thermodynamics: the tendency for entropy to increase, i.e., for a closed system to go from organized to disorganized. For example, hot and cold want to equalize, great artworks weather and fade, litter scatters in the wind, and my desktop gets more cluttered.
Life seems pretty miraculous when compared to this fundamental law. Most creatures seem hell-bent on moving matter around into orderly patterns. Arguably, a basic definition of life may be the maintenance and control of orderly gradients across membranes, which otherwise want to equalize and lose their dynamism. Your very neurons signal through changes in concentration gradients of ions such as Na+, K+, Cl-, and Ca2+, and maintaining those gradients is expensive: the brain consumes about 25% of a human's glucose.
Vedral offers an interesting take on this seeming contradiction between the universal law of disorderliness and life’s tendency to increase in complexity. He does it by linking entropy with information, the key being that both have the same basic mathematical form, thanks to Claude Shannon's work at Bell Labs during the 1940s. The information content of an event is the negative logarithm of its probability. This is both physical and intuitive. Consider the news: a report of an Icelandic volcano eruption grounding all European air traffic is more newsworthy (and informative) than a report of ordinary traffic congestion. One has a much lower probability of occurring than the other, and therefore its occurrence is informative.
Information, therefore, is inversely related to the probability of something happening, and has the same functional form as entropy, which describes the tendency of low-probability states to decay into more probable states (e.g., a clean room goes from organized to disorganized; the reverse is highly unlikely to happen on its own).
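Shannon's measure is easy to play with numerically. Here's a small Python sketch of my own (an illustration, not anything from the book): rare events carry more bits of information, and the "most disordered" distribution, the uniform one, has the highest average information, i.e., the highest entropy.

```python
import math

def information(p):
    """Shannon information ("surprisal") of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy: the average information content of a distribution, in bits."""
    return sum(p * information(p) for p in probs if p > 0)

# A rare event (a volcano grounding all European flights) is far more
# informative than a common one (ordinary traffic congestion).
print(information(0.0001))  # rare event: ~13.3 bits
print(information(0.5))     # common event: 1.0 bit

# A uniform ("disordered") distribution maximizes entropy, just as
# disordered physical states are the most probable ones.
print(entropy([0.25] * 4))                 # uniform over 4 outcomes: 2.0 bits
print(entropy([0.97, 0.01, 0.01, 0.01]))   # highly ordered: much lower
```

The probabilities here are made up for illustration; the point is only the functional form, -log p, shared by information and entropy.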
The information analogy finds obvious utility in genetics. Considering the phenotypic output of our genes, you’d guess that the genome is a highly ordered thing, existing in a highly improbable, low-entropy state. How the heck could something so ordered and useful come about naturally, when disorganization is the rule of the universe?
Enter entropy, manifesting as random mutations. Our cellular machinery employs considerable energy checking and repairing such mutations. Cancer is one consequence of the failure to do so. But at the same time, these mutations are key to natural selection: every so often, a beneficial mutation occurs and increases the heritable variation within a gene pool, thereby giving natural selection something to “select” upon. Entropy breeds variation! Think of it another way: it would be a highly improbable, extremely low-entropy affair if our cellular machines could replicate themselves perfectly and produce vast populations of identical individuals ad infinitum. Such a state of affairs is just begging for a lesson in thermodynamics.
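The "entropy breeds variation" idea can be made concrete with a toy simulation (my own sketch, with made-up genome length and mutation rate, not a model from the book): start with a perfectly uniform population and let imperfect copying do its thing.

```python
import random

random.seed(42)  # reproducible illustration

GENOME_LEN = 50
MUTATION_RATE = 0.01  # per-site copying error: entropy sneaking in

def replicate(genome):
    """Copy a genome, randomizing each site with a small probability."""
    return "".join(
        random.choice("ACGT") if random.random() < MUTATION_RATE else base
        for base in genome
    )

ancestor = "A" * GENOME_LEN
population = [ancestor] * 100  # 100 identical individuals: very low entropy

for generation in range(20):
    population = [replicate(g) for g in population]

# Imperfect copying has turned a uniform population into a varied one --
# the heritable variation that natural selection then acts on.
print(len(set(population)))  # many distinct genotypes, no longer 1
```

No selection is modeled here; the sketch only shows the entropy half of the argument, that perfect replication ad infinitum is the improbable state and variation is the default.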
(This “beneficial” variation comes with an immense amount of “junk” variation too. Previously, it was thought that only 4% of the human genome actually does anything useful. This number has recently been bumped up to 44%; nonetheless, that is still a lot of “junk” within our precious Codebook for Life.)
“Endless forms most beautiful,” as Darwin wrote, and that’s what we have to look forward to because of the degenerative tendency of the universe.