
Sunday 24 April 2016

Order, Order!

Entropy in Thermodynamics, Statistical Mechanics, Evolution and Information Theory.


One of the most common misconceptions, and possibly the one most difficult to properly treat, is that evolution is a violation of the law of entropy. In this essay, I want to provide a comprehensive treatment of entropy. This will not be easy in the word limit (nor in the time I have to write this), not least because it is a concept horribly misunderstood, often because of the way it's treated in popular science books by people who really should know better. By the time I am finished, one of two things will have happened. Either I will have simplified entropy to the state where it can be easily understood, or I will have demonstrated that I don't actually understand it myself.

So what is entropy? Let's begin with what it is not, and a quote from a paper specifically dealing with entropy in evolution:

Styer 2008 wrote: Disorder is a metaphor for entropy, not a definition for entropy.

Metaphors are valuable only when they are not identical in all respects to their targets. (For example, a map of Caracas is a metaphor for the surface of the Earth at Caracas, in that the map has a similar arrangement but a dissimilar scale. If the map had the same arrangement and scale as Caracas, it would be no easier to navigate using the map than it would be to navigate by going directly to Caracas and wandering the streets.) The metaphor of disorder for entropy is valuable and thus imperfect. For example, take some ice cubes out of your freezer, smash them, toss the shards into a bowl, and then allow the ice to melt. The jumble of ice shards certainly seems more disorderly than the bowl of smooth liquid water, yet the liquid water has the greater entropy. [1]


The problem is that there are different definitions of entropy from different branches of science and, while they are all related, they're not actually equivalent. In thermodynamics, entropy is a measure of how much energy in a system is unavailable to do work. In statistical mechanics, it's a measure of uncertainty, specifically the uncertainty of a system being in a particular configuration. This can loosely be described as the number of different ways a system could be reconfigured without changing its appearance. This latter is also related to information entropy or Shannon entropy.

So, beginning with statistical mechanics: In statistical mechanics, entropy is a measure of uncertainty or probability. If a system is in an improbable configuration, it is said to be in a state of low entropy.

The classic analogy employed here is the desktop strewn with bits of paper. You can move one piece of paper without appreciably altering the appearance of the desktop. Statistically speaking, this configuration, or one of the many configurations it could have while still remaining untidy, is more probable than one in which the desktop is tidy. Thus, it is in a state of high entropy.

This, of course, is where the idea of entropy as disorder actually comes from, and it does not reflect the definition of entropy employed in thermodynamics, although there is a relationship, as I shall show.

Moving on, let's deal with the definition of entropy in thermodynamics. This will require the laying of a little groundwork:

Firstly, let's look at what the Law of Entropy actually states, as even this is a source of confusion.

The Second Law of Thermodynamics states that, in general, the entropy of a system will not decrease, except by increasing the entropy of another system. [2]

This is an important point. It does not state that entropy will always increase, as many suggest, only that it will not, in general, decrease. There are no physical principles that prohibit the persistence of a thermodynamic state (except with due reference to the Uncertainty Principle, of course). Indeed, entropy in this instance can be defined as a tendency towards equilibrium, which is just such a state! This tendency can be measured quite simply as the amount of energy in a system that is unavailable to perform work.
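To put a number on 'energy unavailable to perform work', here is a minimal sketch in Python; the heat quantity and reservoir temperatures are invented for illustration, not drawn from any particular source:

```python
# Entropy bookkeeping for heat Q flowing from a hot reservoir to a cold one.
# Illustrative values only.
Q = 1000.0       # heat transferred, in joules
T_hot = 500.0    # hot reservoir temperature, in kelvin
T_cold = 300.0   # cold reservoir temperature, in kelvin

dS_hot = -Q / T_hot         # the hot reservoir loses entropy
dS_cold = Q / T_cold        # the cold reservoir gains more than the hot one lost
dS_total = dS_hot + dS_cold

# Energy rendered unavailable for work, taking the cold reservoir as the environment:
lost_work = T_cold * dS_total

print(f"Total entropy change: {dS_total:.3f} J/K")     # about +1.333 J/K
print(f"Energy made unavailable: {lost_work:.1f} J")   # about 400 J
```

Whatever illustrative numbers you choose (provided the heat flows from hot to cold), the total entropy change comes out positive: the hot reservoir's entropy decreased only by increasing the cold reservoir's entropy by a larger amount, exactly as the statement above requires.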

It is probably apposite at this point to deal with the three main classes of thermodynamic system. [3]

1. Open system: An open thermodynamic system is the easiest to understand. It is simply a system in which both energy and matter can be exchanged in both directions across the boundary of the system. Indeed, it is not stretching the point too much to state that an open system is one with no boundary. We define a boundary only because that defines the limits of the system, but from a thermodynamic perspective, the boundary is only a convenience of definition. This is distinct from the two other main classes of thermodynamic system, in which the boundary actually plays an important role in the operation of the system.

2. Closed system: A closed thermodynamic system is one in which heat and work may cross the boundary, but matter may not. This type of system is further divided based on the properties of the boundary. A closed system with an adiabatic boundary allows the exchange of work, but not heat, while a rigid boundary permits no work to be done on or by the system, but does allow heat exchange.

3. Isolated system: An isolated system is a theoretical construct that, apart from the universe itself, probably does not exist in reality. It is a system in which no heat, work or matter can be exchanged across the boundary in either direction. There are two important things to note from my statement of this. The first is that my usage of the word 'universe' is in line with my standard usage, and does not describe 'that which arose from the big bang', but 'that which is'. The second is that we know of no system from which gravity, for example, can be excluded, and since gravity can apparently cross all boundaries, there can be no such thing as an isolated system within our universe unless a barrier to gravity can be found, hence the statement that there is probably no such thing as an isolated system except the universe itself.

Now that that's out of the way, let's attempt a rigorous definition of entropy in a thermodynamic sense.

Entropy in a thermodynamic system can be defined in several ways, all of which are basically just implications of a single definition. Rigorously defined, it is simply a tendency toward equilibrium. This can be interpreted in a number of ways (the standard textbook relation behind these readings is given just after the list):

1. The number of configurations of a system that are equivalent.
2. A measure of the amount of energy in a system that is unavailable for performing work.
3. The tendency of all objects with a temperature above absolute zero to radiate energy.
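As a point of reference (not essential to the argument that follows), the standard textbook relation behind these readings is the Clausius relation: a quantity of heat \(\delta Q\) transferred reversibly at temperature \(T\) changes the entropy by

\[dS=\frac{\delta Q_{rev}}{T}\]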

Now, going back to our analogy of the untidy desktop, this can now be described as an open system, because heat, matter and work can be exchanged in both directions across its boundary. As stated before, this is a system in which statistical entropy is high, due to the high probability of its configuration when measured against other possible configurations (there are more configurations of the system that are untidy than there are configurations that are tidy). In other words, and in compliance with the first of our interpretations above, it is a system which has a high number of equivalent configurations, since there are many 'untidy' configurations of the system, but only a few 'tidy' configurations. To bring the desktop to a state of lower entropy, i.e. a tidy desktop, requires the input of work. This work will increase the entropy of another system (your maid, or whoever does the tidying in your office), giving an increase in entropy overall. This, of course, ties the two definitions from different areas of science together, showing the relationship between them. They are not, of course, the same definition, but they are related. It is also the source of the idea of entropy as disorder.
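To see the counting argument with actual numbers, here is a toy sketch in Python; the six papers and four possible positions per paper are invented purely for illustration:

```python
import math
from itertools import product

# Toy desktop: 6 papers, each either in its tray (position 0) or in one of
# three places on the desk (positions 1-3). "Tidy" means every paper is in
# its tray; every other arrangement counts as "untidy".
papers = 6
positions = 4

configs = list(product(range(positions), repeat=papers))
tidy = [c for c in configs if all(p == 0 for p in c)]
untidy = [c for c in configs if any(p != 0 for p in c)]

# Statistical entropy of each macrostate, in units of k_B: S = ln W, where W
# is the number of microscopic arrangements that count as the same macrostate.
print(f"Tidy arrangements:   {len(tidy)}    S/k_B = {math.log(len(tidy)):.2f}")
print(f"Untidy arrangements: {len(untidy)} S/k_B = {math.log(len(untidy)):.2f}")
```

With these made-up numbers there is exactly one tidy arrangement (entropy zero) against 4,095 untidy ones, which is all the 'untidy is more probable' claim amounts to.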

In evolution, the role of the maid is played by that big shiny yellow thing in the sky. The Earth is an open system, which means that heat, work and matter can be exchanged in both directions across its boundary. The energy allowing a local decrease in entropy is provided in two forms, but mainly as high-energy photons from the Sun. These allow photosynthesising organisms to extract carbon from the atmosphere in the form of CO2 and convert it into sugars. This is done at the expense of an increase in entropy of the Sun. Indeed, if Earth and Sun are thought of as a single system, then a local decrease in entropy via work input by one part of the system increases the entropy of the system overall.
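A back-of-the-envelope version of that bookkeeping, using assumed effective temperatures of roughly 5800 K for incoming sunlight and roughly 255 K for the infrared the Earth radiates back to space, might look like this:

```python
# Rough entropy budget for one joule passing through the Earth system.
# The temperatures are assumed round figures, not measurements.
T_sunlight = 5800.0   # effective temperature of incoming solar photons, K
T_infrared = 255.0    # effective temperature of outgoing infrared, K
Q = 1.0               # one joule absorbed and later re-radiated

S_in = Q / T_sunlight     # entropy arriving with the sunlight
S_out = Q / T_infrared    # entropy carried away by the outgoing infrared

print(f"Entropy in:  {S_in:.2e} J/K")
print(f"Entropy out: {S_out:.2e} J/K")
print(f"Net entropy exported: {S_out - S_in:.2e} J/K")
```

Every joule leaves carrying roughly twenty times more entropy than it arrived with, and that exported entropy is what pays for local decreases such as the building of sugars by photosynthesis.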

Now, a little word on information entropy:

In information theory, there is a parameter known as Shannon entropy, which is defined as the average uncertainty associated with a random variable. What this means, in real terms, is that the content of a message is most predictable when entropy is low. In other words, the entropy of a source is highest when all possible messages are equally likely, so that each message is maximally uncertain before it arrives. This shows a clear relationship between Shannon entropy and the definition of entropy from statistical mechanics, where we again have the definition of entropy as uncertainty, as formulated by Gibbs. A further relationship is shown when we compare the equations for entropy from statistical thermodynamics, formulated by Boltzmann and Gibbs in the 19th century, with Shannon's treatment of entropy in information. Indeed, it was actually the similarity of the equations that led Claude Shannon to call his 'reduction in certainty' entropy in the first place!

Shannon: \[H=-\sum_i p_{i}\,\log_{b} p_{i}\]


Where \(p_i\) is the probability of a given message \(i\) and \(b\) is the base of the logarithm employed.

Gibbs: \[S=-k_{B}\sum_i p_{i}\,\ln p_{i}\]

Where \(p_i\) is the probability of microstate \(i\) and \(k_B\) is Boltzmann's constant.

Note that where all thermodynamic states are equiprobable, the latter equation reduces to the Boltzmann equation:


\[S=k_{B}\,\ln W\]

Where W is the number of microstates.
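The reduction is just the substitution of \(p_i = 1/W\) for each of the \(W\) equiprobable microstates:

\[S=-k_{B}\sum_{i=1}^{W}\frac{1}{W}\,\ln\frac{1}{W}=k_{B}\,\ln W\]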

And where all message states are equiprobable, the former equation reduces to the equivalent equation:

\[H=\log_{b}|M|\]

Where |M| is the cardinality of the message space M.

The relationship here is clear, and the motivation for Shannon's labelling this parameter 'Entropy'.
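To make the parallel concrete, here is a small numerical sketch in Python; the four-message source and its probabilities are invented for illustration:

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum of p_i * log_b(p_i); zero-probability messages contribute nothing."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# An invented four-message source: the first message much more likely than the rest.
skewed = [0.7, 0.1, 0.1, 0.1]
uniform = [0.25, 0.25, 0.25, 0.25]

print(f"Skewed source:  H = {shannon_entropy(skewed):.3f} bits")    # about 1.357
print(f"Uniform source: H = {shannon_entropy(uniform):.3f} bits")   # exactly 2
print(f"log2 |M|        = {math.log2(len(uniform)):.3f} bits")      # matches the uniform case
```

Multiply the same sum, taken with natural logarithms, by \(k_B\) and you have the Gibbs expression; the structure is identical, which is the relationship being pointed to above.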

Now, a bit on what is the most highly entropic entity that is currently known in the cosmos, namely the black hole. In what sense is it highly entropic? Let's look at those definitions again:

1. The number of configurations of a system that are equivalent: Check. This can be restated as the number of internal states a system can possess without affecting its outward appearance.


2. A measure of the amount of energy in a system that is unavailable for doing work: Check. All the mass/energy is at the singularity, rendering it unavailable.


3. The tendency of all objects with a temperature above absolute zero to radiate energy: Check. The black hole does this by a very specialised mechanism, of course. Energy cannot literally escape across the boundary, because to do so would require that it travelled at greater than the escape velocity of the black hole, which is, as we all know, in excess of c. The mechanism by which it radiates is virtual particle pair production. When an electron/positron pair is produced, via the uncertainty principle, at the horizon of the black hole, one of several things can occur. Firstly, as elucidated by Dirac [4], the equation describing the electron admits solutions with positive charge and negative energy. Thus, a positron falling across the boundary imparts negative energy to the black hole, reducing the mass of the singularity* via \(E=mc^2\), while the electron escapes, in a phenomenon known as Hawking radiation, thus causing the black hole to eventually evaporate. This is Hawking's solution to the 'information paradox', but that's a topic for another time. Secondly, as described by Feynman [5], we can have a situation in which the electron crosses the boundary backwards in time, scatters, and then escapes and radiates away forwards in time. Indeed, it could be argued that Feynman, through this mechanism, predicted Hawking radiation before Hawking did!

Now, the classic example of a system is a collection of gas molecules gathered in the corner of a box, representing a highly ordered, low-entropy state. As the gas molecules disperse, entropy increases. But wait! As we have just seen, the black hole has all its mass/energy at the singularity, which means that the most highly entropic system in the cosmos is also one of the most highly ordered! How can this be? Simple: Entropy is not disorder.
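For scale, the entropy increase in that classic gas example is easy to put a number on; here is a sketch with an illustrative amount of gas (roughly a mole) doubling its volume:

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
N = 6.022e23         # about a mole of molecules, chosen purely for illustration

# Doubling the accessible volume doubles the positions available to each
# molecule, multiplying the number of equivalent configurations by 2**N,
# so the entropy rises by N * k_B * ln 2.
delta_S = N * k_B * math.log(2)

print(f"Entropy increase on spreading through the box: {delta_S:.2f} J/K")  # about 5.76 J/K
```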

Finally, just a few more words from the paper by Styer:

Styer 2008 wrote: This creationist argument also rests upon the misconception that evolution acts always to produce more complex organisms. In fact evolution acts to produce more highly adapted organisms, which might or might not be more complex than their ancestors, depending upon their environment. For example, most cave organisms and parasites are qualitatively simpler than their ancestors.
So, when you come across the canard that evolution violates the law of entropy, your first response should be to ask your opponent for a definition of entropy, as it is almost certain that a) he is using the wrong definition and/or b) he has no understanding of what entropy actually is.



References:
[1] Daniel F. Styer, 'Entropy and Evolution', American Journal of Physics 76 (11), November 2008
[2] http://www.upscale.utoronto.ca/PVB/Harr ... hermo.html
[3] http://en.wikipedia.org/wiki/Thermodynamic_system
[4] P. A. M. Dirac, 'The Quantum Theory of the Electron', 1928
[5] http://www.upscale.utoronto.ca/GeneralI ... atter.html

Further reading:
Simple Nature: Benjamin Crowell
http://www.upscale.utoronto.ca/GeneralI ... tropy.html


*Note that the existence of the singularity has not been established, but this doesn't materially affect any of the points made here.
