
The concept of entropy proved extremely useful for nineteenth-century steam engine designers because it allowed them to understand that the efficiency of a steam engine depends on the temperature difference between the furnace and the environment. If there is no temperature difference, no net energy can be transferred, and no work can be done. A larger temperature difference allows more work to be done because it allows for more energy to flow without violating the Second Law. But nowhere in the elegant logical edifice constructed by Joule, Clausius and many others, now known as classical thermodynamics, is there any mention of what entropy actually is; it’s just a very useful quantity.
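The temperature dependence described here is captured quantitatively by Carnot's famous result from classical thermodynamics (not quoted in the text itself): the maximum fraction of the heat drawn from the furnace, at temperature T_hot, that any engine can convert into work while rejecting waste heat to the environment, at T_cold, is

```latex
\eta_{\max} = 1 - \frac{T_{\text{cold}}}{T_{\text{hot}}}
```

with temperatures measured in kelvin. When T_cold equals T_hot the efficiency is zero and no work can be done; the bigger the gap, the closer the efficiency can approach one.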

What is entropy?

When John Wheeler brought his hot and cold teacups into contact, he was worried about increasing the disorder of the Universe. The link between entropy and disorder was appreciated by James Clerk Maxwell, whose work on electromagnetic theory led Einstein to special relativity. Maxwell realised that the Second Law is different to the other laws of Nature known to nineteenth-century physicists in that it is inherently statistical. In 1870, he wrote: ‘The 2nd law of thermodynamics has the same degree of truth as the statement that if you throw a tumblerful of water into the sea you cannot get the same tumblerful of water out of the sea again.’28

In 1877, Ludwig Boltzmann reinforced this idea with a brilliant new insight. Boltzmann understood that entropy puts a number on ignorance; in particular, on our ignorance of the exact configuration of the component parts of a system. Take Maxwell’s tumblerful of water. Before throwing it into the sea, we know that all the water molecules are in the tumbler. Afterwards, we have far less idea where they are, and the entropy of the system has increased. This idea is powerfully general – things that shuffle and jiggle will, if left alone, tend to mix and disperse, and our ignorance increases as a result.
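Boltzmann's insight is usually summarised by the formula engraved on his tombstone, which relates the entropy S of a system to the number W of microscopic arrangements of its component parts consistent with what we know about it (k_B is a constant of Nature now known as Boltzmann's constant):

```latex
S = k_{\mathrm{B}} \ln W
```

The more arrangements the molecules could be in, the greater our ignorance of which arrangement is actually realised, and the larger the entropy.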

Boltzmann’s insight connects Clausius’s definition of entropy, based on temperature and energy, with the arrangement of the internal constituents of a system. The resulting methodology, which treats matter as being built up out of component parts about which we have limited knowledge, is a branch of physics known as statistical mechanics. The subject is a challenging one, both technically and philosophically. David L. Goodstein begins his textbook States of Matter with the following paragraph: ‘Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study Statistical Mechanics.’29

A nice way to see the connection between entropy, temperature and the arrangement of the constituents of a system is to think about a particularly simple physical system – a collection of atoms in a box. The behaviour of atoms is the province of quantum theory, which we’ll explore in more detail later. For now, we need only one idea: atoms confined inside a box can only have certain specific energies. We say that the system has a discrete set of ‘energy levels’. This is where quantum mechanics gets its name: ‘quantised’ means ‘discrete’, as in a discrete set of energies. The lowest possible energy an atom can have is known as the ground state. If all the atoms are in the ground state, the temperature of the box of atoms is zero kelvin (−273 degrees Celsius). If energy is added, some of the atoms will move to higher energy levels. The parameter that determines how the atoms are distributed among the available energy levels is the temperature. The higher the temperature, the higher up the ladder of available energy levels the atoms can climb (as illustrated in Figure 9.2). The details of how much energy must be transferred into the box to change the configuration of the atoms depend on the types of atom present and the size of the box, but the key point is that there exists a single quantity – the temperature – which tells us how the atoms are most likely to be arranged across the allowed energy levels.
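How the atoms spread across the ladder of levels is governed by the Boltzmann distribution from statistical mechanics. Here is a minimal sketch, assuming an invented ladder of four equally spaced energy levels and units in which Boltzmann's constant equals one (both choices are illustrative, not from the text):

```python
import math

def occupation_probabilities(energies, temperature):
    """Probability of finding an atom in each energy level at a given
    temperature, via the Boltzmann distribution (units with k_B = 1)."""
    if temperature == 0:
        # At zero temperature every atom sits in the ground state.
        ground = min(energies)
        return [1.0 if e == ground else 0.0 for e in energies]
    weights = [math.exp(-e / temperature) for e in energies]
    total = sum(weights)
    return [w / total for w in weights]

# An illustrative 'ladder' of four equally spaced allowed energies
levels = [0.0, 1.0, 2.0, 3.0]

cold = occupation_probabilities(levels, 0.5)
hot = occupation_probabilities(levels, 5.0)
# At low temperature the ground state dominates; at high temperature
# the atoms spread more evenly up the ladder.
```

The single number fed into the function – the temperature – fixes the whole distribution, which is exactly the point made above.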

Figure 9.2. Atoms occupying the energy levels of a box of atoms. At zero temperature (on the left) all the atoms are in the lowest energy level (the ground state). As the temperature increases (from left to right), atoms increasingly occupy higher energy levels.

Let’s now imagine placing a different box of different atoms in contact with our original box. The details of the energy levels will be different, but what’s important is that if the two boxes are at the same temperature, no net energy will be transferred between the boxes and the internal configurations will not change in a discernible way. This is the meaning of the nineteenth-century concept of temperature. To put it another way, if we place two systems in contact such that energy can be exchanged and nothing happens overall, then the two systems are at the same temperature. This is known as the Zeroth Law of Thermodynamics, because it was an afterthought. The Zeroth Law was always an essential part of the logical structure of classical thermodynamics because it is necessary to pin down the concept of temperature, but it wasn’t designated as a law until the early twentieth century, by which time everybody had got so used to speaking of the First and Second Laws of Thermodynamics that they didn’t want to change.

Richard Feynman came up with a nice analogy for temperature in his book The Character of Physical Law.30 Imagine sitting on a beach as the clouds sweep in off the ocean and it begins to rain. You grab your towels and rush into a beach hut. The towels are wet, but not as wet as you, and so you can start to dry yourself. You get drier until every towel is as wet as you, at which point there is no way of removing any more water. You could explain this by inventing a quantity called ‘ease of removing water’ and say that you and the towels all have the same value of that quantity. This doesn’t mean that everything contains the same amount of water. A big towel will contain more water than a small towel, but because they all have the same ‘ease of removing water’, there can be no net transfer of water between them. The reason why an object has a particular ‘ease of removing water’ is complicated and related to its internal atomic structure, but we don’t need to know the details if all we’re concerned about is getting dry. The analogy with thermodynamics is that the amount of water is the energy, and the ‘ease of removing water’ is the temperature. When we say that two objects have the same temperature, we don’t mean that they have the same energy. We mean that if we place them in contact their atoms or molecules will jiggle around and collide, just as the molecules in Joule’s paddle collided with molecules in the water and imparted energy to them, but if the objects are at the same temperature, the net transfer of energy will be zero and nothing will change on average.

Now recall Wheeler’s description of entropy: ‘Whatever is composed of the fewest number of units arranged in the most orderly way … has the least entropy.’ What does ‘order’ mean? Imagine we decide to select an atom at random from the box and ask: Which energy level did that atom come from? At zero temperature, we know the answer. The atom came from the ground state. The entropy in this case is zero.‡ This is what Wheeler means by ‘units’ being arranged in an orderly way. We know exactly what we are going to get when we pull an atom out of the box; we are not in the least bit ignorant. If we raise the temperature, the atoms will spread out across the available energy levels, and if we now select an atom at random, we can’t be sure which energy level it will come from. The atom could come from the ground state, or from one of the higher energy levels. This means our ignorance has increased as a result of raising the temperature. Equivalently, the entropy is larger, and continues to rise with increasing temperature as the atoms become more distributed among the allowed energy levels.
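This link between ignorance and entropy can be made quantitative using the Gibbs entropy formula, S = −Σ p ln p, applied to the same illustrative ladder of levels as before (again with k_B = 1; the level spacing and temperatures are invented for the sketch, not taken from the text):

```python
import math

def boltzmann_probs(energies, temperature):
    """Boltzmann occupation probabilities for each level (k_B = 1)."""
    weights = [math.exp(-e / temperature) for e in energies]
    total = sum(weights)
    return [w / total for w in weights]

def gibbs_entropy(probs):
    """Gibbs entropy S = -sum p ln p: a measure of our ignorance about
    which level a randomly selected atom came from."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# An illustrative ladder of four equally spaced allowed energies
levels = [0.0, 1.0, 2.0, 3.0]

entropies = [gibbs_entropy(boltzmann_probs(levels, T))
             for T in (0.1, 1.0, 10.0)]
# The entropy grows with temperature as the atoms spread across the
# levels, approaching ln(4) when all four levels are equally likely.
```

At very low temperature almost all the probability sits in the ground state and the entropy is close to zero, matching Wheeler's ‘most orderly’ arrangement; as the temperature rises, the distribution spreads and our ignorance, measured by S, grows.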