Temperature, energy and the change in entropy are quantities that appear in classical thermodynamics without any knowledge of the underlying structure of the ‘thing’ being studied (by ‘thing’ we mean anything from a box of gas to a galaxy of stars). Thanks to Boltzmann, we now understand that these quantities are intimately related to the constituents of the thing, how those constituents are arranged and how they share the total energy. Temperature, for example, tells us how fast the molecules in a box of gas are moving around on average. Similarly, entropy tells us about the number of possible internal configurations a thing can have. Boltzmann’s tombstone bears an inscription of his famous equation for the entropy of a system, which makes the connection with the component parts explicit:

S = kB log W

In this equation, W is the number of possible internal configurations and the entropy, denoted S, is proportional to the logarithm of W. Accordingly, larger W means larger entropy. The logarithm and Boltzmann’s constant kB are not important for what follows, other than to note that they allow us to put a precise number on the entropy that also agrees with Clausius’s definition in terms of energy and temperature. The important point is that W is the total number of different ways that the component parts of a system could be arranged in a manner consistent with what we know about the system. For atoms in a box, if the temperature is zero, there is only one way the atoms could be arranged, and so W = 1 and the entropy is zero.§ If the temperature is raised and some of the atoms hop into higher energy levels, there are more possible arrangements and so W is larger and the entropy is larger.
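To put some numbers on this, here is a minimal Python sketch of Boltzmann's relation; the function name and the example values of W are ours, chosen purely for illustration.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in joules per kelvin

def boltzmann_entropy(num_configurations):
    """Entropy S = kB log W for a system with W equally likely configurations."""
    return K_B * math.log(num_configurations)

# A single possible arrangement (a gas at absolute zero) means zero entropy;
# more possible arrangements mean more entropy. The values of W are examples.
for w in (1, 10, 10**6):
    print(f"W = {w:>9}  ->  S = {boltzmann_entropy(w):.3e} J/K")
```

The logarithm grows slowly, which is why even enormous values of W give entropies of a comfortable size once multiplied by the tiny constant kB.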

For a gas in a room, the component parts are atoms or molecules and the things we know about the system might be the volume of the room, the total weight of the gas inside and the temperature. Computing the entropy is then an exercise in counting the different ways the atoms could be arranged inside the room given what we know. One possible arrangement would be that all the atoms, bar one, are sitting still in one corner of the room, while a single lone atom carries almost all the energy. Or maybe the atoms share out the energy equally and are distributed uniformly around the room. And so on. Crucially, there are vastly more ways to arrange the atoms in the room such that the atoms are spread out across the room and share the energy reasonably evenly between them, compared to arrangements where all the atoms are in one corner or the energy is distributed very unevenly. Boltzmann understood that if the energy in the room is allowed to get shuffled around among the atoms because the atoms collide, then all the different arrangements will be more-or-less equally likely. Given that insight and given the numerical dominance of arrangements where the atoms are scattered all over the room, it follows that if we are in a ‘typical’ room then we are very likely to find the atoms distributed in a roughly uniform fashion. When everything has settled down and things are evenly distributed, we say that the system is in thermodynamic equilibrium. The entropy is then as big as it can be, and every region of the room is at the same temperature.
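The scale of that numerical dominance is easy to check in a toy model. In the Python sketch below we imagine each of just 100 atoms sitting in either the left or the right half of the room (a drastic simplification of ours, used only to do the counting) and compare the number of arrangements with everything on one side to the number with an even split.

```python
from math import comb

N = 100  # a very modest number of atoms, for illustration only

# comb(N, k) counts the ways of choosing which k atoms sit in the left half.
all_on_one_side = comb(N, 0)      # every atom on the right: 1 arrangement
even_split = comb(N, N // 2)      # 50 atoms on each side

print(f"all on one side : {all_on_one_side}")
print(f"even 50/50 split: {even_split:.3e}")
```

Even with only a hundred atoms the even split wins by a factor of around 10^29, and a real room contains roughly 10^27 molecules, so the disparity there is beyond astronomical.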

Now, here is why the Second Law embodies the idea of change. If we begin with a system far from equilibrium, which is to say that the component parts are distributed in an unusual way, then as long as the components can interact with each other and share their energy, the system will inexorably head towards equilibrium because that’s the most likely thing to happen. We can now appreciate why Maxwell’s observation that the Second Law has a statistical element to it was so insightful. The Second Law deals, ultimately, with what is more or less likely, and it is far more likely that a system will head towards thermodynamic equilibrium because there are so many more ways for it to be in thermodynamic equilibrium.

This ‘one way’ evolution of a system is often called the thermodynamic arrow of time because it draws a sharp distinction between the past and the future: the past is more ordered than the future. In our Universe as a whole, the arrow of time can be traced back to the mysterious highly-ordered, low-entropy state of the Big Bang.

Entropy and information

Suppose we know the precise details of every atom in a room and choose to think in these terms rather than in terms of the volume, weight of gas and temperature. Then the entropy would be zero because we know the configuration exactly. This means that an omniscient being has no need for entropy. For mortal beings and physicists, however, the vast numbers of atoms in rooms and other large objects make keeping track of their individual motions impossible and, as a result, entropy is a very useful concept. Entropy is telling us about the amount of information that is hidden from us when we describe a system in terms of just a few numbers. Seen in this way, entropy is a measure of our lack of knowledge, our ignorance. The connection between entropy and information was made explicit in 1948 by Claude Shannon in one of the foundational works in what is now known as information theory, which is central to modern computing and communications technology.

Returning to our gas-filled room, there will be many possible configurations of the atoms inside that are consistent with our measurements of the volume, weight and temperature. The logarithm of that number, multiplied by Boltzmann’s constant, is the entropy. Importantly, though, the atoms are actually in a particular configuration at some moment of time. We just don’t know what it is. Let’s imagine that we make a measurement and determine precisely which configuration the atoms are in. What have we learnt? To be more specific, exactly how much information have we gained? Following Shannon, the amount of information gained is defined to be the minimum number of binary digits (bits) required to distinguish the measured configuration from all the other possible configurations. Imagine, for example, that there are only four possible configurations. In binary code, we would label those configurations as 00, 01, 10 and 11. That means we gain two bits of information when we make the measurement. If there are eight possible configurations, we would label them 000, 001, 010, 011, 100, 101, 110 and 111. That’s three bits. And so on. If there were a million possible configurations, it would take us some time to write all the combinations out by hand, but we don’t need to, because there is a simple formula that tells us how many bits we’d need. If the number of configurations is W, the number of bits is:

N = log2 W
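Anyone who wants to check the counting can do it in a couple of lines of Python; the helper name bits_needed is ours, and the rounding up simply handles numbers of configurations that are not exact powers of two.

```python
import math

def bits_needed(num_configurations):
    """Minimum number of binary digits needed to label every configuration."""
    return math.ceil(math.log2(num_configurations))

# The example values of W match the ones discussed in the text.
for w in (4, 8, 1_000_000):
    print(f"W = {w:>9,}  ->  {bits_needed(w)} bits")
# W = 4 -> 2 bits, W = 8 -> 3 bits, W = 1,000,000 -> 20 bits
```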

This formula for the number of bits is very similar to Boltzmann’s formula for the entropy of the box of gas. If you know a little mathematics, you’ll notice that the logarithm is now in base 2 rather than the natural logarithm in Boltzmann’s formula, but that just leads to an overall numerical factor.¶ The key point is that the information gained in our measurement of the precise state of the gas is directly proportional to the entropy of the gas before the measurement. Specifically:

S = kB ln 2 × N
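As a quick sanity check on that proportionality, the short snippet below (with an arbitrary example value of W) confirms that Boltzmann’s entropy is just Shannon’s bit count multiplied by the constant factor kB ln 2.

```python
import math

K_B = 1.380649e-23        # Boltzmann's constant, joules per kelvin

W = 10**6                 # any example number of configurations will do
S = K_B * math.log(W)     # Boltzmann: S = kB ln W
N = math.log2(W)          # Shannon: N = log2 W bits of hidden information

print(math.isclose(S, K_B * math.log(2) * N))   # True: S = kB ln 2 x N
```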

This is the key to understanding the fundamental importance of entropy. Entropy tells us about the internal structure of a thing – it is intimately related to the amount of information that the thing can store and as such it is intimately linked to the fundamental building blocks of the world. It is a window into the underlying structure of cups of tea, steam engines and stars. And, if we follow Bekenstein and associate an entropy with the area of the event horizon of a black hole, it is a window into the underlying structure of space and time.

The entropy of a black hole

The immediate issue is that black holes, as we have described them so far, have no component parts. They are pure spacetime geometry and quite featureless. Superficially, therefore, a black hole would appear to have zero entropy. Throw a couple of cups of tea into a black hole and its mass will increase, but that’s all, and so the entropy should still be zero. This was Wheeler’s point. To save the Second Law, in true Eddingtonian spirit, Bekenstein guessed that black holes must have an entropy, and that it must be proportional to the area of the horizon. Bekenstein did more than just guess, though. In an ingenious back-of-the-envelope calculation, he also estimated the numerical value of the entropy of a black hole and discovered something very deep.