
The popular literature is littered with articles, papers, books, and various & sundry other sources, filled to overflowing with prosaic explanations of entropy. But it should be remembered that entropy, an idea born from classical thermodynamics, is a quantitative entity, and not a qualitative one. That means that entropy is not something that is fundamentally intuitive, but something that is fundamentally defined via an equation, via mathematics applied to physics. Remember, in your various travails, that entropy is what the equations define it to be. There is no such thing as an "entropy" without an equation that defines it.

Entropy was born as a state variable in classical thermodynamics. But the advent of statistical mechanics in the late 1800s created a new look for entropy. It did not take long for Claude Shannon to borrow the Boltzmann-Gibbs formulation of entropy for use in his own work, inventing much of what we now call information theory. My goal here is to show how entropy works, in all of these cases, not as some fuzzy, ill-defined concept, but rather as a clearly defined, mathematical & physical quantity, with well understood applications.
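For reference, the Boltzmann-Gibbs formulation that Shannon borrowed, and its information-theoretic counterpart, share one mathematical shape. Writing p_i for the probability of the i-th microstate (or, in Shannon's setting, of the i-th message symbol) and k_B for Boltzmann's constant, they read

S = -k_B Σ_i p_i ln(p_i)        (Boltzmann-Gibbs entropy)
H = -Σ_i p_i log2(p_i)          (Shannon entropy, in bits)

For now, though, the subject is the older, classical definition.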

Classical thermodynamics developed during the 19th century, its primary architects being Sadi Carnot, Rudolph Clausius, Benoit Clapeyron, James Clerk Maxwell, and William Thomson (Lord Kelvin). But it was Clausius who first explicitly advanced the idea of entropy (On Different Forms of the Fundamental Equations of the Mechanical Theory of Heat, 1865; The Mechanical Theory of Heat, 1867). The concept was expanded upon by Maxwell (Theory of Heat, Longmans, Green & Co.). The specific definition, which comes from Clausius, is as shown in equation 1 below.

S = Q / T    (equation 1)

In equation 1, S is the entropy, Q is the heat content of the system, and T is the temperature of the system. At this time, the idea of a gas being made up of tiny molecules, with temperature representing their average kinetic energy, had not yet appeared. Carnot & Clausius thought of heat as a kind of fluid, a conserved quantity that moved from one system to the other. It was Thomson who seems to have been the first to explicitly recognize that this could not be the case, because it was inconsistent with the manner in which mechanical work could be converted into heat. Later in the 19th century, the molecular theory became predominant, mostly due to Maxwell, Thomson, and Ludwig Boltzmann, but we will cover that story later. Suffice it for now to point out that what they called heat content, we would now more commonly call the internal heat energy.
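As a purely illustrative example (the numbers are invented for the sake of the arithmetic, not drawn from any measurement): a system holding Q = 1200 J of heat energy at a uniform temperature of T = 300 K has, by equation 1, an entropy of

S = 1200 J / 300 K = 4 J/K

which makes the units explicit: classical entropy is always an energy divided by a temperature.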


The temperature of the system is an explicit part of this classical definition of entropy, and a system can only have "a" temperature (as opposed to several simultaneous temperatures) if it is in thermodynamic equilibrium. So entropy, in classical thermodynamics, is defined only for systems which are in thermodynamic equilibrium. As long as the temperature is therefore a constant, it is a simple enough exercise to differentiate equation 1 and arrive at equation 2.

dS = dQ / T    (equation 2)

It is evident from our experience that ice melts, iron rusts, and gases mix together, and the entropic quantity we have defined is very useful in determining whether a given reaction will occur. Any apparent discrepancy in the entropy change between an irreversible and a reversible process becomes clear when the changes in entropy of both the system and its surroundings are considered, as described by the second law of thermodynamics.
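Stated compactly, writing ΔS(system) and ΔS(surroundings) for those two entropy changes (notation introduced here only for this statement), the second law requires

ΔS(system) + ΔS(surroundings) ≥ 0

with equality holding only for a reversible process; for an irreversible process such as the melting of ice or the rusting of iron, the total is strictly positive.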
