
Heat capacity in bits - AJP
Absolute T may be expressed as the rate of energy increase per unit increase in state uncertainty, but what does that mean for heat capacity's units?
P. Fraundorf

Since humans' discovery, nearly a million years ago, of uses for man-made fire, "hot" has been one of the first concepts whose meaning our children learn. Still, most of us are quite happy measuring temperature T with historically named units like "Fahrenheit" or "Celsius", even though we now know that reciprocal temperature (AKA "coldness" β ≡ 1/kT) is most generally a measure (e.g. in GB/nJ) of the correlation information lost (to the outside world) about a system's state, per unit of heat energy added. Hence natural units for temperature are "energy per unit information".
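To make the GB/nJ figure concrete, here is a minimal sketch (the 300 K room temperature is an assumed example, not from the text) converting coldness β = 1/kT from nats per joule into bits per joule and gigabytes per nanojoule:

```python
# Sketch: coldness beta = 1/kT expressed as information per unit energy,
# at an assumed room temperature of 300 K.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # assumed room temperature, K

# Coldness in nats per joule, then converted to bits per joule.
beta_nats_per_J = 1.0 / (k_B * T)
beta_bits_per_J = beta_nats_per_J / math.log(2)

# The same quantity in gigabytes per nanojoule (1 GB = 8e9 bits).
beta_GB_per_nJ = beta_bits_per_J / 8e9 * 1e-9

print(f"beta ~ {beta_bits_per_J:.2e} bits/J ~ {beta_GB_per_nJ:.0f} GB/nJ")
```

At room temperature this comes out to roughly 3.5 × 10²⁰ bits per joule, i.e. a few tens of gigabytes of lost state information per nanojoule of heat added, consistent with quoting coldness in GB/nJ.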
Entropy-first approaches to thermal physics, which have taken over senior undergraduate classes, provide insight into the assumptions that underpin the ideal gas law, equipartition, and chemistry's law of mass action. They moreover allow extension of thermal concepts to inverted-population (negative absolute temperature) states, and correlation-first approaches extend them further to layered complexity. Alas, introductory physics courses still do "entropy last". How might we change that?
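In the same spirit, the title's "heat capacity in bits" can be illustrated with a hedged sketch (the monatomic-gas and one-mole choices are assumptions for illustration): since C = T dS/dT = dS/d(ln T), dividing an equipartition heat capacity C_V = (f/2)Nk by k ln 2 re-expresses it as bits of state uncertainty gained per e-fold increase in temperature.

```python
# Sketch: equipartition heat capacity of an assumed one mole of
# monatomic ideal gas, re-expressed in bits per e-fold of temperature,
# using C/k = dS/d(ln T) in nats.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

f = 3                # translational degrees of freedom per atom (monatomic)
N = N_A              # assumed: one mole of atoms

C_V = 0.5 * f * N * k_B                  # heat capacity, J/K
bits_per_efold = C_V / (k_B * math.log(2))

print(f"C_V = {C_V:.2f} J/K = {bits_per_efold:.2e} bits per e-fold of T")
```

Stripped of the historical unit, the mole's heat capacity is simply about 1.3 × 10²⁴ bits per e-fold of temperature, a dimensionless-information reading that the energy-per-unit-information view of T makes natural.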
Colleague & friend Ed Jaynes, among others, uncovered a deep distinction between the "describing the dice" and the "taking the best guess" parts of thermal physics, expanding the applicability of J. Willard Gibbs's powerful ensemble methods to statistical inference in general, and to the physical relationship between organisms and codes of all sorts (both molecular and idea-based). Here we just explore a few of the leads thereby uncovered.
