"What is entropy? Let me first emphasize that it is not a hazy concept or idea, but a measurable physical quantity just like the length of a rod, the temperature at any given point of a body, the heat of fusion of a given crystal or the specific heat of any given substance. At the absolute zero point of temperature (roughly −273°C) the entropy of any substance is zero. When you bring the substance into any other state by slow, reversible little steps (even if thereby the substance changes its physical or chemical nature or splits into two or more parts of different physical or chemical nature) the entropy increases by an amount which is computed by dividing every little portion of heat you had to supply in the procedure by the absolute temperature at which it was applied — and by summing up all the small contributions" (p. 71).
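Schrödinger's prescription — divide each little portion of supplied heat by the absolute temperature at which it was applied, then sum the contributions — can be sketched in a few lines of Python. The heat increments and temperatures below are purely hypothetical values chosen for illustration:

```python
# Entropy change along a slow, reversible path, per Schrödinger's
# description: sum delta_Q / T over many small heating steps.
# Each pair is (heat supplied in calories, absolute temperature in kelvin);
# the numbers are hypothetical, just to show the summation.
steps = [(10.0, 100.0), (10.0, 150.0), (10.0, 200.0)]

entropy_change = sum(dq / t for dq, t in steps)  # result in cal/K
print(entropy_change)
```

In the limit of ever-smaller steps this sum becomes the integral of dQ/T, which is the textbook definition of the entropy change along a reversible path.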
"Much more important for us here is the bearing on the statistical concept of order and disorder, a connection that was revealed by the investigations of Boltzmann and Gibbs in statistical physics. This too is an exact quantitative connection, and is expressed by
entropy = k log D,
where k is the so-called Boltzmann constant ( = 3.2983×10⁻²⁴ cal./°C) and D a quantitative measure of the atomistic disorder of the body in question. To give an exact explanation of this quantity D in brief non-technical terms is well-nigh impossible. The disorder it indicates is partly that of heat motion, partly that which consists in different kinds of atoms or molecules being mixed at random, instead of being neatly separated ..." (p. 72).
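Boltzmann's relation as quoted, S = k log D, is easy to evaluate numerically once a value of D is chosen. The sketch below uses the constant exactly as given in the passage (in cal./°C); the value of D passed in is a hypothetical placeholder, since, as Schrödinger notes, pinning down D for a real body is the hard part:

```python
import math

# Boltzmann constant in cal/K, using the value quoted in the passage.
K_CAL = 3.2983e-24

def entropy(disorder):
    # Boltzmann's relation S = k log D, with a natural logarithm.
    # 'disorder' (D) is a hypothetical input; computing it for a real
    # substance requires the full statistical-mechanical machinery.
    return K_CAL * math.log(disorder)

# For D = e the logarithm is 1, so S comes out equal to k itself.
print(entropy(math.e))
```

Note how tiny k is: D must be astronomically large before S reaches everyday magnitudes, which reflects the enormous number of microstates behind any macroscopic body.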
From Erwin Schrödinger's "What is Life?"