Sunday, June 15, 2025

Explaining life in terms of the statistical theory of entropy

"How would we express in terms of the statistical theory the marvellous faculty of a living organism, by which it delays the decay into thermodynamical equilibrium (death)? We said before: 'It feeds upon negative entropy', attracting, as it were, a stream of negative entropy upon itself, to compensate the entropy increase it produces by living and thus to maintain itself on a stationary and fairly low entropy level.

"If D is a measure of disorder, its reciprocal, 1/D, can be regarded as a direct measure of order. Since the logarithm of 1/D is just minus the logarithm of D, we can write Boltzmann's equation thus:

- (entropy) = k log (1/D)

Hence the awkward expression 'negative entropy' can be replaced by a better one: entropy, taken with the negative sign, is itself a measure of order. Thus the device by which an organism maintains itself stationary at a fairly high level of orderliness ( = fairly low level of entropy) really consists in continually sucking orderliness from its environment. This conclusion is less paradoxical than it appears at first sight. Rather could it be blamed for triviality. Indeed, in the case of higher animals we know the kind of orderliness they feed upon well enough, viz. the extremely well-ordered state of matter in more or less complicated organic compounds, which serve them as food stuffs. After utilizing it they return it in a very much degraded form — not entirely degraded, however, for plants can still make use of it. (These, of course, have their most powerful supply of 'negative entropy' in the sunlight.)" (p. 73)

from Erwin Schrödinger's "What is Life?"
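The logarithmic identity the quoted equation rests on is easy to verify numerically. Below is a minimal sketch (the symbols k and D follow the passage; the value chosen for D is purely illustrative), showing that k log(1/D) is exactly the entropy taken with the negative sign:

```python
import math

k = 1.380649e-23  # Boltzmann constant in J/K

# D: a measure of disorder (arbitrary illustrative value)
D = 1.0e6

entropy = k * math.log(D)            # Boltzmann's relation: entropy = k log D
order_measure = k * math.log(1 / D)  # Schrödinger's rewriting: k log(1/D)

# Since log(1/D) = -log(D), the two differ only in sign
# (up to floating-point rounding).
print(math.isclose(order_measure, -entropy))
```

This is only the algebraic point of the passage: calling −(entropy) a "measure of order" adds no new physics, merely a change of sign and viewpoint.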
