
Entropy: Second law of thermodynamics


Entropy is a measure of how energy is distributed through a system. As energy becomes more dispersed or more evenly distributed in a system, less of that energy remains available for mechanical work, and entropy increases.
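
To make the link between energy dispersal and lost work capacity concrete, the sketch below (an illustration added here, with arbitrary example values, not part of the original entry) applies the textbook relation dS = Q/T to heat flowing from a hot body to a cold one: the energy spreads out, and the total entropy rises.

```python
# Illustrative example (values assumed, not from the entry): entropy change when
# heat Q flows from a hot body to a cold one, using dS = Q/T for each body.
Q = 1000.0       # joules of heat transferred (assumed value)
T_hot = 500.0    # kelvin, hot body (assumed value)
T_cold = 300.0   # kelvin, cold body (assumed value)

dS_hot = -Q / T_hot      # the hot body loses entropy
dS_cold = Q / T_cold     # the cold body gains more entropy than the hot body lost
dS_total = dS_hot + dS_cold

print(f"hot body:  {dS_hot:+.2f} J/K")
print(f"cold body: {dS_cold:+.2f} J/K")
print(f"total:     {dS_total:+.2f} J/K  (positive: the energy has dispersed)")
```

The same joules are still present afterwards; they are simply distributed at a lower temperature, where less of them can be converted into work.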

Natural processes always move towards an increase in disorder, which is measured by entropy. The second law of thermodynamics, which states that the entropy of an isolated system can never decrease, is thus an argument for the irreversibility of time.

In statistical mechanics, Ludwig Boltzmann expressed entropy as S = k ln P, where S is entropy, k is Boltzmann's constant, and ln P is the natural logarithm of the probability P. Another term for P is "complexions": the number of possible ways a system can be arranged to yield a specific state. The less probable a state is, the lower its entropy; an increase in entropy therefore corresponds to movement towards states of greater probability.
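
As a rough illustration of S = k ln P (a toy example added here, not from the original entry), the sketch below counts the complexions of N distinguishable particles split between the two halves of a box and evaluates Boltzmann's formula for a few macrostates; the even split has by far the most complexions and hence the highest entropy.

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K

def complexions(n_total, n_left):
    """Number of ways ("complexions") to put n_left of n_total
    distinguishable particles in the left half of a box."""
    return math.comb(n_total, n_left)

def boltzmann_entropy(P):
    """S = k ln P, with P the number of complexions of the macrostate."""
    return k_B * math.log(P)

N = 100
for n_left in (0, 25, 50):   # from all particles on one side to an even split
    P = complexions(N, n_left)
    print(f"{n_left:3d} of {N} on the left:  P = {P:.3e},  S = {boltzmann_entropy(P):.3e} J/K")
```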

The "complexions" can be defined as volumes in phase space. If each point (or small volume) describes a complete and specific state of the system, more probable conditions that can be achieved in more ways will define larger volumes in the phase space than less probable ones that can be achieved in fewer ways. The statistical idea of entropy depends upon the hypothesis that the system's trajectory will wander aimlessly, or ergodically, in phase space. Therefore, over long periods of time, the system is more likely to be located in a larger volume than in a smaller one. 

Entropy is usually applied as a statistical concept to an isolated or closed system. It does not apply to individual elements, or microstates, within the system but describes the macroscopic state. While entropy cannot be transported directly, it undergoes a virtual transport in the exchanges of matter or energy across the boundaries of systems. For example, the earth's capacity to support organic evolution can be seen as a result of its capacity to discard entropy into outer space by its release of heat through water condensation (see Yates, introduction).
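
A back-of-the-envelope version of that entropy-discarding budget, simplified here to radiation alone and using assumed round-number temperatures rather than anything from the source, treats each energy stream as heat Q at a characteristic temperature T:

```python
T_sun = 5800.0    # K, characteristic temperature of incoming sunlight (assumed)
T_earth = 255.0   # K, effective temperature at which the Earth radiates to space (assumed)
Q = 1.0           # per joule of energy flowing through the system

S_in = Q / T_sun      # entropy arriving with the sunlight
S_out = Q / T_earth   # entropy leaving with the Earth's infrared radiation

print(f"entropy imported:  {S_in:.5f} J/K per joule")
print(f"entropy exported:  {S_out:.5f} J/K per joule")
print(f"net entropy discarded to space: {S_out - S_in:.5f} J/K per joule")
```

Because the same energy leaves at a much lower temperature than it arrives, each joule carries away far more entropy than it brought in, which is what allows order to accumulate at the surface.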

In an open system in steady state there is no net increase in entropy: the entropy produced inside the system is balanced by the entropy it exports across its boundaries, so the total remains constant.

K-flows, named after the Russian mathematician Andrei Kolmogorov, are a measure of chaos. K-entropy is a measure of the average rate at which trajectories starting from points extremely close together move apart. In a chaotic system an initial deviation soon grows as large as the true "signal" itself. Calculators or computers that round off numbers, to no matter how many digits, therefore rapidly accumulate errors when iterating equations whose results both demand ever more digits and are sensitive to infinitesimally small differences in the numbers. In these cases K is positive but not infinite; it is infinite only for totally random paths.
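
A standard way to see this divergence rate is to follow two trajectories that start almost on top of one another and watch their separation grow exponentially; the average logarithmic growth rate per step plays the role of the K-entropy. The chaotic system used in the sketch below, the logistic map, is my choice of example rather than the entry's.

```python
import math

r = 4.0                       # logistic map parameter in its chaotic regime (assumed example)
x, y = 0.4, 0.4 + 1e-12       # two trajectories starting extremely close together

for step in range(1, 41):
    x = r * x * (1 - x)       # x(n+1) = r * x(n) * (1 - x(n))
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: separation = {abs(x - y):.3e}")

# Average logarithmic rate of divergence (Lyapunov exponent), estimated from the
# map's local stretching factor |r * (1 - 2x)| along a long trajectory; for r = 4
# the result is close to ln 2 ~ 0.693 per iteration.
x, total, n = 0.4, 0.0, 100_000
for _ in range(n):
    x = r * x * (1 - x)
    total += math.log(abs(r * (1 - 2 * x)))
print(f"average divergence rate: {total / n:.3f} per iteration (ln 2 = {math.log(2):.3f})")
```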

The concept of the irreversibility of time finds its mathematical formulation in the second law of thermodynamics, for if entropy can only increase, then it is an indication of time's direction. This is the main starting point for Prigogine and Stengers' Order out of Chaos.

Claude Shannon, on the advice of John von Neumann, called transmission loss "entropy"; for Shannon, its inverse is information. As Norbert Wiener put it, just as the amount of information in a system is a measure of its degree of organization, so the entropy of a system is a measure of its degree of disorganization. Metric entropy is the rate of information production; it can be measured, for example, in bits per second (see Crutchfield).
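
A minimal sketch of Shannon's measure, using made-up example distributions and an assumed symbol rate, computes H = -sum(p log2 p) in bits per symbol; multiplied by the symbol rate it becomes an information production rate in bits per second, the units mentioned above for metric entropy.

```python
import math

def shannon_entropy(probs):
    """Entropy H = -sum(p * log2 p), in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximally disorganized source: 2 bits per symbol
skewed = [0.97, 0.01, 0.01, 0.01]    # highly organized, predictable source

for name, dist in (("uniform", uniform), ("skewed", skewed)):
    H = shannon_entropy(dist)
    print(f"{name:>7}: H = {H:.3f} bits/symbol  "
          f"-> {H * 1000:.0f} bits/second at an assumed 1000 symbols/second")
```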

see entropy: interpretations