# entropy

Entropy is a measure of the disorder, or unavailability of useful energy, within a closed system: the more entropy, the less energy is available for doing work. When a system undergoes a reversible change, its entropy changes by an amount equal to the heat transferred to the system divided by the thermodynamic temperature at which the transfer occurs (i.e., the infinitesimal entropy change d*S* when a quantity of heat *δQ* is transferred reversibly at absolute temperature *T* is defined as d*S* = *δQ*/*T*).
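For a finite reversible transfer at constant temperature, the relation reduces to Δ*S* = *Q*/*T*. A minimal sketch of this calculation, using melting ice as an illustrative example (the specific mass and latent-heat value are assumptions for the example, not part of this entry):

```python
# Reversible isothermal heat transfer: delta_S = Q / T.
# Illustrative example: melting 1.0 kg of ice at its melting point.
latent_heat_fusion = 3.34e5   # J/kg, approximate latent heat of fusion of water
mass = 1.0                    # kg (assumed for the example)

Q = mass * latent_heat_fusion  # heat absorbed by the ice, in joules
T = 273.15                     # absolute temperature of the transfer, in kelvin

delta_S = Q / T                # entropy change of the ice, in J/K
print(round(delta_S, 1))       # ≈ 1222.8 J/K
```

The entropy of the ice increases because it absorbs heat; the surroundings lose the same heat at (nearly) the same temperature, so for a reversible transfer the total entropy change is zero.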

Another way to think of entropy is as the number of rearrangements of the ingredients of a system that leave its overall appearance intact; equivalently, it is the amount of information about the microscopic motion of the atoms making up the system that is not determined by a description of the macroscopic state of that system.
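This counting view is captured by Boltzmann's relation *S* = *k*·ln *W*, where *W* is the number of microscopic arrangements consistent with a given macroscopic state and *k* is the Boltzmann constant. A toy sketch (the coin model is an assumption chosen purely for illustration):

```python
import math

# Boltzmann's relation: S = k_B * ln(W), where W counts the microstates
# (microscopic arrangements) compatible with one macroscopic description.
k_B = 1.380649e-23  # Boltzmann constant, in J/K

def entropy(n_microstates: int) -> float:
    """Entropy in J/K for a macrostate realized by n_microstates arrangements."""
    return k_B * math.log(n_microstates)

# Toy model: N coins, where the "macrostate" is just the number of heads.
N = 100
ordered = math.comb(N, 0)   # all tails: exactly 1 arrangement
mixed = math.comb(N, 50)    # half heads: about 1e29 arrangements

print(entropy(ordered))                   # 0.0 — a unique arrangement has zero entropy
print(entropy(mixed) > entropy(ordered))  # True — more arrangements, more entropy
```

The half-heads macrostate has vastly more rearrangements that leave its appearance intact, so it carries more entropy than the perfectly ordered one.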

A change that increases the entropy of a system has a positive entropy change (Δ*S*), and most spontaneous thermodynamic processes are accompanied by such an increase. Entropy has units of joules per kelvin (J K⁻¹); molar entropy is expressed in joules per kelvin per mole.

See also enthalpy.