thermodynamics

A division of physics concerned with the interconversion of heat, work, and other forms of energy, and with the states of physical systems. Being concerned only with bulk matter and energy, classical thermodynamics is independent of theories of their microscopic nature; its axioms are sturdily empirical, and from them theorems are derived with mathematical rigor. Classical thermodynamics is basic to engineering, parts of geology, metallurgy, and physical chemistry.

Building on earlier studies of the thermodynamic functions temperature and heat, Sadi Carnot pioneered the science with his investigations of the cyclic heat engine in 1824, and in 1850 Rudolf Clausius stated the first two laws. Thermodynamics was subsequently developed by Josiah Willard Gibbs, Hermann von Helmholtz, William Thomson (Lord Kelvin), and James Clerk Maxwell.

In thermodynamics, a system is any defined collection of matter: a closed system is one that cannot exchange matter with its surroundings; an isolated system can exchange neither matter nor energy. The state of a system is specified by determining all its properties such as pressure, volume, etc. A system in stable equilibrium is said to be in an equilibrium state, and has an equation of state (e.g., the general gas law) relating its properties. (See also phase equilibria.) A process is a change from one state A to another B, the path being specified by all the intermediate states. A state function is a property or function of properties which depends only on the state and not on the path by which the state was reached; a differential dX of a function X (not necessarily a state function) is termed a perfect differential if it can be integrated between two states to give a value X_AB (= ∫dX from A to B) which is independent of the path from A to B. If this holds for all A and B, X must be a state function.
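
As a minimal numerical sketch (not part of the original article; all quantities below are hypothetical), the ideal gas law pV = nRT serves as an equation of state, and computing the work done along two different paths between the same two states shows why work, unlike internal energy, is not a state function:

    # Sketch: equation of state and path dependence of work for an ideal gas
    # taken between the same two states A and B (hypothetical values).
    import math

    R = 8.314                  # gas constant, J/(mol K)
    n = 1.0                    # amount of gas, mol
    T = 300.0                  # temperature, K (states A and B lie on one isotherm)
    V_A, V_B = 0.010, 0.020    # volumes of states A and B, m^3

    p_A = n * R * T / V_A      # equation of state fixes p once n, V, T are known
    p_B = n * R * T / V_B

    # Path 1: reversible isothermal expansion A -> B
    W_isothermal = n * R * T * math.log(V_B / V_A)

    # Path 2: expand at constant pressure p_A to volume V_B, then cool at
    # constant volume back to state B (no work done in the second step)
    W_two_step = p_A * (V_B - V_A)

    print(W_isothermal, W_two_step)   # different work, same end states
    # The internal energy of an ideal gas depends only on T, so ΔU is the
    # same (zero here) for both paths: U is a state function, W is not.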


Laws of thermodynamics

There are four basic laws of thermodynamics, all having many different formulations that can be shown to be equivalent.

The zeroth law states that, if two systems are each in thermal equilibrium with a third system, then they are in thermal equilibrium with each other. This underlies the concept of temperature.

The first law states that for any process the difference of the heat Q supplied to the system and the work W done by the system equals the change in the internal energy U: ΔU = Q - W. U is a state function, though neither Q nor W separately is. Corollaries of the first law include the law of conservation of energy, Hess' law (see thermochemistry), and the impossibility of perpetual motion machines of the first kind. For more, see first law of thermodynamics.
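
As a brief numerical sketch (hypothetical values, not from the article), heating one mole of an ideal monatomic gas at constant pressure illustrates the bookkeeping ΔU = Q - W:

    # Sketch of the first law, ΔU = Q - W, for one mole of an ideal
    # monatomic gas heated at constant pressure (hypothetical numbers).
    R = 8.314                 # J/(mol K)
    n = 1.0                   # mol
    Cv = 1.5 * R              # molar heat capacity at constant volume
    Cp = Cv + R               # molar heat capacity at constant pressure
    dT = 50.0                 # temperature rise, K

    Q = n * Cp * dT           # heat supplied to the gas
    W = n * R * dT            # work done by the gas: p*ΔV = n*R*ΔT
    dU = Q - W                # first law

    print(Q, W, dU, n * Cv * dT)   # dU agrees with n*Cv*ΔT, as it must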

The second law (in Clausius' formulation) states that heat cannot be transferred from a colder to a hotter body without some other effect, i.e., without work being done. Corollaries include the impossibility of converting heat entirely into work without some other effect, and the impossibility of perpetual motion machines of the second kind. It can be shown that there is a state function entropy, S, defined by ΔS = ∫dQ/T taken along a reversible path, where T is the absolute temperature. The entropy change ΔS in an isolated system is zero for a reversible process and positive for all irreversible processes. Thus entropy tends to a maximum. It also follows that a heat engine is most efficient when it works in a reversible Carnot cycle between two temperatures T1 (the heat source) and T2 (the heat sink), the efficiency being (T1 - T2)/T1. For more, see second law of thermodynamics.
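
A minimal numerical sketch of the Carnot limit (the temperatures below are hypothetical, not from the article):

    # Carnot efficiency (T1 - T2)/T1 for a reversible engine working between
    # a heat source at T1 and a heat sink at T2 (absolute temperatures, K).
    def carnot_efficiency(T1, T2):
        if T2 >= T1 or T2 <= 0:
            raise ValueError("require T1 > T2 > 0 (absolute temperatures)")
        return (T1 - T2) / T1

    # e.g. steam at 573 K rejecting heat to surroundings at 293 K
    print(carnot_efficiency(573.0, 293.0))   # ≈ 0.49: no engine between these
                                             # temperatures can do better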

The third law states that the entropy of any finite system in an equilibrium state tends to a finite value (defined to be zero) as the temperature of the system tends to absolute zero. The equivalent Nernst heat theorem states that the entropy change for any reversible isothermal process tends to zero as the temperature tends to zero. Hence absolute entropies can be calculated from specific heat data. Other thermodynamic functions, useful for calculating equilibrium conditions under various constraints, are: enthalpy (or heat content) H = U + pV; the Helmholtz free energy A = U - TS; and the Gibbs free energy G = H - TS. The free energy represents the capacity of the system to perform useful work. For more, see third law of thermodynamics.
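
As a sketch of how an absolute entropy follows from specific heat data, S(T) = ∫(Cp/T)dT from 0 to T can be evaluated numerically; the heat-capacity function below is a hypothetical Debye-like T³ form (valid only at low temperatures), not measured data:

    # Sketch: absolute entropy S(T) = ∫ (Cp/T) dT from 0 to T, integrated
    # numerically with the trapezoidal rule. Cp(T) is a hypothetical
    # Debye-like a*T^3 form appropriate near absolute zero.
    a = 2.0e-4                          # J/(mol K^4), hypothetical coefficient

    def Cp(T):
        return a * T**3                 # molar heat capacity, J/(mol K)

    def absolute_entropy(T_max, steps=10000):
        dT = T_max / steps
        S = 0.0
        for i in range(steps):
            T_lo = i * dT + 1e-12       # avoid dividing by zero at T = 0
            T_hi = (i + 1) * dT
            S += 0.5 * (Cp(T_lo) / T_lo + Cp(T_hi) / T_hi) * dT
        return S                        # J/(mol K)

    print(absolute_entropy(20.0))       # for a*T^3 the exact value is a*T^3/3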


Quantum statistical thermodynamics

Quantum statistical thermodynamics, based on quantum mechanics, arose in the 20th century. It treats a system as an assembly of particles in quantum states. The entropy is given by S = k log W, where k is the Boltzmann constant and W the thermodynamic probability, i.e., the number of microscopic states consistent with the macroscopic state of the system. Thus entropy is a measure of the disorder of the system.
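
As a minimal sketch (a hypothetical two-state toy model, not from the article), the Boltzmann relation can be evaluated directly by counting microstates:

    # Boltzmann entropy S = k log W for a hypothetical toy system of N
    # two-state particles with exactly half in each state; W is the number
    # of microstates, counted as a binomial coefficient.
    import math

    k = 1.380649e-23           # Boltzmann constant, J/K
    N = 100                    # number of particles (hypothetical)

    W = math.comb(N, N // 2)   # microstates with half the particles "up"
    S = k * math.log(W)        # natural logarithm

    print(W, S)                # more microstates -> more disorder -> higher S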


Related category

   • HEAT AND THERMODYNAMICS