# complexity

Complexity is a phenomenon that has two distinct and almost opposite meanings. The first,
and probably the oldest mathematically, goes back to Andrei Kolmogorov's
attempt to give an algorithmic foundation
to notions of randomness and probability
and to Claude Shannon's study of communication
channels via his notion of information. In both cases, complexity is synonymous
with *disorder* and a lack of structure. The more random a process
is, the greater its complexity. An ideal gas, for example, with its numerous
molecules bouncing around in complete disarray, is complex as far as Kolmogorov
and Shannon are concerned. Thus, in this sense, complexity equates to the
degree of complication. The second, and more recent, notion of "complexity"
refers instead to how structured, intricate, hierarchical, and sophisticated
a natural process is. In particular, it is a property associated with dynamical
systems in which new, unpredictable behavior emerges at scales above
the level of the constituent components. The distinction between these two
meanings can be revealed by answering a simple question about a system:
Is it complex or is it merely complicated? Measures of complexity include
algorithmic (Kolmogorov) complexity, fractal dimension, Lyapunov exponents,
and logical depth.
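
The first notion can be made concrete with a rough experiment. Kolmogorov complexity itself is uncomputable, but the compressed size of a string is a standard upper-bound proxy for it: a structured string is generated by a short program and so compresses well, while a random string has no structure to exploit. The sketch below, with a hypothetical helper `approx_complexity` built on Python's `zlib`, is only an illustration of the idea, not a true complexity measure.

```python
import random
import zlib

def approx_complexity(data: bytes) -> int:
    # Compressed size in bytes: an upper bound (up to an additive constant)
    # on the length of the shortest program that outputs `data`.
    return len(zlib.compress(data, 9))

random.seed(0)  # make the "disordered" sample reproducible

ordered = b"ab" * 5000  # highly structured: a tiny program generates it
disordered = bytes(random.randrange(256) for _ in range(10000))  # random-looking

# The ordered string compresses to a handful of bytes; the random-looking
# string of the same length barely shrinks at all.
print(approx_complexity(ordered), approx_complexity(disordered))
```

By this measure the ideal gas mentioned above is maximally complex: a faithful description of its molecular microstate would be incompressible. That is exactly the intuition the second notion of complexity pushes back against.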