The information content of a message is conventionally quantified in bits (binary digits). Each bit represents a simple alternative: in terms of a message, a yes or a no; in terms of the components of an electrical circuit, a switch that is open or closed. Mathematically, a bit is represented as 0 or 1, and complex messages can be represented as a series of such alternatives. Only five bits are needed to specify any letter of the alphabet, given an appropriate code, since five bits allow 2^5 = 32 distinct patterns, more than the 26 letters.

Being able to quantify information in this way, information theory employs statistical methods to analyze practical communications problems. The errors that arise in the transmission of signals, often termed noise, can be minimized by incorporating redundancy: more bits are transmitted than are strictly necessary to encode the message, so that if some are altered in transit, there is still enough information for the signal to be correctly interpreted. Handling redundant information costs something in reduced transmission speed or capacity, but the reduction in message errors compensates for this loss. Information theorists often point to an analogy between the thermodynamic concept of entropy and the degree of disorder, or uncertainty, in a signal.
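The two ideas above, a five-bit code for letters and error correction through redundancy, can be sketched in a few lines of Python. The function names here are illustrative only, and the triple-repetition scheme is just one simple way of adding redundancy (real systems use more efficient codes), but it shows how extra bits let a single flipped bit be recovered by majority vote.

```python
def encode_letter(ch):
    """Encode 'A'..'Z' as a 5-bit string (2**5 = 32 >= 26 codes)."""
    return format(ord(ch.upper()) - ord('A'), '05b')

def decode_letter(bits):
    """Invert encode_letter: 5-bit string back to a letter."""
    return chr(int(bits, 2) + ord('A'))

def add_redundancy(bits):
    """Triple-repetition code: each message bit is sent three times."""
    return ''.join(b * 3 for b in bits)

def correct_and_strip(received):
    """Majority vote over each triple undoes any single flipped bit."""
    out = []
    for i in range(0, len(received), 3):
        triple = received[i:i + 3]
        out.append('1' if triple.count('1') >= 2 else '0')
    return ''.join(out)

code = encode_letter('Q')       # 'Q' is letter number 16 -> '10000'
sent = add_redundancy(code)     # 15 bits transmitted instead of 5
# Simulate noise: flip one bit in transit.
noisy = sent[:4] + ('0' if sent[4] == '1' else '1') + sent[5:]
assert decode_letter(correct_and_strip(noisy)) == 'Q'
```

Note the trade-off the entry describes: the repetition code triples the number of bits sent (reduced capacity) in exchange for tolerating one corrupted bit per triple.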
Related category: COMPUTERS, ARTIFICIAL INTELLIGENCE, AND CYBERNETICS