
Information theory is the quantitative study of signal transmission. Although it applies chiefly to information technology and communications engineering, in human communication theory it serves mainly as a metaphor for linear transmission between human senders and receivers. Information theory has historical significance, but contemporary theories of human communication rarely refer to it directly. Originating in physics, engineering, and mathematics, the theory addresses uncertainty in code systems, message redundancy, noise, channel capacity, and feedback. This entry defines basic concepts from the field, applies them to language and human communication, and summarizes insights about information transmission.

Information is a measure of uncertainty in a system of signals. Counterintuitively, information theory holds that the greater the information in a system, the greater the uncertainty. This is because more information entails a larger number of possible states, which reduces predictability. The concept of entropy is the starting place for understanding this seemingly contradictory idea.

Entropy, a concept taken from thermodynamics in physics, is the randomness or lack of predictability within a system. Highly entropic situations have little organization, reduced predictability, and therefore great uncertainty. Low-entropy systems have more organization, greater predictability, and therefore less uncertainty. Two dice have more entropy than one die, and a die throw has more entropy than a coin flip. In thermodynamics, as atoms heat up, they “go crazy” and move about frantically in an entropic system. As they cool, they slow down, assume a more organized order, and become more predictable. When entropy is high, there is more information; when it is low, there is less.
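These entropy comparisons can be checked numerically. A minimal sketch using Shannon's formula, H = −Σ p·log₂(p), which is the standard quantitative measure (the entry does not state the formula itself, so it is assumed here):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: sum of -p * log2(p) over nonzero probabilities."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

coin = [1/2] * 2          # coin flip: 2 equally likely outcomes
one_die = [1/6] * 6       # one fair die: 6 equally likely outcomes
two_dice = [1/36] * 36    # ordered pair of fair dice: 36 equally likely outcomes

print(round(entropy(coin), 3))      # 1.0 bit
print(round(entropy(one_die), 3))   # 2.585 bits
print(round(entropy(two_dice), 3))  # 5.17 bits
```

More equally likely alternatives means higher entropy, matching the claim that two dice carry more entropy than one die, and a die more than a coin.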

To grasp this correlation, think of making predictions based on a set of signs. If you worked in a chaotic organization in which so much was going on that you could never tell from one moment to the next what would happen, you would be experiencing “too much” information to process and predict outcomes. If the organization were simpler, with fewer variables to keep track of, prediction would be easier because the information level would be lower. This is like cracking a code: complex codes carry more information and are harder to decipher than simple ones. A completely predictable situation is said to have negentropy. An example of negentropy is the knee-jerk reflex: every time you tap a certain place on your knee, your leg jerks. This is a completely predictable situation.
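Negentropy corresponds to the limiting case of zero entropy. A small illustration, again assuming Shannon's standard entropy formula, which the entry does not give explicitly:

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A completely predictable, "negentropic" event -- like the knee-jerk reflex:
# a single outcome with probability 1 carries zero uncertainty.
reflex = [1.0]
print(entropy(reflex))  # 0.0

# A maximally unpredictable binary signal carries one full bit.
coin = [0.5, 0.5]
print(entropy(coin))    # 1.0
```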

Another way to understand the concept of information is to think of the number of choices you could make in predicting an outcome: the more choices, the lower the predictability, and the greater the information inherent in the system. A complex system has many possible outcomes, choices, or alternatives, while a simple system has fewer. This is why there is more information in throwing a die than in tossing a coin. In the former, you have about a 17% chance of being right; in the latter, a 50% chance.
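The trade-off between number of alternatives and predictability can be made concrete. A sketch under the standard convention that information grows as log₂ of the number of equally likely alternatives (a measure the entry implies but does not state):

```python
import math

# With N equally likely alternatives, the chance of a correct guess is 1/N,
# while the information in the system is log2(N) bits:
# more choices, lower predictability, more information.
for name, n in [("coin toss", 2), ("die throw", 6)]:
    guess = 1 / n            # probability of a correct prediction
    info = math.log2(n)      # information in bits
    print(f"{name}: {guess:.0%} chance, {info:.2f} bits")
```

The die throw gives the 17% chance mentioned above alongside roughly 2.58 bits of information, against the coin's 50% chance and a single bit.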

The inverse of information is redundancy, which is a measure of predictability in the system. Like information, redundancy is a quantitative measure: one minus the ratio of the actual entropy to the maximum entropy possible in the system. Entropy is maximized when all alternatives are equally possible, as would be the case with a fair six-sided die: when such a die is thrown, each side has about a 17% chance of landing up. If the die is fixed and has two sides with one dot, there would be about a 33% chance of predicting this outcome, which reflects redundancy in the system. With the fixed die, there would be more redundancy than with a fair one.
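The redundancy of the fixed die can be computed directly. A minimal sketch, using the standard definition redundancy = 1 − H/H_max and assuming the fixed die shows one dot on two of its six faces:

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Fair die: six equally likely faces, so entropy is at its maximum.
h_max = entropy([1/6] * 6)                  # log2(6), about 2.585 bits

# Fixed die: two faces show one dot, so "one dot" has a 2/6 (~33%) chance.
fixed = [2/6, 1/6, 1/6, 1/6, 1/6]
h = entropy(fixed)

redundancy = 1 - h / h_max
print(round(h, 3), round(redundancy, 3))    # 2.252 0.129
```

The fixed die's entropy falls below the maximum, and the shortfall shows up as positive redundancy; a fair die would have redundancy 0.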

...
