
Claude Shannon's mathematical theory of communication concerns the quantitative limits of mediated communication. The theory has its roots in cryptography and in the measurement of telephone traffic. Paralleling work by the U.S. cybernetician Norbert Wiener and the Soviet logician Andrei N. Kolmogorov, the theory was first published after declassification in 1948. On Wilbur Schramm's initiative, it appeared in 1949 as a book with a brief commentary by Warren Weaver. The theory provided a scientific foundation for the emerging discipline of communication but is now recognized as addressing only parts of the field.

For Shannon, “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point” (Shannon & Weaver, 1949, p. 3). Shannon did not want to confound his theory with psychological issues and considered meanings irrelevant to the problem of using, analyzing, and designing mediated communication. The key to Shannon's theory is that messages are distinguished by selecting them from a set of possible messages, whatever criteria determine that choice. The theory comprises 22 theorems and seven appendices. Its basic idea is outlined as follows.

The Basic Measure

Arguably, informed choices are better than chance, and selecting a correct answer from among many possible answers to a question is more difficult, and hence requires more information, than selecting one from among few. For example, guessing a person's name is more difficult than guessing that person's gender, so the name provides more information than the gender; indeed, the former often implies information about the latter. Intuitively, communication that eliminates all alternatives conveys more information than communication that leaves some of them uncertain. Furthermore, two identical messages should provide no more information than either one alone, and two different messages should provide more information than either by itself.
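As a rough numerical illustration (the specific numbers are hypothetical, not from the original entry): a name drawn from 1,024 equally likely possibilities can be pinned down by 10 yes-or-no questions, since $2^{10} = 1024$, whereas a gender drawn from 2 possibilities needs only 1. Combining the two independent questions multiplies the alternatives to $2 \times 1024 = 2048 = 2^{11}$, while the number of questions needed simply adds: $10 + 1 = 11$.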

To define quantities associated with selecting messages, Shannon proved in his second theorem that the logarithm is the only function that conforms to the above intuitions. Logarithms increase monotonically with the number of alternatives available for selection and are additive when independent choices multiply the number of alternatives. Although the base of this logarithm is arbitrary, Shannon set it to 2, thereby acknowledging that the choice between two equally likely alternatives (answering a yes or no question or turning a switch on or off) is the most elementary choice conceivable. His basic measure, called entropy H, is

$$H(X) = -\sum_{x \in X} p_x \log_2 p_x$$

where $p_x$ is the probability of message $x$ occurring in the set of possible messages $X$. The minus sign ensures that entropies are nonnegative quantities. With $N_X$ as the size of the set $X$ of possible messages, the range of H is

$$0 \le H(X) \le \log_2 N_X$$

H averages the number of binary choices needed to select one message from a larger set or, equivalently, the number of binary digits (bits, for short) needed to enumerate that set. H is interpretable as a measure of uncertainty, variation, disorder, ignorance, or lack of information. When alternatives are equally likely,

$$H(X) = \log_2 N_X$$

and, equivalently, the number of alternatives that a given entropy can distinguish is

$$N_X = 2^{H(X)}$$
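As a minimal computational sketch (not part of the original entry; the distributions are hypothetical), the measure can be evaluated directly in Python:

```python
import math

def entropy(probabilities):
    """Shannon entropy H in bits: the negative sum of p * log2(p).

    Messages with zero probability contribute nothing, matching the
    convention that 0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely messages: H attains its maximum, log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A skewed distribution over the same four messages is less uncertain.
print(entropy([0.7, 0.1, 0.1, 0.1]))      # about 1.36 bits

# A single certain message leaves nothing to choose: H = 0.
print(entropy([1.0]))                      # 0.0
```

The first call illustrates the equally likely case above, where H equals $\log_2 N_X$.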

Entropies and Communication

The additivity of H gives rise to a calculus of communication. For a sender S and a receiver R, one can measure three basic entropies:

1. The uncertainty of messages s occurring at sender S with probability $p_s$:

$$H(S) = -\sum_{s \in S} p_s \log_2 p_s$$

...
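Assuming the remaining basic entropies follow Shannon's standard definitions, namely the receiver entropy H(R) and the joint entropy H(S,R) of sender-receiver pairs, a brief Python sketch can illustrate the calculus (the joint distribution below is hypothetical):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical joint distribution p(s, r): rows are sender messages s,
# columns are receiver messages r.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

p_s = [sum(row) for row in joint]                  # marginal distribution at S
p_r = [sum(col) for col in zip(*joint)]            # marginal distribution at R

H_S = entropy(p_s)                                 # sender entropy H(S)
H_R = entropy(p_r)                                 # receiver entropy H(R)
H_SR = entropy([p for row in joint for p in row])  # joint entropy H(S,R)

# Additivity at work: the amount of information transmitted from S to R
# is T = H(S) + H(R) - H(S,R), which is zero when S and R are independent.
T = H_S + H_R - H_SR
print(H_S, H_R, H_SR, T)  # 1.0, 1.0, ~1.72, ~0.28 bits
```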
