
By Toby Taylor and Rachel Moyle


(Note: This paper was written at the University of South Carolina)

INFORMATION THEORY

Information Theory, otherwise known as the Theory of Communication, has been around almost as long as modern communication itself. With roots in the early twentieth century, its applications are now far-reaching, touching such complex fields as computer science, psychology, and medical diagnosis. Simply put, Information Theory is the study of the transmission of information from one point to another. In transmission, communication is influenced by outside factors such as noise and redundancy. In short, the theory maps out a more advanced version of the children's game "telephone," in which a message is passed through a chain of children. By the end of the game, the message rarely resembles its original state; instead, the last child relays a humorous, distorted version.

Communication is as old as time. According to American Scientist, it can be defined as "that which links any organism together" (Cherry 1952). In order to communicate, a society must assign symbols to language, thus enabling any given message to be received. For instance, the ancient Egyptians used hieroglyphics as a means of transmitting information (Cherry 1952). In 1948, Claude Shannon, an engineer at Bell Laboratories, took the idea of assigning symbols to language and developed a complex theory explaining how symbols are responsible for the transmission of information; together with Warren Weaver, he produced the first formal transmission model of information (Crowley 1982). Their book, The Mathematical Theory of Communication (1949), is considered by many researchers in all fields to be the "information bible," offering a complex mathematical interpretation of a theory that is actually quite simple. At the core of this theory, the authors suggest that all messages follow a set path. Upon sending the message, the information source must encode the language into symbols, readying it to travel along a communication channel. In this channel, factors such as noise and redundancy can influence the transmission. When the message is received in symbol form, it must be decoded into language that can be understood (Ash 1965).
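To make that path concrete, the short Python sketch below models the source-encoder-channel-decoder chain. The encoding scheme and the noise rate are invented for this illustration; Shannon and Weaver's model is abstract and prescribes no particular code.

import random

def encode(message):
    """The information source encodes language into symbols (here, character codes)."""
    return [ord(ch) for ch in message]

def channel(symbols, noise_rate=0.1):
    """The communication channel; noise randomly corrupts symbols in transit.
    The 10% corruption rate is an illustrative assumption."""
    return [s + random.choice([-1, 1]) if random.random() < noise_rate else s
            for s in symbols]

def decode(symbols):
    """The receiver decodes symbols back into language that can be understood."""
    return "".join(chr(s) for s in symbols)

sent = "HELLO"
received = decode(channel(encode(sent)))
print(sent, "->", received)  # noise may distort the message, as in "telephone"

Run repeatedly, the program occasionally prints a garbled word, which is exactly the distortion the telephone-game analogy describes.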

Messages can be encoded and decoded through an infinite number of methods. An English speaker writes the letter "A" on a piece of paper, and the receiver interprets it as the sound "A." Francis Bacon suggested that communication could take place using merely two varying symbols, a concept that became known as his biliteral cipher. Today, Morse code is a prime example of his theory: using only dots and dashes, any message can be communicated. Ancient civilizations, however, realized the usefulness of such a system long before Bacon. Congo tribes used drum beats of high and low pitch to communicate in the bush, and other tribes used short and long smoke signals to serve the same purpose (Cherry 1952). There is even a reference to the importance of binary information in Matthew 5:37: "But let your communication be Yea, yea; Nay, nay; for whatsoever is more than these cometh of evil." Before personal computers became the norm, punched cards transported information using on-or-off teletype systems. Today, computers are programmed in binary code, a system of ones and zeros that stops messages at certain points and reroutes the information to other destinations.
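As a rough illustration of a two-symbol system, the sketch below encodes letters as five-digit groups of ones and zeros, in the spirit of Bacon's cipher. The particular mapping (A = 00000, B = 00001, and so on) is a modern convenience chosen for this example, not Bacon's own alphabet.

def to_biliteral(text):
    """Encode each letter as five binary digits: A=00000, B=00001, ..., Z=11001.
    This mapping is illustrative, not Bacon's historical cipher."""
    return " ".join(format(ord(ch) - ord("A"), "05b")
                    for ch in text.upper() if ch.isalpha())

def from_biliteral(code):
    """Decode five-digit binary groups back into letters."""
    return "".join(chr(int(group, 2) + ord("A")) for group in code.split())

encoded = to_biliteral("Yea")
print(encoded)                  # 11000 00100 00000
print(from_biliteral(encoded))  # YEA

With only two symbols, any message can be spelled out, which is why the same principle serves drum beats, smoke signals, punched cards, and modern computers alike.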

Just as there are many different forms of communication, there are many different channels through which information may flow. Channels vary in their degree of memory; those with no memory are the simplest channel models (Gallager 1968). As the effectiveness of the channel increases, so does the quality of the information being transmitted. Likewise, if a channel is ineffective, the message may arrive garbled or distorted. The function and importance of the channel was a topic of controversy between Shannon and another prominent researcher in the field, Norbert Wiener. Shannon argued that the properties of the channel were the most important elements of communication. Wiener, on the other hand, claimed the channel was a fixed entity and focused instead on the randomly generated "noise" that distorts the message in transmission (Ash 1965). Shannon and Weaver identified three basic criteria on which the transmission of information depends: 1) the condition and capacity of the channel through which the communication occurs; 2) the influence of noise from the general environment; and 3) the amount of redundancy in the message. Wiener contributed the idea of feedback, which became the fourth criterion (Crowley 1982).
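A channel's "effectiveness" can be made precise for the simplest memoryless model, the binary symmetric channel, which flips each transmitted bit with probability p. Its capacity, C = 1 - H(p) bits per use, is a standard result from the literature (covered in texts such as Gallager's), not a formula stated in this paper.

import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p): the uncertainty of a biased coin."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a memoryless binary symmetric channel with flip probability p."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.05, 0.25, 0.5):
    print(f"flip probability {p}: capacity {bsc_capacity(p):.3f} bits per use")

At p = 0.5 the output is independent of the input and the capacity falls to zero, matching the intuition that a thoroughly ineffective channel delivers only a garbled message.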

The efficiency of a channel depends not only on how the channel itself operates but also on several outside factors. Two primary factors can hinder the transmission of a message. When speaking in a crowded restaurant, it is difficult for two people to hear each other if they are sitting at opposite ends of the table. Similarly, a message cannot be relayed in its original form if the channel is interrupted by external noise; familiar examples are snow on a television screen and static on a radio channel. The shorter the message, the easier it is for it to overcome the noise as it passes through the channel. Transmissions that are repeated frequently come to be encoded with fewer symbols for convenience and, consequently, have come to hold less meaning. Because it contains a greater number of symbols, a longer message has a greater chance of being broken up by external circumstances (Skyttner 1998; International 1997).
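The advantage of shorter messages can be quantified with a simple model (an illustration, not a formula from the sources cited here): if each symbol independently survives the channel with probability 1 - p, an n-symbol message arrives intact with probability (1 - p)^n, which shrinks rapidly as n grows.

def prob_intact(n_symbols, symbol_error_rate=0.01):
    """Chance an n-symbol message crosses the channel unbroken, assuming each
    symbol is independently corrupted with the given (illustrative) error rate."""
    return (1 - symbol_error_rate) ** n_symbols

for n in (3, 10, 50, 200):
    print(f"{n:>3} symbols: {prob_intact(n):.1%} chance of arriving intact")

Even a one-percent symbol error rate leaves a 200-symbol message with only about a thirteen-percent chance of surviving untouched.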

Researchers claim that communication is hindered by redundancy as well as by noise. In conversation, a message is often communicated in more words than necessary. For instance, a redundant sentence would give a full description of the subject and the action, such as, "The nimble young man ran aptly over the soft, green grass." In a noisy setting this is an effective way to communicate, because if a few words were lost in the channel due to noise, the receiver would still understand the basic message. In a channel that lacks interference, however, a shorter sentence such as "Boy ran" would still be efficient. One problem with eliminating all redundancy is that a single error in transmission would result in a misunderstood message; artificial redundancy therefore reduces the error rate of a message (Americana 1996). The English language has a redundancy rate of roughly fifty percent. That rate could be reduced if symbols were applied to sounds such as "ing" and "th" rather than to the individual letters that make up a sound. If such a code could be devised, however, a single misused symbol could alter the entire meaning of a message (Cherry 1952).
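A minimal sketch of artificial redundancy is the classic repetition code (chosen here for illustration; the sources describe the principle, not this particular scheme): sending each bit three times and taking a majority vote lets the receiver correct any single error within a group.

import random

def add_redundancy(bits):
    """Repeat every bit three times: 1, 0 -> 1, 1, 1, 0, 0, 0."""
    return [b for b in bits for _ in range(3)]

def noisy_channel(bits, flip_rate=0.1):
    """Flip each transmitted bit with the given (illustrative) probability."""
    return [b ^ 1 if random.random() < flip_rate else b for b in bits]

def remove_redundancy(bits):
    """Majority vote over each group of three corrects a single flipped bit."""
    return [1 if sum(bits[i:i + 3]) >= 2 else 0 for i in range(0, len(bits), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = remove_redundancy(noisy_channel(add_redundancy(message)))
print(message)
print(received)  # usually identical: the redundancy lowered the error rate

Tripling the message length is the price paid for the lower error rate, the same trade-off described above between "Boy ran" and the fully redundant sentence.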

R. V. L. Hartley was a major influence on both Shannon and Weaver in their research. He was the first person to attempt to measure information, proposing that a given amount of information received by an organism can be measured by determining the total work that it does for the organism (Cherry 1978). This idea served as a catalyst for further research in the field.
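Hartley's measure itself (a standard formula from the literature; this paper does not state it) counts the information in a message of n symbols drawn from an alphabet of s symbols as H = n log2(s), the number of binary digits needed to single out one message among the s^n possibilities.

import math

def hartley_information(n_symbols, alphabet_size):
    """Hartley's measure H = n * log2(s): bits needed to identify one message
    among all alphabet_size ** n_symbols possible n-symbol messages."""
    return n_symbols * math.log2(alphabet_size)

print(hartley_information(5, 26))  # a five-letter word: about 23.5 bits
print(hartley_information(5, 2))   # five binary symbols: exactly 5.0 bits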

Though Information Theory was originally drafted in the first half of the twentieth century, it has remained the backbone of the study of communication. When it was written, Shannon and Weaver urged the public not to apply it to human interaction; researchers in various fields nevertheless applied it to their own areas of study, regardless of the authors' wishes. Today it is used to explain everything from computer networking to psychological evaluation. Shannon and Weaver would be disappointed, considering that they regarded the theory in a strictly mathematical sense. The theory has received no real criticism, merely the differences of opinion within the field mentioned above. On the question of redundancy, some do question how far a message can be reduced before it loses its meaning: some argue that much redundancy can still be eliminated, while others claim that any further reduction would sacrifice the value of the message. Its numerous possible applications are believed to have contributed to its acceptance by a broad range of researchers.

Abramson, Norman. Introduction. Information Theory and Coding. New York: McGraw-Hill, 1963.

Ash, Robert. Preface. Information Theory. New York: Interscience Publishers, 1965.

Cherry, Colin. On Human Communication: A Review, a Survey, and a Criticism. Cambridge: MIT Press, 1978.

---. "The Communication of Information (An Historical Overview)." American Scientist 40.4 (October 1952): 640-645.

Crowley, D. J. Understanding Communication: The Signifying Web. New York: Gordon and Breach, 1982.

Gallager, Robert G. Introduction. Information Theory and Reliable Communication. New York: John Wiley and Sons, Inc., 1968.

"Information Theory". Encyclopedia Americana. 1996 ed.

"Information Theory". Encyclopedia Britannica. 1997 ed.

"Information Theory". International Encyclopedia of Information and Library Science. Ed. John Feathers and Paul Sturges. Routledge, England: Routledge, 1997.

Skyttner, Lars. "Information Theory: A Psychological Study in Old and New Concepts." Kybernetes 27.3 (1998). Online. MCB University Press.
