

Definitions of Information

One of the most common ways to define information is to describe it as one or more statements or facts that are received by a human and that have some form of worth to the recipient. For example, the Sesame Street character ``Cookie Monster" describes information as ``news or facts about something," or, as the first definition for information in the Random House College Dictionary puts it, ``knowledge communicated or received concerning a particular fact or circumstance; news." Cookie Monster's definition is consistent with the common notions that information must:
1. be something, although the exact nature (substance, energy, or abstract concept) isn't clear;
2. provide ``new" information: a repetition of previously received messages isn't informative;
3. be ``true": a lie or false or counterfactual information is misinformation, not information;
4. be ``about" something.
This approach to information, like most human-centered approaches, leads one to emphasize the meaning and use of the message, ``what is the message about?" and ``what is already known?", over the information-carrying messenger and the message itself [Art73,AD75,BR76,Far80,Har84,Lev77,MM83]. When the message is essentially random, or is of no value to the recipient, as with a repetition of a message previously received and understood, it is colloquially said that no information was received and none was transmitted.

Some individuals equate information with meaning [Mil87]. Hearing a statement isn't enough to make an event informative; the statement's meaning must be perceived for it to inform. Arguing against this approach, Bar-Hillel points out that ``it is psychologically almost impossible not to make the shift from the one sense of information, ... i.e. information = signal sequence, to the other sense, information = what is expressed by the signal sequences" [BH55]. As Stonier reminds us, ``we must not confuse the detection and/or interpretation of information with information itself" [Sto90]. For many who have worked with quantitative models of information in the engineering disciplines, this concern with meaning lies outside the scope of the traditional mathematical theory of information; communication engineers seldom concern themselves professionally with the meaning of messages.

In an approach similar to defining information as meaning, information is often understood in terms of knowledge that is transmitted to a sentient being. For example, Peters defines information as ``knowledge with the human body taken out of it" [Pet88]. Similarly, information may be understood as ``that which occurs within the mind upon the absorption of a message" [Pra82].

Information and its cousin entropy have long been studied as fundamental characteristics of physical systems and structures. Systems of molecules are often studied by considering an imaginary being, Maxwell's demon, a hypothetical gatekeeper controlling a door between two sections of a closed system of molecules [LR90]. Assume that the molecules move in a frictionless way. If the demon opens and closes the door at just the right times, can it allow particles with higher levels of energy through to one side of the doorway and particles with lower levels of energy through to the other? Is this a perpetual motion machine, with the demon expending no energy while pumping energy to wherever it is wanted? Such a machine would violate the second law of thermodynamics. If the demon can't perform as described, what limits its capabilities?

Attempts to resolve this paradox have centered on the relationship between information and the structure of the system, the information the demon needs and uses in its decision making, and the energy involved in making an observation and remembering the state of the system. The positions of the molecules, that is, the structure of the system, can be described using concepts and formulae consistent with some communication models, such as Shannon's model of communication.

The form or structure of systems is viewed by some as being equivalent to information. Thus, ``information is what remains after one abstracts from the material aspects of physical reality" [Res89]. The information in a structure is an immaterial ghost that co-exists with the physical object about which it informs. The word ``form" comes from the same etymological roots as ``information." Form, or the state of nature, may be reflected in the values of the characteristics in the output of a producing process. The randomness associated with an information-producing process describing a system's form and structure may be understood as the inverse of information. Consider a pile of coins on a counter: knowing whether each coin shows ``heads" or ``tails" is information, while if we lack such knowledge, the coins' orientation is random to us. Receiving information about the orientation of a coin may remove uncertainty, decreasing our ignorance, or lack of information, about the structure.
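One rough way to quantify the coin example, borrowing Shannon's logarithmic measure (one possible measure among several, and assuming a fair coin): a single coin whose orientation is unknown, with ``heads" and ``tails" equally likely, represents log_2 2 = 1 bit of uncertainty; learning its orientation removes exactly that one bit, and a pile of n such independent coins represents n bits.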

Information has long been understood as a concept appropriate for discussion in the humanities and social sciences [Rit91]. Electrical engineers began using the term to describe data transmission during the first half of the twentieth century. Instead of providing a definition of information, these engineers focused on measuring it, attempting to maximize the information transmitted or received, to minimize noise, or both. The social science literature of the 1950s and 1960s drew on the ideas about information measurement developed by Shannon and Weaver in the late 1940s, ideas that crossed back from engineering to the liberal arts. Outside of electrical engineering, Shannon's formal ideas about information are used most profitably today in the computing and cognitive sciences.

All of these ideas about information serve to facilitate discourse for those describing discipline-specific concepts, each used in solving a particular set of problems. Electrical engineers wish to study the capacity of pieces of hardware and the physical connections between them. Linguists wish to understand how information is transmitted by languages and the nature of what lies at the core of communication. Mathematicians and computer scientists wish to study the processes by which software transforms input into output and the fundamental characteristics of transforming processes.

Researchers in these disciplines want tools that manipulate the phenomena of their domains. These needs have produced ideas about information with varying degrees of overlap, as well as areas where they fail to intersect. Given the number of definitions or metaphors that have been proposed for information, how does one compare them? We propose that there is a commonality within these definitions; this underlying commonality can be defined, studied, and measured.

We suggest here a general definition of information: information is produced by all processes, and the values of the characteristics in a process's output are the information. This definition captures most concepts of information in the individual disciplines. The number of possible values in the output and their relative frequencies of occurrence may be used to measure the amount of information present.
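As a sketch of how such a measure might be computed, take Shannon's entropy as one illustrative instance (offered here as an example of measurement along these lines, not as the only measure compatible with the definition above). If a process's output can take n distinct values, and the i-th value occurs with relative frequency p_i, the average amount of information per output is

H = - \sum_{i=1}^{n} p_i \log_2 p_i  bits,

so an output with four equally likely values carries log_2 4 = 2 bits, while an output that always takes the same value carries none.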


Bob Losee
1999-03-10