

A Process

Information is always informative about something, being a component of the output or result of the process. This ``aboutness" or representation is the result of a process or function producing the representation of the input, which might, in turn, be the output of another function and represent its input, and so forth. Consider a common process such as cooking. Baking a cake begins with ingredients and a set of instructions, either written, spoken, or in the mind of the cook. Following the instructions, the cook transforms the ingredients into a sloppy mess which, after an appropriate amount of baking, results in a cake, if one is careful and perhaps lucky.

Examining the cake provides information about both the process and the original ingredients, assuming that the cake may be examined without the act of observation changing the cake. The choice of high-quality ingredients or the addition of a special flavoring will affect the outcome, ideally in a beneficial way. Varying the process, such as the amount of time in the oven or the temperature at which the cake is cooked, also changes the final product, and an examination of the final product provides information about the process used as well as about the ingredients. Note that the information will seldom allow one to fully reconstruct the producing process and its input, and any prior knowledge about the process or its input will aid in the reconstruction. The cooking process changes one set of ingredients, one set of materials, into another set of materials: the cake. The change from one set of materials to the cake provides information about the original materials and the baking process. We may speak of the cooking process as carrying information about the original materials.

A cook can't move backwards from a cooked cake to regenerate the original ingredients; baking is almost always an irreversible process. Other processes may be totally reversible, allowing one to move backwards from the final state to the initial state. In a reversible process, no information is lost (made unrecoverable) during its operation; thus, given the output, one can still move back to the input. A simple reversible process is one that increments the input by 1 and returns the incremented value. One can always take the output and, knowing the nature of the process, move backward to the unique input that produced the output. Non-reversible processes, on the other hand, may lose information as they operate. Given the output of a non-reversible process, one can't always tell which input produced the output. The square function that produces the number 4, for example, could take either +2 or -2 as its input; knowing the result does not provide all the information needed to determine the input to the function. Information about the sign of the original number is lost when squaring occurs, making the process impossible to reverse in all cases. Similarly, if the reader is told that the sum of two numbers is 7, it is impossible to determine whether the two initial numbers were 6 and 1, 4 and 3, or some other combination. One can imagine a reversible variant of this function that produces both the sum and one of the original numbers; one can always move from these two outputs back to the original values.
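To make the contrast concrete, here is a minimal sketch in Python (the function names are ours, chosen for illustration): the increment function can be undone exactly, the square and sum functions cannot, and the reversible variant of the sum described above recovers both addends.

    # Reversible: given the output, the unique input can be recovered.
    def increment(x):
        return x + 1

    def increment_inverse(y):
        return y - 1            # undoes increment exactly

    # Irreversible: the sign of the input is lost.
    def square(x):
        return x * x            # square(2) == square(-2) == 4

    # Irreversible: many input pairs produce the same sum.
    def add(a, b):
        return a + b            # add(6, 1) == add(4, 3) == 7

    # Reversible variant: return the sum together with one of the original
    # numbers, so the other number can always be recovered.
    def add_reversible(a, b):
        return (a + b, a)

    def add_reversible_inverse(total, a):
        return (a, total - a)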

Other types of processes produce information. An interesting phenomenon is found at the quantum level in physics. Consider two particles that are produced from a single process such that they are moving in opposite directions. In many such pairs, each particle has a characteristic which does not ``take on" a value (for either particle) until this value is observed or measured by instrumentation. Because these particles (will) have opposite characteristic values, measuring the value of one of the particles causes or forces the other particle, no matter at what distance, to take on the opposite value for the characteristic. A measuring process here makes information appear or become available; we try to avoid saying that information was ``created" by the measuring process. The measuring process takes the particles that are valueless with regard to the characteristic as input and produces particles that have values.
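The following is not a simulation of quantum mechanics; it is only a toy sketch in Python (with invented names) of the bookkeeping the text describes: the pair is created with no value for the characteristic, and a value becomes available, for both members, only when one of them is measured.

    import random

    # Toy sketch: a pair whose shared characteristic has no value at creation.
    def create_pair():
        return {"a": None, "b": None}

    # Measuring either member makes a value appear; the partner takes the
    # opposite value, regardless of how far apart the two are imagined to be.
    def measure(pair, which):
        if pair[which] is None:
            value = random.choice(["+", "-"])
            opposite = "-" if value == "+" else "+"
            other = "b" if which == "a" else "a"
            pair[which] = value
            pair[other] = opposite
        return pair[which]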

All processes produce information: making cakes and measuring characteristics of sub-atomic particles, physical processes and processes commonly understood as non-physical, describable and indescribable processes. An understanding of the information produced by processes requires some understanding of the nature of a process. Processes may be complex, or they may be simple and easily described and studied. All produce information about the input and the process. The author believes that all processes can be described, given enough time and resources. However, even if some processes cannot be described, it is still useful to recognize the output of the process as ``about" the process itself and the input. Furthermore, the notion of information as the values in the output of a process is helpful in understanding information phenomena.

Processes consistent with assumptions defined by mathematicians may be defined as mathematical functions, such as those obtained by pressing mathematical operator keys on a calculator. These functions take one or more arguments as input and return a single value. A deterministic function acts mechanically: each input produces the same output every time the function is applied. The process of addition, being deterministic and given common mathematical assumptions, will always produce 5 from inputs 2 and 3. Consider the increment function, which returns the value one more than the amount assigned to the argument. This assignment is referred to as the value instantiated or temporarily assigned to the argument. The increment function may be formally defined as f(x) = x+1. Given a value assigned to the variable x in parentheses, the function returns the value on the right-hand side of the equal sign, that is, x+1. For this function, f(2) would return the value (or have the value) 3.
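Written as a short program (a sketch in Python), the deterministic increment function behaves exactly as described: the same argument always yields the same returned value.

    # The increment function f(x) = x + 1.
    def f(x):
        return x + 1

    f(2)   # returns 3, and will return 3 on every evaluation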

Other functions may have probabilistic characteristics and may be able to emulate random processes. The values returned vary depending on occurrences independent of the input. A coin toss might be emulated by a probabilistic function $f(\mathit{heads}, 1/2)$ which returns the value heads approximately one half of the time.
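A sketch of such a probabilistic function in Python, using the standard pseudo-random number generator as the occurrence independent of the input (the names are ours):

    import random

    # f(value, probability): returns `value` with roughly the given probability;
    # the complementary outcome is fixed to "tails" for this coin-toss example.
    def f(value, probability):
        return value if random.random() < probability else "tails"

    f("heads", 1/2)   # "heads" approximately one half of the time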

Processes also may be defined as algorithms, sets of rules that are followed, often in a particular order, to produce an output. A function may be thought of as a computer program or mechanical device that takes the characteristics of its input and produces output with its own characteristics. Every process may be described functionally, that is, as one or more functions. This interrelationship between functions, algorithms, and processes is governed by Church's Thesis, which formally describes the way in which several different descriptive languages or paradigms (e.g., processes, functions, agents, and algorithms) are capable of describing the same processes [KMA82]. For this reason, we can use the terms function, process, and algorithm interchangeably in many circumstances.
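As a small illustration of this interchangeability (a sketch, not anything prescribed by the text), the same doubling process can be written as a direct function or as an algorithm, a set of rules followed in order; both descriptions determine the same mapping from inputs to outputs.

    # One process, two descriptions.

    # As a function: double(x) = x + x.
    def double_as_function(x):
        return x + x

    # As an algorithm: rules followed in order.
    def double_as_algorithm(x):
        total = 0
        for _ in range(2):    # rule: add x to the running total, twice
            total += x
        return total

    assert double_as_function(7) == double_as_algorithm(7) == 14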

Processes always produce an effect, some change in the world, and thus can communicate information about the process and the input. Information occurs when the process produces something.


  
Figure 1: The value of the output of a process is informative about the process and its input.

Information is the value currently attached or instantiated to a characteristic or variable returned by a function, f(x), or a process. The value returned by a function is informative about the function's argument x, or about the function f(), or about both.
This is shown graphically in Figure 1. We use x in our definition to represent either a single variable or a set of variables; for the sake of linguistic simplicity, we treat x as a single variable below. The information is fully contained in f(x), e.g., the received signal, but may not be fully contained in x, what is sent; the process may affect the output. A message is the information present at a point, with the term ``message" being shorthand for this information. The received message is the value of the set of characteristics of y=f(x) and not necessarily of x, the transmitted message; f(x) is informative about x and about f. The input x to the function f(x) is itself informative about the process that produced it, that is, about some other function. For example, a tree falling in a forest produces information in that the process produces an output: pressurized air waves that are perceived as noise by those with unimpaired hearing, the noise being informative about the falling of the tree. The tree itself is informative about the growth process, the original seed, and the nutrients in the soil, among other factors.

The process and its input cause the information to exist in the output, ignoring for present purposes the claimed ability of some to foresee the future. As with any causal phenomenon, the cause must temporally precede the result. The existence of information thus always comes after the process that produced it has occurred.

Two factors affect the information in y=f(x): the processing function itself and the initial values of variables such as x. These factors are combined by the process. The input, when processed, produces an output that is informative about both the input and the process. A function transforms or maps the input into an output, with each input being ``mapped" into a particular output.

The mapping f of a value is from one domain X into another domain Y. Each domain represents a set of possible values, with X being the set of possible values upon which the function operates and Y the set of possible values that can be produced by the function. For example, a small domain of male names may be represented as

\begin{displaymath}
X = \{\mathit{abe}, \quad \mathit{bob}, \quad \mathit{charlie}, \quad \mathit{don}\}.
\end{displaymath}

A function f capitalizing the first letter in a name and mapping from X onto Y might have

\begin{displaymath}
Y = \{\mathit{Abe}, \quad \mathit{Bob}, \quad \mathit{Charlie}, \quad \mathit{Don}\}.
\end{displaymath}

System information is the set of values produced by all possible inputs to a function, Y=f(X), over the domain of X. X is the domain of possible messages that might be transmitted and Y the domain of possible received messages, with the values in Y being the information about the process and the set X. When describing the actions of a function, one may refer to the operation on the domain X as f(X), or one may refer to the action being applied to a specific x as f(x). For example, a square root function may be defined as operating on any positive integer, while it may also be applied to a specific number, e.g., 16. The particular value x is bound to X, taking a specific value. That is, the characteristic X has, in a particular instance, the value x. The mapping is for all possible values in the domain X onto Y=f(X).
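A sketch of the name-capitalization example in Python, distinguishing the application of f to a particular value x from the system information Y = f(X) produced over the whole domain:

    # The domain X of possible input values.
    X = {"abe", "bob", "charlie", "don"}

    # The function f capitalizes the first letter of a name.
    def f(name):
        return name.capitalize()

    # f applied to a particular value x bound from X ...
    y = f("abe")              # "Abe"

    # ... and f applied over the whole domain: the system information Y = f(X).
    Y = {f(x) for x in X}     # {"Abe", "Bob", "Charlie", "Don"}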

Each function physically implements a channel, a causal mechanism that converts or transmits values from the initial domain X to the domain Y. The function itself (as opposed to the value of the function) represents the information transmission process, while the value of the function provides information about the function and its input. System information is based on the relationships between the characteristics' values in the output. The quantity of information is measured by counting the relationships or a surrogate for the relationships. As these relations are produced by functions or processes, and only by them, we may claim that information is contained in, and only in, the values returned by a function.

What is not information? Given our definition, information is not the process itself. The input to the process is not information about the process, although it clearly may be information about another process. The output is also not information by itself--the values in the output are information only in the sense that they are information about the process and the input, that is, information in the context of the process and its input.


Bob Losee
1999-03-10