VI. Information Gathering: Searching For More Valuable Information
The previous section provided a theoretical foundation for determining the best decision rules given information structures. The values of different information structures can be compared in order to prescribe how information should be gathered. Following Marschak and Radner (1978, p. 51), the criterion for comparing two distinct information structures is the total expected payoff yielded by each. That is, let η and η′ be distinct information structures with respect to the random variable Xi. η is not more valuable than η′, relative to a payoff function ω and a probability distribution φ, if

Ω(η) ≤ Ω(η′), (9)

where Ω(η) is the maximum expected payoff over decision rules α based on the signals of η,

Ω(η) = max_α E[ω(α(η(Xi)), Xi)]. (10)
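The comparison in (9) can be sketched numerically. In the sketch below, an information structure is modeled as a partition of the state space, and its maximum expected payoff is computed by choosing the best action for each signal; all states, payoffs, and probabilities are hypothetical illustrations, not values from the text.

```python
# States, prior phi, and payoff omega(action, state) are illustrative only.
states = ["x1", "x2", "x3", "x4"]
phi = {"x1": 0.25, "x2": 0.25, "x3": 0.25, "x4": 0.25}
actions = ["a", "b"]

def omega(action, state):
    # Hypothetical payoff table: action "a" pays off in low states, "b" in high.
    table = {("a", "x1"): 1, ("a", "x2"): 1, ("a", "x3"): 0, ("a", "x4"): 0,
             ("b", "x1"): 0, ("b", "x2"): 0, ("b", "x3"): 1, ("b", "x4"): 1}
    return table[(action, state)]

def value(structure):
    """Maximum expected payoff of an information structure (a partition):
    for each signal (set of states), choose the action that maximizes the
    conditional expected payoff, then sum over signals."""
    total = 0.0
    for signal in structure:
        total += max(sum(phi[x] * omega(a, x) for x in signal) for a in actions)
    return total

eta_fine = [["x1"], ["x2"], ["x3"], ["x4"]]   # complete information
eta_coarse = [["x1", "x2", "x3", "x4"]]       # no information
print(value(eta_fine), value(eta_coarse))     # the finer structure is at least as valuable
```

Here the complete-information partition attains the full payoff, while the no-information partition forces a single action for all states, illustrating why the comparison in (9) favors more informative structures.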
Note that the comparison in (9) does not take into account the cost of information. There are two approaches to such comparison: fineness and garbling (Marschak and Radner, 1978). One information structure η is said to be as fine as another η′ if η is a subpartition of η′; that is, if every set in η is contained in some set in η′. If η and η′ are distinct, and η is as fine as η′, then we shall say that η is finer than η′ (Marschak and Radner, 1978, p. 53). Two information structures may be noncomparable, in the sense that neither is finer than the other. "The finest possible information structure is complete information, defined by"

η(x) = x for every x in Xi

(Marschak and Radner, 1978, p. 54).
"The least fine (or "coarsest") information structure is the partition consisting of the set Xi itself." (Marschak and Radner, 1978, p. 54) This structure contains no information. "It is clear that this information structure gives no information at all that has not already been incorporated into the formulation of the decision problem." (Marschak and Radner, 1978, p. 54) It is desirable to obtain finer information structures because of the following theorem, which is directly derived from Marschak and Radner (1978); the proof can be found in Marschak and Radner (1978, pp. 54-55).

If an information structure η is as fine as an information structure η′, then η is at least as valuable as η′ (Marschak and Radner, 1978, p. 54).
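The subpartition relation underlying fineness can be checked mechanically. The sketch below, with hypothetical partitions, tests whether every set of one partition lies inside some set of the other:

```python
# Check whether partition eta is as fine as partition eta_prime:
# every set of eta must be contained in some set of eta_prime.
def as_fine_as(eta, eta_prime):
    return all(any(set(s).issubset(set(t)) for t in eta_prime) for s in eta)

eta       = [{"x1"}, {"x2"}, {"x3", "x4"}]   # hypothetical finer partition
eta_prime = [{"x1", "x2"}, {"x3", "x4"}]     # hypothetical coarser partition

print(as_fine_as(eta, eta_prime))   # True: eta is a subpartition of eta_prime
print(as_fine_as(eta_prime, eta))   # False: eta_prime is not as fine as eta
```

Since the two partitions are distinct and the relation holds in one direction only, eta is finer than eta_prime in the sense defined above.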
To provide the other approach to comparing information structures, namely garbling, two notions on payoff-adequate partitions and noisy information structures are introduced.
Definition 3: Payoff-adequate Partition (Marschak and Radner, 1978, p. 59). A partition Zi of the set Xi is payoff-adequate if the payoff depends on the state only through the element of Zi containing it; that is, ω(a, x) = ω(a, x′) whenever x and x′ lie in the same set of Zi.
Definition 4: Noiseless Information Structure (Marschak and Radner, 1978, p. 60). An information structure η is noiseless with respect to a payoff-adequate partition Zi if each element zi of Zi is contained in a single set of η; that is, the conditional probability p(y|zi) equals one for the signal y containing zi and zero for all others. Otherwise η is noisy.
Noiseless information in some sense means cleanliness or purity of information. If the signal yielded through an information structure can be fully related to the payoff of a particular decision situation, then the information structure is noiseless. "An average stock price is an example of a noiseless information structure, whereas a market survey would . . . turn out to be noisy" because the outcome depends on many unknown variables (Marschak and Radner, 1978, p. 60).
Based on these two definitions, the well-known Bayes's theorem, relating the prior and posterior probabilities through the conditional probabilities, can be stated as follows:

Theorem 3: Bayes's Theorem (Marschak and Radner, 1978, p. 63). For every signal y and element z of the payoff-adequate partition,

p(z|y) = p(y|z) p(z) / Σ_z′ p(y|z′) p(z′),

where p(z) is the prior probability of z and p(z|y) is the posterior probability of z given the signal y.
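A minimal numerical sketch of Bayes's theorem, with hypothetical priors and likelihoods, computes the posterior probability of a payoff-relevant event z given a signal y:

```python
# Hypothetical prior p(z) and likelihoods p(y|z) for two payoff-relevant
# events and two signals; the numbers are illustrative only.
prior = {"z1": 0.6, "z2": 0.4}
likelihood = {("y1", "z1"): 0.9, ("y1", "z2"): 0.2,
              ("y2", "z1"): 0.1, ("y2", "z2"): 0.8}

def posterior(z, y):
    """Bayes's theorem: p(z|y) = p(y|z) p(z) / sum_z' p(y|z') p(z')."""
    marginal = sum(likelihood[(y, zp)] * prior[zp] for zp in prior)
    return likelihood[(y, z)] * prior[z] / marginal

print(round(posterior("z1", "y1"), 4))
```

The posteriors for a given signal sum to one, as the denominator is the marginal probability of that signal.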
A more concrete definition of comparing information structures with respect to a particular payoff-adequate partition is given as follows:
Definition 5: Garbling Condition (Marschak and Radner, 1978, pp. 64-65). An information structure η̂ is a garbling of η with respect to a payoff-adequate partition Zi if there exist numbers λ(ŷ|y) ≥ 0, with Σ_ŷ λ(ŷ|y) = 1 for every y, that do not depend on zi, such that

p(ŷ|zi) = Σ_y λ(ŷ|y) p(y|zi)

for every signal ŷ of η̂ and every zi in Zi.
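The garbling condition can be illustrated by constructing a garbled structure directly: starting from likelihoods p(y|z) and applying a Markov matrix λ that does not depend on z yields the garbled likelihoods p(ŷ|z). All numbers below are hypothetical.

```python
# Likelihoods p(y|z) of an (illustrative) noiseless structure: each z
# determines its signal with probability one.
p_y_given_z = {"z1": {"y1": 1.0, "y2": 0.0},
               "z2": {"y1": 0.0, "y2": 1.0}}

# Markov (garbling) matrix lam(y_hat|y): rows indexed by y, each row
# summing to one, and independent of z.
lam = {"y1": {"yh1": 0.8, "yh2": 0.2},
       "y2": {"yh1": 0.3, "yh2": 0.7}}

def garble(p, lam):
    """Apply the garbling condition: p(y_hat|z) = sum_y lam(y_hat|y) p(y|z)."""
    return {z: {yh: sum(row[y] * lam[y][yh] for y in row)
                for yh in ("yh1", "yh2")}
            for z, row in p.items()}

p_yhat_given_z = garble(p_y_given_z, lam)
print(p_yhat_given_z["z1"])   # {'yh1': 0.8, 'yh2': 0.2}
```

By construction, the resulting structure satisfies the garbling condition with respect to the original one, and its signals are no longer deterministic functions of z.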
The problem of garbling is indicative of the accuracy of information, and is thus crucial in planning research because it represents one of the most fundamental forms of uncertainty. Problematic garbling pervades the information on which plans are based, regardless of whether that information concerns the environment, values, related decisions, or the search among alternatives (Hopkins, 1981). Planning, in some sense, is equivalent to eliminating problematic garbling and acting accordingly.
The planning problem under the current formulation thus boils down to searching for the coarsest information structure of which the currently available information structure is a garbling, and which is noiseless with respect to the payoff-adequate partition. Put simply, the information gathered in a particular planning situation must be both accurate and relevant to the planner's payoffs. Before giving the complete proof of this corollary of my model of information searching, I first need to prove the following theorem of my own.

Theorem 5: An information structure η that is noiseless with respect to a payoff-adequate partition Zi is at least as valuable as an information structure η̂ that is noisy with respect to Zi.
Proof: According to (10), grouping the terms according to the elements of Zi, and using (7), we have Equation (15). Because η is noiseless with respect to Zi, by definition, p(y|zi) is equal to one for the unique y that contains zi, and zero for all other y's. Equation (15) then simplifies to Equation (16).

Now consider the conditional probability p(ŷ|zi), for every signal ŷ of η̂. Because η̂ is noisy, p(ŷ|zi) need not equal one for any ŷ. Define λ(ŷ|y) as in Equation (17). It follows that Equation (18) holds, because p(y|zi) is, by the definition of noiselessness, equal to one for some y and zero for all other y's; we can set λ(ŷ|y) to p(ŷ|zi) for that y, and set the remaining values of λ arbitrarily, so long as Σ_ŷ λ(ŷ|y) = 1 and Equation (17) holds. Equations (17) and (18) show that η̂ satisfies the garbling condition with respect to η, and hence that η is at least as valuable as η̂, so that Ω(η) ≥ Ω(η̂) (Marschak and Radner, 1978, p. 65).
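The theorem just proved can be checked on a small numerical example, with all payoffs and probabilities hypothetical: the value of the decision problem under a noiseless structure is at least its value under a noisy one.

```python
# Hypothetical two-state, two-action decision problem over the
# payoff-adequate events z1 and z2.
prior = {"z1": 0.5, "z2": 0.5}
actions = ["a", "b"]
payoff = {("a", "z1"): 2, ("a", "z2"): 0, ("b", "z1"): 0, ("b", "z2"): 2}

def value(likelihood, signals):
    """Maximum expected payoff: for each signal y, choose the action
    maximizing sum_z p(z) p(y|z) payoff(a, z), then sum over signals."""
    return sum(max(sum(prior[z] * likelihood[(y, z)] * payoff[(a, z)]
                       for z in prior) for a in actions) for y in signals)

noiseless = {("y1", "z1"): 1.0, ("y1", "z2"): 0.0,
             ("y2", "z1"): 0.0, ("y2", "z2"): 1.0}
noisy     = {("y1", "z1"): 0.7, ("y1", "z2"): 0.3,
             ("y2", "z1"): 0.3, ("y2", "z2"): 0.7}

v_noiseless = value(noiseless, ["y1", "y2"])
v_noisy = value(noisy, ["y1", "y2"])
print(v_noiseless, v_noisy)   # the noiseless value is at least the noisy one
```

In this example the noiseless signals identify the payoff-relevant event exactly, so the best decision rule always matches the action to the state; the noisy signals sometimes mislead, and the attainable expected payoff is strictly lower.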
Theorems 4 and 5 together imply the following corollary.

Corollary: If an information structure η is noiseless with respect to the payoff-adequate partition Zi, and the available information structure η′ is a garbling of η, then η is at least as valuable as η′.
Proof: Since η is a de-garbling of η′ and is noiseless, it follows from Theorems 4 (a de-garbling is at least as valuable as its garbling) and 5 (a noiseless structure is at least as valuable as a noisy one) that the partition induced by η is at least as valuable as that induced by η′.