VI. Information Gathering: Searching For More Valuable Information

The previous section provided a theoretical foundation for determining the best decision rules given an information structure. It is also possible to compare the values of different information structures, and so to prescribe how information should be gathered. Following Marschak and Radner (1978, p. 51), the criterion for comparing two distinct information structures is based on the total expected payoffs yielded by the two information-gathering functions. That is, let η and η' be distinct information structures with respect to the random variable x; η is said to be at least as valuable as η' if

	Ω(η) ≥ Ω(η'),    (9)

where Ω(η) is the maximum expected payoff attainable under η, the maximum being taken over all decision rules based on the signals of η.
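The comparison in (9) can be made concrete with a small numerical sketch. Treating an information structure as a partition of the states, the value Ω(η) is the expected payoff obtained when the best action is chosen separately on each partition set. The states, prior, and payoff table below are my own illustrative assumptions, not an example from Marschak and Radner.

```python
# Sketch of the comparison criterion (9): an information structure eta is a
# partition of the states; its value Omega(eta) is the expected payoff when
# the decision maker picks the best action within each signal (partition set).

prior = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}       # P(x), illustrative
payoff = {                                          # omega(a, x), illustrative
    ("low", 0): 4, ("low", 1): 4, ("low", 2): 0, ("low", 3): 0,
    ("high", 0): 0, ("high", 1): 0, ("high", 2): 4, ("high", 3): 4,
}
actions = ["low", "high"]

def value(eta):
    """Omega(eta): sum over signals of the best conditional expected payoff."""
    return sum(
        max(sum(prior[x] * payoff[(a, x)] for x in signal) for a in actions)
        for signal in eta
    )

complete = [{0}, {1}, {2}, {3}]        # finest partition: x observed exactly
halves   = [{0, 1}, {2, 3}]            # a coarser structure
null     = [{0, 1, 2, 3}]              # no information

print(value(complete), value(halves), value(null))   # 4.0 4.0 2.0
```

Here Ω(complete) = Ω(halves) = 4 > Ω(null) = 2: the coarser structure loses nothing because the payoff depends only on which half the state falls in, a point the notion of payoff-adequate partitions makes precise.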
Note that the comparison in (9) does not take into account the cost of information. There are two approaches to such comparison: fineness and garbling (Marschak and Radner, 1978). One information structure η is said to be as fine as another, η', if η is a subpartition of η'; that is, if every set in η is contained in some set in η'. If η and η' are distinct, and η is as fine as η', then we shall say that η is finer than η' (Marschak and Radner, 1978, p. 53). Two information structures are noncomparable when neither is finer than the other. "The finest possible information structure is complete information," the partition whose sets are the singletons {x} (Marschak and Radner, 1978, p. 54). The least fine (or "coarsest") information structure is the partition consisting of the single set of all states, which conveys no information at all.
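The fineness relation is just a subpartition test. A minimal sketch, using partitions of four illustrative states:

```python
# A partition eta is "as fine as" eta_prime when every set of eta lies inside
# some set of eta_prime (eta is a subpartition of eta_prime).  The helper and
# the example partitions are illustrative assumptions.

def as_fine_as(eta, eta_prime):
    """True if every set in eta is contained in some set in eta_prime."""
    return all(any(s <= t for t in eta_prime) for s in eta)

states   = {0, 1, 2, 3}
complete = [{0}, {1}, {2}, {3}]      # finest: complete information
halves   = [{0, 1}, {2, 3}]
coarsest = [states]                  # least fine: no information

assert as_fine_as(complete, halves)              # complete info is finer
assert as_fine_as(halves, coarsest)
assert not as_fine_as(halves, complete)

# A noncomparable pair: neither partition is finer than the other.
other = [{0, 2}, {1, 3}]
assert not as_fine_as(halves, other) and not as_fine_as(other, halves)
```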
To present the other approach to comparing information structures, namely garbling, two notions are introduced: payoff-adequate partitions and noisy information structures.
Noiseless information in some sense means cleanliness or purity of information. If the signal yielded by an information structure can be fully related to the payoff of a particular decision situation, then the information structure is noiseless. "An average stock price is an example of a noiseless information structure, whereas a market survey would . . . turn out to be noisy" because its outcome depends on many unknown variables (Marschak and Radner, 1978, p. 60). Based on the two definitions, the well-known Bayes's theorem, relating the prior and posterior probabilities through the conditional probabilities, can be written as

	P(x | y) = p(y | x) P(x) / Σ_{x'} p(y | x') P(x'),

where P(x) is the prior probability of state x and p(y | x) is the conditional probability of observing signal y in state x.
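Bayes's theorem as used here can be sketched numerically. The prior and likelihoods below are illustrative assumptions, in the spirit of the quoted market-survey example:

```python
# Posterior probability of each state x given a signal y, via Bayes's theorem:
# P(x | y) = p(y | x) P(x) / sum_x' p(y | x') P(x').

def posterior(prior, likelihood, y):
    """Return P(x | y) for every state x."""
    joint = {x: prior[x] * likelihood[x][y] for x in prior}
    total = sum(joint.values())          # P(y), the normalizing constant
    return {x: joint[x] / total for x in joint}

prior = {"boom": 0.3, "bust": 0.7}                 # illustrative prior
likelihood = {                                      # p(y | x): a noisy survey
    "boom": {"good": 0.8, "bad": 0.2},
    "bust": {"good": 0.1, "bad": 0.9},
}

post = posterior(prior, likelihood, "good")
# P(boom | good) = 0.24 / 0.31, roughly 0.774: the signal revises the prior.
print(post)
```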
A more concrete definition of comparing information structures with respect to a particular payoff-adequate partition is given as follows: an information structure η' is a garbling of η if there exists a Markov matrix [q(y' | y)], with q(y' | y) ≥ 0 and Σ_{y'} q(y' | y) = 1, such that p(y' | x) = Σ_y q(y' | y) p(y | x) for every state x.
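Numerically, garbling is channel composition: the garbled structure's conditional probabilities arise from the original channel followed by a Markov matrix q that scrambles the signals. A minimal sketch, with an illustrative two-state, two-signal channel of my own:

```python
# eta_prime is a garbling of eta when its channel equals eta's channel
# followed by a Markov matrix q: p'(y'|x) = sum_y q(y'|y) p(y|x).

def garble(p, q):
    """Compose channel p (rows indexed by x, columns by y) with Markov
    matrix q (rows indexed by y, columns by y')."""
    return [
        [sum(p_row[y] * q[y][yp] for y in range(len(q)))
         for yp in range(len(q[0]))]
        for p_row in p
    ]

# eta: a noiseless channel -- each state yields one signal with certainty.
p = [[1.0, 0.0],
     [0.0, 1.0]]
# q: with probability 0.2 the signal is flipped before being observed.
q = [[0.8, 0.2],
     [0.2, 0.8]]

p_garbled = garble(p, q)
print(p_garbled)   # [[0.8, 0.2], [0.2, 0.8]] -- the garbled channel is noisy
```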
The problem of garbling is indicative of the accuracy of information, and is thus crucial in planning research because it is one of the most fundamental forms of uncertainty. Problematic garbling pervades the information on which plans are based, regardless of whether that information concerns the environment, values, related decisions, or the search among alternatives (Hopkins, 1981). Planning, in some sense, is equivalent to eliminating problematic garbling and acting accordingly. Under the current formulation, the planning problem boils down to searching for the coarsest information structure of which the currently available information structure is a garbling, and which is noiseless with respect to the payoff-adequate partition. Put simply, the information gathered in a particular planning situation must be both accurate and relevant to the planner's payoffs. Before giving the complete proof of this corollary of my model of information searching, I first need to prove the following theorem of my own.
Because η is noiseless with respect to the payoff-adequate partition Z, p(y | z_i) is equal to one for the single y that contains z_i, those for all other y's being zero; that is, p(y | z_i) = 1 for y ⊇ z_i and p(y | z_i) = 0 otherwise. Equation (15) then simplifies to

	p(y' | z_i) = p(y' | y_i),    (16)

where y_i is the set in η that contains z_i. Consider the conditional probability p(y' | z_i), for every y'. Because η' is noisy, p(y' | z_i) need not equal zero or one for every y'. Let

	q(y' | y_i) = p(y' | z_i).    (17)

It follows that

	p(y' | z_i) = Σ_y q(y' | y) p(y | z_i),    (18)

because q(y' | y) = p(y' | z_i) for y = y_i, while the remaining values of q(y' | y) can be set arbitrarily so that Σ_{y'} q(y' | y) = 1 and Equation (17) still holds; p(y | z_i) is equal to zero for all other y's, so those terms vanish. Equations (17) and (18) show that η' is a garbling of η, and hence that η is at least as valuable as η' (Marschak and Radner, 1978, p. 65).

Theorems 4 and 5 together imply the following corollary: