This section adapts Definition 1 (http://planetmath.org/1introduction#Thmdefn1) to distributed stochastic systems. The first step is to replace elements of the state space with stochastic maps, or equivalently probability distributions on the state space, which are the system’s inputs. Individual elements of the state space correspond to Dirac distributions.
Second, replace the function with the mechanism of the system. Since we are interested in the compositional structure of measurements, we also consider submechanisms of subsystems. However, comparing mechanisms requires that they have the same domain and range, so we extend each submechanism to the entire system.
By abuse of notation we refer to the extension by the same name as the original submechanism. We extend mechanisms implicitly whenever necessary without further comment. Extending mechanisms in this way maps the quale into a cloud of points labeled by the objects of the lattice of subsystems.
In the special case of the initial object (the empty subsystem), define the extended mechanism to be the map that ignores its input and outputs the uniform distribution, corresponding to complete ignorance.
Subsystems differing by non-existent edges (Remark 2 (http://planetmath.org/3distributeddynamicalsystems#Thmrem2)) are mapped to the same mechanism by this construction, thus making the fact that the edges do not exist explicit within the formalism.
Composing an input with a submechanism yields an output, which is a probability distribution on the output alphabet. We are now in a position to define:
Definition 8 (measurement). A measuring device is the dual to the mechanism of a subsystem. An output is a stochastic map from the one-point space to the subsystem’s output alphabet. A measurement is the composition of a measuring device with an output.
Recall that stochastic maps out of the one-point space correspond to probability distributions on their target. Outputs as defined above are thus probability distributions on the output alphabet of the subsystem. Individual elements of the alphabet are recovered as Dirac vectors: element $y$ corresponds to the Dirac distribution $\delta_y$.
When the output is a Dirac vector $\delta_y$ for some element $y$ of the output alphabet, the measurement is the conditional distribution over inputs given that the subsystem outputs $y$.
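To make Definition 8 concrete, here is a minimal Python sketch; the alphabets, the function, and the dict representation are illustrative assumptions, not the paper’s notation. The dual of a stochastic map with respect to the uniform prior acts as a measuring device, and applying it to a Dirac output returns the uniform distribution on a preimage.

```python
# A stochastic map stored as m[x][y] = p(y | x). This one is the
# deterministic map f(0)=f(1)=f(2)=0, f(3)=1 (an assumed example).
X, Y = [0, 1, 2, 3], [0, 1]
f = {0: 0, 1: 0, 2: 0, 3: 1}
m = {x: {y: 1.0 if f[x] == y else 0.0 for y in Y} for x in X}

def dual(m, X, Y):
    """Bayesian inverse over the uniform prior on X: the measuring
    device d with d[y][x] = p(x | y)."""
    d = {}
    for y in Y:
        total = sum(m[x][y] for x in X)
        d[y] = {x: m[x][y] / total for x in X}
    return d

# Measuring the Dirac output δ_0 yields the uniform distribution on
# the preimage f^{-1}(0) = {0, 1, 2}.
print(dual(m, X, Y)[0])
```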
Definition 8 requires some unpacking. To relate it to the classical notion of measurement, Definition 1 (http://planetmath.org/1introduction#Thmdefn1), we consider a system of two nodes whose alphabets are finite sets and whose mechanism is a deterministic function from the first alphabet to the second. In other words, the system corresponds to a single deterministic function.
Proposition 5 (classical measurement). Let the mechanism be a deterministic function $f: X \to Y$. The measurement performed when the system outputs $y \in Y$ is the conditional distribution

$$p(x \mid y) = \begin{cases} \frac{1}{|f^{-1}(y)|} & \text{if } f(x) = y \\ 0 & \text{otherwise,} \end{cases}$$

which generalizes the preimage $f^{-1}(y)$. The effective information generated by the measurement is $ei = \log_2 |X| - \log_2 |f^{-1}(y)|$.

Proof: By Corollary 2 (http://planetmath.org/2stochasticmaps#Thmthm2) the measurement is the conditional distribution above, which is uniform on the preimage and zero elsewhere. The expression for effective information follows immediately. □
Effective information can be interpreted as quantifying a measurement’s precision. It is high if few inputs out of many cause the observed output – i.e. the preimage is small relative to the input space – and low if many inputs cause the observed output – i.e. if the output is relatively insensitive to changes in the input. Precise measurements say a lot about what the input could have been; vague measurements with low effective information say little.
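The precision interpretation can be checked numerically. The sketch below assumes, as in Proposition 5, that the effective information of a deterministic measurement is $\log_2|X| - \log_2|f^{-1}(y)|$; the parity and constant functions are illustrative assumptions, not examples from the text.

```python
from math import log2

def effective_information(f, X, y):
    """ei = log2|X| - log2|f^{-1}(y)|: the KL divergence of the
    uniform-on-preimage posterior from the uniform prior on X."""
    preimage = [x for x in X if f(x) == y]
    return log2(len(X)) - log2(len(preimage))

# Parity on 3-bit strings: every output has a preimage of size 4,
# so observing the parity generates log2(8) - log2(4) = 1 bit.
X = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
parity = lambda x: sum(x) % 2
print(effective_information(parity, X, 0))        # → 1.0

# A constant function is maximally vague: its measurement rules
# nothing out and generates 0 bits.
print(effective_information(lambda x: 0, X, 0))   # → 0.0
```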
The point of this paper is to develop techniques for studying measurements constructed out of two or more functions. We therefore present computations for the simplest distributed case in considerable detail: a system whose graph has two input nodes $X_1$ and $X_2$, each with an edge into a single output node, so that the mechanism is a function $f: X_1 \times X_2 \to Y$ of both inputs.
The remainder of this section and most of the next analyze measurements in the lattice of subsystems.
Proposition 6 (partial measurement).
The measurement performed on $X_1$ when the system outputs $y$, treating $X_2$ as extrinsic noise, is the conditional distribution

$$p(x_1 \mid y) = \frac{\big| f^{-1}(y) \cap (\{x_1\} \times X_2) \big|}{|f^{-1}(y)|},$$

where $f^{-1}(y)$ is the preimage of $y$ under the mechanism $f: X_1 \times X_2 \to Y$. The effective information generated by the partial measurement is

$$ei = \sum_{x_1 \in X_1} p(x_1 \mid y) \cdot \log_2 \big( |X_1| \cdot p(x_1 \mid y) \big).$$

Proof: Treating $X_2$ as a source of extrinsic noise yields a mechanism which takes $x_1$ to the average $\frac{1}{|X_2|} \sum_{x_2 \in X_2} \delta_{f(x_1, x_2)}$. The dual of this mechanism, applied to the Dirac output $\delta_y$, is the conditional distribution above. The computation of effective information follows immediately. □
A partial measurement is precise if the preimage $f^{-1}(y)$ has small or empty intersection with the fiber $\{x_1\} \times X_2$ for most values of $x_1$, and large intersection for few.
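A small numerical sketch of partial measurement, with AND and XOR as illustrative choices of mechanism (assumptions, not examples from the text): treating the second input as uniform noise, the posterior over the first input and its effective information can be computed directly.

```python
from math import log2

def partial_measurement(f, X1, X2, y):
    """Posterior over X1 given output y, treating X2 as uniform
    extrinsic noise: p(x1 | y) ∝ |{x2 : f(x1, x2) == y}|."""
    counts = {x1: sum(1 for x2 in X2 if f(x1, x2) == y) for x1 in X1}
    total = sum(counts.values())
    return {x1: c / total for x1, c in counts.items()}

def partial_ei(f, X1, X2, y):
    """KL divergence of the posterior from the uniform prior on X1."""
    p = partial_measurement(f, X1, X2, y)
    return sum(q * log2(len(X1) * q) for q in p.values() if q > 0)

bits = [0, 1]
# AND: output 1 pins x1 = 1, a perfectly precise partial measurement.
print(partial_ei(lambda a, b: a & b, bits, bits, 1))   # → 1.0
# XOR: the noise in x2 scrambles x1 completely, so 0 bits.
print(partial_ei(lambda a, b: a ^ b, bits, bits, 1))   # → 0.0
```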
Propositions 5 and 6 compute effective information of a measurement relative to the null context provided by complete ignorance (the uniform distribution). We can also compute the effective information generated by a measurement in the context of a submeasurement:
Proposition 7 (relative measurement).
The information generated by the measurement of the entire system in the context of the partial measurement on $X_1$, where $X_2$ is unobserved noise, is

$$ei = \sum_{x_1 \in X_1} p(x_1 \mid y) \cdot \Big( \log_2 |X_2| - \log_2 \big| f_{x_1}^{-1}(y) \big| \Big), \qquad (7)$$

where $f_{x_1}: X_2 \to Y$ denotes the mechanism with first argument fixed at $x_1$.
To interpret the result, decompose the mechanism into a family of functions $\{f_{x_1}: X_2 \to Y\}$ labeled by elements of $X_1$, where $f_{x_1}(x_2) = f(x_1, x_2)$. The precision of the measurement performed by $f_{x_1}$ is $\log_2 |X_2| - \log_2 |f_{x_1}^{-1}(y)|$. It follows that the precision of the relative measurement, Eq. (7), is the expected precision of the measurements performed by the family, taken with respect to the probability distribution generated by the noisy partial measurement.
In the special case where one measurement is a submeasurement of the other, relative precision is simply the difference of the precisions of the larger and smaller subsystems:

Corollary 8 (comparing measurements). The effective information generated by the measurement of the entire system in the context of the partial measurement equals the difference of the effective informations generated by the full and partial measurements.
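The corollary can be sanity-checked numerically. In this sketch the toy mechanism and all function names are assumptions for illustration; it verifies that full precision splits as partial precision plus relative precision.

```python
from math import log2

def ei_full(f, X1, X2, y):
    """Precision of the full measurement: log2|X1×X2| - log2|f^{-1}(y)|."""
    pre = sum(1 for a in X1 for b in X2 if f(a, b) == y)
    return log2(len(X1) * len(X2)) - log2(pre)

def ei_partial(f, X1, X2, y):
    """Precision of the partial measurement on X1, with X2 uniform noise."""
    counts = [sum(1 for b in X2 if f(a, b) == y) for a in X1]
    N = sum(counts)
    return sum((c / N) * log2(len(X1) * c / N) for c in counts if c)

def ei_relative(f, X1, X2, y):
    """Eq. (7): expected precision of the family f_{x1} under p(x1 | y)."""
    counts = [sum(1 for b in X2 if f(a, b) == y) for a in X1]
    N = sum(counts)
    return sum((c / N) * (log2(len(X2)) - log2(c)) for c in counts if c)

X1 = X2 = [0, 1, 2]
f = lambda a, b: (a + b) % 2          # an assumed toy mechanism
full, part, rel = (g(f, X1, X2, 1) for g in (ei_full, ei_partial, ei_relative))
print(abs(full - (part + rel)) < 1e-12)   # → True
```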
|Date of creation|2014-04-22 19:36:22|
|Last modified on|2014-04-22 19:36:22|
|Last modified by|rspuzio (6075)|