4. Measurement
This section adapts Definition 1 (http://planetmath.org/1introduction#Thmdefn1) to distributed stochastic systems. The first step is to replace elements of state space $X$ with stochastic maps $\{*\}\rightarrow X$, or equivalently probability distributions on $X$, which are the system's inputs. Individual elements of $X$ correspond to Dirac distributions $\delta_x$.
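To make the correspondence concrete, here is a minimal sketch in Python with NumPy, assuming the matrix representation of stochastic maps from Section 2 (columns as distributions); the helper dirac and the state-space size are illustrative, not from the original:

    import numpy as np

    # A stochastic map X -> Y is a column-stochastic matrix: entry [y, x]
    # holds the probability of output y given input x. An input {*} -> X
    # is then just a probability distribution on X, stored as a vector.

    def dirac(x, n):
        """The element x of an n-element state space, as a Dirac distribution."""
        d = np.zeros(n)
        d[x] = 1.0
        return d

    p_ignorance = np.full(4, 1 / 4)   # complete ignorance about a 4-element X
    delta_2 = dirac(2, 4)             # the individual element x = 2
    assert np.isclose(p_ignorance.sum(), 1) and np.isclose(delta_2.sum(), 1)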
Second, replace function $f:X\rightarrow Y$ with mechanism $m_D: V_{D^{in}}\rightarrow V_{D^{out}}$. Since we are interested in the compositional structure of measurements we also consider submechanisms $m_C$ for subsystems $C\subset D$. However, comparing mechanisms requires that they have the same domain and range, so we extend $m_C$ to the entire system as follows:
(1) $m_C:\; V_{D^{in}} \xrightarrow{\;\pi\;} V_{C^{in}} \xrightarrow{\;m_C\;} V_{C^{out}} \xrightarrow{\;\pi^{\ddagger}\;} V_{D^{out}},$

where $\pi$ marginalizes an input distribution onto the inputs read by $C$ and the dual $\pi^{\ddagger}$ pads the outputs not produced by $C$ with the uniform distribution.
We refer to the extension as $m_C$ by abuse of notation. We extend mechanisms implicitly whenever necessary without further comment. Extending mechanisms in this way maps the quale into a cloud of points in the space of stochastic maps $V_{D^{in}}\rightarrow V_{D^{out}}$, labeled by objects in the lattice of subsystems of $D$.
In the special case of the initial object $\emptyset$, define the extension $m_{C_\emptyset}$ to be the map sending every input to the uniform distribution on $V_{D^{out}}$.
Remark 3.
Subsystems differing by non-existent edges (Remark 2 (http://planetmath.org/3distributeddynamicalsystems#Thmrem2)) are mapped to the same mechanism by this construction, thus making the fact that the edges do not exist explicit within the formalism.
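The extension can be realized on matrices as a marginalization followed by a uniform padding; the following sketch is one reading of Eq. (1) under the column-stochastic convention above, with hypothetical names and alphabet sizes:

    import numpy as np

    def extend(m_C, n_in_extra, n_out_extra):
        """Extend a submechanism m_C : V_C_in -> V_C_out to the whole system:
        marginalize the full input onto the inputs C actually reads, apply
        m_C, then pad the outputs C does not produce with uniform noise."""
        marginalize = np.kron(np.eye(m_C.shape[1]), np.ones((1, n_in_extra)))
        pad_uniform = np.kron(np.eye(m_C.shape[0]),
                              np.full((n_out_extra, 1), 1 / n_out_extra))
        return pad_uniform @ m_C @ marginalize

    m_C = np.array([[0.9, 0.2],      # a toy submechanism on binary alphabets
                    [0.1, 0.8]])
    m_ext = extend(m_C, n_in_extra=2, n_out_extra=2)
    assert np.allclose(m_ext.sum(axis=0), 1.0)   # still column-stochastic

Since unread inputs are marginalized away before $m_C$ is applied, subsystems differing only by non-existent edges extend to the same matrix, as Remark 3 states.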
Composing an input $d:\{*\}\rightarrow V_{D^{in}}$ with a submechanism $m_C$ yields an output $m_C\circ d$, which is a probability distribution on $V_{D^{out}}$. We are now in a position to define
Definition 8.
A measuring device is the dual $m_C^{\ddagger}$ to the mechanism of a subsystem $C\subset D$. An output is a stochastic map $\omega:\{*\}\rightarrow V_{D^{out}}$. A measurement is a composition $m_C^{\ddagger}\circ\omega:\{*\}\rightarrow V_{D^{in}}$.
Recall that stochastic maps of the form $\{*\}\rightarrow V_{D^{out}}$ correspond to probability distributions on $V_{D^{out}}$. Outputs as defined above are thus probability distributions on $V_{D^{out}}$, the output alphabet of $D$. Individual elements of $V_{D^{out}}$ are recovered as Dirac vectors: $\delta_y$ for $y\in V_{D^{out}}$.
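A sketch of Definition 8 in the same representation; the uniform-prior Bayesian inversion follows Corollary 2, and the function names are mine:

    import numpy as np

    def dual(m):
        """Measuring device: the dual m‡ of a column-stochastic map m,
        i.e. Bayesian inversion with respect to the uniform prior on the
        inputs. Column y of the result is the posterior p(. | y)."""
        mt = m.T.copy()
        col_sums = mt.sum(axis=0)
        mt[:, col_sums > 0] /= col_sums[col_sums > 0]
        return mt

    def measurement(m, omega):
        """Compose an output distribution omega with the measuring device,
        yielding a distribution over the inputs that could have produced it."""
        return dual(m) @ omega

    f = np.array([[1, 1, 1, 0],       # deterministic f : 4 inputs -> 2 outputs
                  [0, 0, 0, 1]], dtype=float)
    print(measurement(f, np.array([1.0, 0.0])))   # uniform on the preimage of 0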
Definition 9.
The effective information generated by measurement $m_{C_2}^{\ddagger}\circ\omega$ in the context of subsystem $C_1\subset C_2$ is
(2) $ei(m_{C_1}\rightarrow m_{C_2},\,\omega) \;:=\; H\!\left[\,m_{C_2}^{\ddagger}\circ\omega \;\middle\|\; m_{C_1}^{\ddagger}\circ\omega\,\right].$
The null context, corresponding to the empty subsystem $C_\emptyset$, is a special case where $m_{C_\emptyset}^{\ddagger}\circ\omega$ is replaced by the uniform distribution on $V_{D^{in}}$. To simplify notation define $ei(m_C,\,\omega) := ei(m_{C_\emptyset}\rightarrow m_C,\,\omega)$.
Here, $H[p\|q] = \sum_x p(x)\log\frac{p(x)}{q(x)}$ is the Kullback-Leibler divergence or relative entropy [1]. Eq. (2) expands as
(3) $ei(m_{C_1}\rightarrow m_{C_2},\,\omega) \;=\; \sum_{x\in V_{D^{in}}} \left(m_{C_2}^{\ddagger}\circ\omega\right)\!(x)\cdot \log\frac{\left(m_{C_2}^{\ddagger}\circ\omega\right)\!(x)}{\left(m_{C_1}^{\ddagger}\circ\omega\right)\!(x)}.$
When $\omega = \delta_y$ for some $y\in V_{D^{out}}$ we have
(4) $ei(m_{C_1}\rightarrow m_{C_2},\,\delta_y) \;=\; \sum_{x\in V_{D^{in}}} p_{C_2}(x|y)\cdot\log\frac{p_{C_2}(x|y)}{p_{C_1}(x|y)},$

where $p_{C_i}(x|y)$ is the conditional distribution corresponding to $m_{C_i}^{\ddagger}\circ\delta_y$.
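The quantities in Eqs. (2)-(4) are then short computations; a sketch measuring in bits (the choice of log base 2 is an assumption) and reusing dual from the sketch after Definition 8:

    import numpy as np

    def dual(m):                       # as in the sketch after Definition 8
        mt = m.T.copy()
        s = mt.sum(axis=0)
        mt[:, s > 0] /= s[s > 0]
        return mt

    def kl(p, q):
        """Kullback-Leibler divergence H[p || q], in bits."""
        mask = p > 0
        return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

    def ei(m_small, m_large, omega):
        """Eq. (2): effective information of the larger subsystem's
        measurement in the context of the smaller subsystem's."""
        return kl(dual(m_large) @ omega, dual(m_small) @ omega)

    def ei_null(m, omega):
        """Null context: the context measurement is complete ignorance,
        the uniform distribution on the inputs."""
        n_inputs = m.shape[1]
        return kl(dual(m) @ omega, np.full(n_inputs, 1 / n_inputs))

    f = np.array([[1, 1, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    print(ei_null(f, np.array([0.0, 1.0])))   # 2.0 bits; cf. Proposition 5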
Definition 8 requires some unpacking. To relate it to the classical notion of measurement, Definition 1 (http://planetmath.org/1introduction#Thmdefn1), we consider system $D = \{v_X\rightarrow v_Y\}$, where the alphabets of $v_X$ and $v_Y$ are the sets $X$ and $Y$ respectively, and the mechanism of $v_Y$ is $f$. In other words, system $D$ corresponds to a single deterministic function $f:X\rightarrow Y$.
Proposition 5 (classical measurement).
The measurement performed when deterministic function $f$ outputs $y\in Y$ is equivalent to the preimage $f^{-1}(y)$. Effective information is $ei(f,\,\delta_y) = \log\frac{|X|}{|f^{-1}(y)|}$.
Proof: By Corollary 2 (http://planetmath.org/2stochasticmaps#Thmthm2) measurement $f^{\ddagger}\circ\delta_y$ is conditional distribution

$p(x|y) \;=\; \begin{cases} \frac{1}{|f^{-1}(y)|} & \text{if } f(x)=y \\ 0 & \text{otherwise,} \end{cases}$

which generalizes the preimage. Effective information follows immediately.
Effective information can be interpreted as quantifying a measurement's precision. It is high if few inputs cause $f$ to output $y$ out of many, i.e. if $f^{-1}(y)$ has few elements relative to $|X|$; it is low if many inputs cause $f$ to output $y$, i.e. if the output is relatively insensitive to changes in the input. Precise measurements say a lot about what the input could have been, and conversely for vague measurements with low effective information.
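A quick numerical check of Proposition 5 on a toy deterministic function, encoded as a lookup table (the table itself is illustrative):

    import numpy as np

    def classical_ei(f_table, y):
        """Proposition 5: ei(f, delta_y) = log |X| - log |f^{-1}(y)|, in bits."""
        preimage = [x for x, fx in enumerate(f_table) if fx == y]
        return np.log2(len(f_table) / len(preimage))

    f_table = [0, 0, 0, 1]            # f : {0,1,2,3} -> {0,1}
    print(classical_ei(f_table, 1))   # 2.0 bits: output 1 pins the input down
    print(classical_ei(f_table, 0))   # ~0.42 bits: output 0 leaves 3 candidates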
The point of this paper is to develop techniques for studying measurements constructed out of two or more functions. We therefore present computations for the simplest case, distributed system $X_1\otimes X_2\xrightarrow{\,f\,} Y$, in considerable detail. Let $G$ be the graph
$v_{X_1}\xrightarrow{\;e_1\;} v_Y \xleftarrow{\;e_2\;} v_{X_2}$, with obvious assignments of alphabets and the mechanism of $v_Y$ as $f: X_1\otimes X_2\rightarrow Y$. To make the formulas more readable let $X = X_1\otimes X_2$, $x = (x_1,x_2)$ and $m_D = f$. We then obtain the lattice of subsystems

$C_\emptyset \;\subset\; \{v_{X_1}\rightarrow v_Y\},\;\{v_{X_2}\rightarrow v_Y\} \;\subset\; D,$

with bottom element $C_\emptyset$, top element $D$, and the two partial subsystems incomparable in between.
The remainder of this section and most of the next analyze measurements in this lattice.
Proposition 6 (partial measurement).
The measurement performed on $X_1$ when $f$ outputs $y$, treating $X_2$ as extrinsic noise, is conditional distribution
(5) $p(x_1|y) \;=\; \frac{\big|f_{x_1}^{-1}(y)\big|}{\big|f^{-1}(y)\big|},$
where $f_{x_1}^{-1}(y) := \{x_2\in X_2 \mid f(x_1,x_2) = y\}$. The effective information generated by the partial measurement is
(6) $ei(m_{X_1},\,\delta_y) \;=\; \sum_{x_1\in X_1} \frac{\big|f_{x_1}^{-1}(y)\big|}{\big|f^{-1}(y)\big|}\cdot \log\!\left(|X_1|\cdot\frac{\big|f_{x_1}^{-1}(y)\big|}{\big|f^{-1}(y)\big|}\right).$
Proof: Treating $X_2$ as a source of extrinsic noise yields $m_{X_1}: X_1\rightarrow Y$, which takes $x_1\mapsto \frac{1}{|X_2|}\sum_{x_2\in X_2}\delta_{f(x_1,x_2)}$. The dual is

$m_{X_1}^{\ddagger}:\; \delta_y \;\mapsto\; \sum_{x_1\in X_1} \frac{\big|f_{x_1}^{-1}(y)\big|}{\big|f^{-1}(y)\big|}\cdot\delta_{x_1}.$

The computation of effective information follows immediately.
A partial measurement is precise if the preimage $f^{-1}(y)$ has small or empty intersection with $\{x_1\}\otimes X_2$ for most $x_1$, and large intersection for few $x_1$.
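Eqs. (5) and (6) only require counting slices of the preimage. A sketch on a hypothetical mechanism $f(x_1,x_2) = x_1\wedge x_2$ (binary AND):

    import numpy as np

    def partial_posterior(f, X1, X2, y):
        """Eq. (5): p(x1 | y) with x2 treated as extrinsic noise."""
        counts = {x1: sum(1 for x2 in X2 if f(x1, x2) == y) for x1 in X1}
        total = sum(counts.values())          # equals |f^{-1}(y)|
        return {x1: c / total for x1, c in counts.items()}

    def partial_ei(f, X1, X2, y):
        """Eq. (6): precision of the partial measurement, in bits."""
        p = partial_posterior(f, X1, X2, y)
        return sum(q * np.log2(len(X1) * q) for q in p.values() if q > 0)

    def f(x1, x2):
        return x1 & x2

    X1 = X2 = [0, 1]
    print(partial_posterior(f, X1, X2, 1))   # {0: 0.0, 1: 1.0}
    print(partial_ei(f, X1, X2, 1))          # 1.0 bit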
Propositions 5 and 6 compute effective information of a measurement relative to the null context provided by complete ignorance (the uniform distribution). We can also compute the effective information generated by a measurement in the context of a submeasurement:
Proposition 7 (relative measurement).
The information generated by measurement $m_D^{\ddagger}\circ\delta_y$ in the context of the partial measurement, where $X_2$ is unobserved noise, is
(7) $ei(m_{X_1}\rightarrow m_D,\,\delta_y) \;=\; \sum_{x_1\in X_1} p(x_1|y)\cdot\log\frac{|X_2|}{\big|f_{x_1}^{-1}(y)\big|}.$
To interpret the result, decompose $f$ into a family of functions $\{f_{x_1}: X_2\rightarrow Y \mid x_1\in X_1\}$ labeled by elements of $X_1$, where $f_{x_1}(x_2) := f(x_1,x_2)$. The precision of the measurement performed by $f_{x_1}$ is $\log\frac{|X_2|}{|f_{x_1}^{-1}(y)|}$. It follows that the precision of the relative measurement, Eq. (7), is the expected precision of the measurements performed by the family $\{f_{x_1}\}$, taken with respect to the probability distribution $p(x_1|y)$ generated by the noisy measurement.
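Continuing the AND example, Eq. (7) can be computed directly as the expected precision of the family $\{f_{x_1}\}$:

    import numpy as np

    def relative_ei(f, X1, X2, y):
        """Eq. (7): expected precision of the measurements f_{x1}, weighted
        by the posterior p(x1 | y) of the noisy partial measurement."""
        counts = {x1: sum(1 for x2 in X2 if f(x1, x2) == y) for x1 in X1}
        total = sum(counts.values())
        return sum((c / total) * np.log2(len(X2) / c)
                   for c in counts.values() if c > 0)

    def f(x1, x2):
        return x1 & x2

    X1 = X2 = [0, 1]
    print(relative_ei(f, X1, X2, 1))   # 1.0 bit: given x1 = 1, y = 1 fixes x2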
In the special case of deterministic mechanisms, relative precision is simply the difference of the precision of the larger and smaller subsystems:

Corollary 8 (comparing measurements).
$ei(m_{X_1}\rightarrow m_D,\,\delta_y) \;=\; ei(m_D,\,\delta_y) - ei(m_{X_1},\,\delta_y).$

This follows by direct computation from Propositions 5, 6 and 7.
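On the same example the decomposition in Corollary 8 checks out numerically: the full measurement generates 2 bits by Proposition 5, matching the 1 bit from the partial sketch plus the 1 bit from the relative sketch above:

    import numpy as np

    def f(x1, x2):
        return x1 & x2

    X1 = X2 = [0, 1]
    y = 1
    preimage = [(a, b) for a in X1 for b in X2 if f(a, b) == y]
    ei_full = np.log2(len(X1) * len(X2) / len(preimage))   # Proposition 5
    assert np.isclose(ei_full, 1.0 + 1.0)   # partial_ei + relative_ei from above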
References
- 1 E. T. Jaynes (1985): Entropy and Search Theory. In C. R. Smith & W. T. Grandy, editors: Maximum-Entropy and Bayesian Methods in Inverse Problems, Springer.