1. Introduction
Any classical physical system (by which we simply mean any deterministic function) can be taken as a measuring apparatus or input/output device. For example, a thermometer takes inputs from the atmosphere and outputs numbers on a digital display. The thermometer categorizes inputs by temperature and is blind to, say, differences in air pressure.
Classical measurements are formalized as follows:
Definition 1.
Given a classical physical system with state space $X$, a measuring device is a function $f: X \to \mathbb{R}$. The output $f(x)$ is the reading and the pre-image $f^{-1}(r) \subset X$ is the measurement.
From this point of view a thermometer and a barometer are two functions, $t$ and $b$, mapping the state space of configurations (positions and momenta) of atmospheric particles to real numbers. When the thermometer outputs reading $r$, it specifies that the atmospheric configuration was in the pre-image $t^{-1}(r)$ which, assuming the thermometer perfectly measures temperature, consists of exactly the atmospheric configurations with temperature $r$. Similarly, the pre-images generated by the barometer group atmospheric configurations by pressure.
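As an informal illustration (the four-state atmosphere and the two devices below are hypothetical, chosen only to make the pre-image picture of Definition 1 concrete), a measuring device and its measurements can be sketched in a few lines of Python:

```python
from collections import defaultdict

def preimages(f, states):
    """Group states by the reading they produce: reading r -> pre-image f^{-1}(r)."""
    groups = defaultdict(set)
    for x in states:
        groups[f(x)].add(x)
    return dict(groups)

# Toy atmosphere: each state is a (temperature, pressure) pair.
states = [(t, p) for t in (10, 20) for p in (1000, 1020)]
thermometer = lambda s: s[0]   # reads temperature, blind to pressure
barometer = lambda s: s[1]     # reads pressure, blind to temperature

# Each reading corresponds to the pre-image of configurations producing it.
print(preimages(thermometer, states))
```

The thermometer's pre-image for reading 10 contains both pressure values, reflecting that the device cannot distinguish them.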
The classical definition of measurement takes a thermometer as a monolithic object described by a single function from atmospheric configurations to real numbers. The internal structure of the thermometer – the fact that it is composed of countless atoms and molecules arranged in an extremely specific manner – is swept under the carpet (or, rather, into the function).
This paper investigates the structure of measurements performed by distributed systems. We do so by adapting Definition 1 to a large class of systems that contains networks of Boolean functions [10], Conway’s Game of Life [7] and Hopfield networks [9, 2] as special cases.
Our motivation comes from prior work investigating information processing in discrete neural networks [4, 5]. The brain can be thought of as an enormously complicated measuring device mapping sensory states and prior brain states to subsequent brain states. Analyzing the functional dependencies implicit in cortical computations reduces to analyzing how the measurements performed by the brain are composed out of submeasurements by subdevices such as individual neurons and neuronal assemblies. The cortex is of particular interest since it seemingly effortlessly integrates diverse contextual data into a unified gestalt that determines behavior. The measurements performed by different neurons appear to interact in such a way that they generate more information jointly than separately. To improve our understanding of how the cortex integrates information, we need a formal language for analyzing how context affects measurements in distributed systems.
As a first step in this direction, we develop methods for analyzing the geometry of measurements performed by functions with overlapping domains. We propose, roughly speaking, to study context-dependence in terms of the geometry of intersecting pre-images. However, since we wish to work with both probabilistic and deterministic systems, things are a bit more complicated.
We sketch the contents of the paper. Section §2 (http://planetmath.org/2stochasticmaps) lays the groundwork by introducing the category $\mathbf{Stoch}$ of stochastic maps. Our goal is to study finite-set-valued functions and conditional probability distributions on finite sets. However, rather than work with sets, functions and conditional distributions directly, we prefer to study stochastic maps (Markov matrices) between function spaces on sets. We therefore introduce the faithful functor taking functions on sets to Markov matrices:

$$\mathcal{V}: \mathbf{Set} \to \mathbf{Stoch}, \qquad X \mapsto \mathcal{V}X,$$

where $\mathcal{V}X$ is the space of functions from $X$ to $\mathbb{R}$. Conditional probability distributions can also be represented using stochastic maps.
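The passage from functions to Markov matrices admits a minimal concrete sketch (the encoding below, with deterministic maps as column-stochastic 0/1 matrices, is one standard convention; the specific sets and function are illustrative):

```python
import numpy as np

def as_stochastic(f, X, Y):
    """Encode a function f: X -> Y as a column-stochastic 0/1 matrix M,
    with M[y, x] = 1 iff f(x) == y (a deterministic Markov matrix)."""
    M = np.zeros((len(Y), len(X)))
    for j, x in enumerate(X):
        M[Y.index(f(x)), j] = 1.0
    return M

X, Y = [0, 1, 2, 3], ['a', 'b']
f = lambda x: 'a' if x < 2 else 'b'
M = as_stochastic(f, X, Y)

# Each column is a (degenerate) probability distribution over readings.
assert np.allclose(M.sum(axis=0), 1.0)
```

A genuinely probabilistic device is encoded the same way, with columns that spread mass over several readings, which is how the single language unifies the deterministic and probabilistic cases.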
Working with linear operators instead of set-valued functions is convenient for two reasons. First, it unifies the deterministic and probabilistic cases in a single language. Second, the dual of a stochastic map provides a symmetric treatment of functions and their corresponding inverse image functions. Recall that the inverse of a function $f: X \to Y$ is $f^{-1}: Y \to 2^X$, which takes values in the powerset of $X$, rather than $X$ itself. Dualizing a stochastic map flips the domain and range of the original map, without introducing any new objects:

$$\big(T: \mathcal{V}X \to \mathcal{V}Y\big) \;\longmapsto\; \big(T^\dagger: \mathcal{V}Y \to \mathcal{V}X\big), \tag{1}$$

see Corollary 2 (http://planetmath.org/2stochasticmaps#Thmthm2).
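A hedged sketch of dualization (here realized as Bayesian inversion under a uniform prior; the matrix and the normalization convention are illustrative assumptions, not the paper's exact construction):

```python
import numpy as np

# Deterministic map f: {0,1,2,3} -> {'a','b'} as a column-stochastic matrix.
M = np.array([[1.0, 1.0, 0.0, 0.0],    # row for reading 'a'
              [0.0, 0.0, 1.0, 1.0]])   # row for reading 'b'

def dual(M):
    """Transpose and renormalize columns: the dual map sends a reading back
    to a distribution concentrated on its pre-image (uniform-prior inversion)."""
    Mt = M.T.copy()
    col = Mt.sum(axis=0)
    return Mt / np.where(col == 0, 1, col)

D = dual(M)
# Reading 'a' maps back to the uniform distribution on its pre-image {0, 1}.
assert np.allclose(D[:, 0], [0.5, 0.5, 0.0, 0.0])
```

Note that no new objects appear: the dual is again a stochastic map, just with domain and range exchanged, mirroring equation (1).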
Section §3 (http://planetmath.org/3distributeddynamicalsystems) introduces distributed dynamical systems. These extend probabilistic cellular automata by replacing cells (space coordinates) with occasions (spacetime coordinates: a cell at a particular time). Inspired by [8, 1], we treat distributed systems as collections of stochastic maps between function spaces, so that processes (stochastic maps) take center stage rather than their outputs. The framework bears a formal resemblance to the categorical approach to quantum mechanics developed in [1]. Although the setting is abstract, it has the advantage of being scalable: using a coarse-graining procedure introduced in [3], we can analyze distributed systems at any spatiotemporal granularity.
Distributed dynamical systems provide a rich class of toy universes. However, since these toy universes do not contain conscious observers, we confront Bell’s problem [6]: “What exactly qualifies some physical [system] to play the role of ‘measurer’?” In our setting, where we do not have to worry about collapsing wave-functions or the distinction between macroscopic and microscopic processes, the solution is simple: every physical system plays the role of measurer. More precisely, we track measurers via the category of subsystems of a distributed dynamical system. Each subsystem is equipped with a mechanism, constructed by gluing together the mechanisms of the occasions it contains and averaging over extrinsic noise.
Measuring devices are typically analyzed by varying their inputs and observing the effect on their outputs. By contrast, this paper fixes the output and varies the device over all its subdevices, obtaining a family of submeasurements parametrized by the subsystems. The internal structure of the measurement performed by the entire system is then studied by comparing submeasurements.
We keep track of submeasurements by observing that they are sections of a suitably defined presheaf. Sheaf theory provides powerful machinery for analyzing relationships between objects and subobjects [11], which we adapt to our setting by introducing the structure presheaf: a contravariant functor from the category of subsystems to the category of measuring devices. Importantly, the structure presheaf is not a sheaf: although the gluing axiom holds, uniqueness fails, see Theorem 4 (http://planetmath.org/3distributeddynamicalsystems#Thmthm4). This is because the restriction operator in the structure presheaf is (essentially) marginalization, and of course there are infinitely many joint distributions that yield the same pair of marginals.
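The failure of unique gluing has a familiar probabilistic face: distinct joint distributions can share the same marginals. A two-bit example makes this concrete:

```python
import numpy as np

# Two joint distributions on {0,1} x {0,1} with identical marginals.
p_indep = np.array([[0.25, 0.25],    # two independent fair coins
                    [0.25, 0.25]])
p_corr = np.array([[0.5, 0.0],       # two perfectly correlated fair coins
                   [0.0, 0.5]])

for p in (p_indep, p_corr):
    assert np.allclose(p.sum(axis=1), [0.5, 0.5])  # same first marginal
    assert np.allclose(p.sum(axis=0), [0.5, 0.5])  # same second marginal
```

Since restriction is marginalization, both joints restrict to the same sections, so the glued section is not unique.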
Section §4 (http://planetmath.org/4measurement) adapts Definition 1 to distributed systems and introduces the simplest quantity associated with a measurement: effective information, which quantifies its precision, see Proposition 5 (http://planetmath.org/4measurement#Thmthm5). Crucially, effective information is context-dependent – it is computed relative to a baseline, which may be completely uninformative (the so-called null system) or provided by a subsystem.
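For the deterministic case with a uniform null baseline, one natural reading of effective information (a simplified sketch in the spirit of [4], not the paper's general definition) is the number of bits a reading rules out:

```python
import math

def effective_info(f, states, reading):
    """Bits gained by observing `reading` relative to a uniform (null) prior:
    log2 |X| - log2 |f^{-1}(reading)|, for a deterministic device f on X."""
    pre = [x for x in states if f(x) == reading]
    return math.log2(len(states)) - math.log2(len(pre))

states = [(a, b) for a in (0, 1) for b in (0, 1)]
xor = lambda s: s[0] ^ s[1]

# The pre-image of either XOR reading has 2 of the 4 states: 1 bit of precision.
print(effective_info(xor, states, 0))
```

A sharper device (smaller pre-images) scores higher; a device whose pre-image is all of the state space scores zero.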
Finally, entanglement, introduced in §5 (http://planetmath.org/5entanglement), quantifies the obstruction (in bits) to decomposing a measurement into independent submeasurements. It turns out, see the discussion after Theorem 10 (http://planetmath.org/5entanglement#Thmthm10), that entanglement quantifies the extent to which a measurement is context-dependent – the extent to which contextual information provided by one submeasurement is useful in understanding another. Theorem 9 (http://planetmath.org/5entanglement#Thmthm9) shows that a measurement is more precise than the sum of its submeasurements only if entanglement is non-zero. Precision is thus inextricably bound to context-dependence and indecomposability. The failure of unique descent is therefore a feature, not a bug, since it provides “elbow room” to build measuring devices that are not products of subdevices.
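The XOR gate is the standard illustration of a measurement that exceeds the sum of its parts (here we use mutual information as an illustrative stand-in for the paper's effective information; the computation is a hedged sketch, not the formal definition of entanglement):

```python
import itertools
import math
from collections import Counter

def mutual_info(pairs):
    """I(A;B) in bits for a list of equally weighted (a, b) samples."""
    n = len(pairs)
    pab = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum((c / n) * math.log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

states = list(itertools.product((0, 1), repeat=2))
y = [a ^ b for a, b in states]

mi_single = mutual_info(list(zip([a for a, _ in states], y)))  # one input bit alone
mi_joint = mutual_info(list(zip(states, y)))                   # both inputs jointly

print(mi_single)  # 0.0: either bit alone says nothing about the XOR output
print(mi_joint)   # 1.0: jointly they determine it exactly
```

Each submeasurement in isolation is worthless, yet together they are maximally precise: the 1-bit surplus is exactly the kind of indecomposability that entanglement is designed to quantify.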
Space constraints prevent us from providing concrete examples; the interested reader can find these in [4, 5, 3]. Our running examples are deterministic set-valued functions, which we use to illustrate the concepts as they are developed.
References
- 1 Samson Abramsky & Bob Coecke (2009): Categorical Quantum Mechanics. In K Engesser, D M Gabbay & D Lehmann, editors: Handbook of Quantum Logic and Quantum Structures: Quantum Logic, Elsevier.
- 2 DJ Amit (1989): Modelling brain function: the world of attractor neural networks. Cambridge University Press.
- 3 David Balduzzi (2011): Detecting emergent processes in cellular automata with excess information. preprint .
- 4 David Balduzzi & Giulio Tononi (2008): Integrated Information in Discrete Dynamical Systems: Motivation and Theoretical Framework. PLoS Comput Biol 4(6), p. e1000091, doi:10.1371/journal.pcbi.1000091.
- 5 David Balduzzi & Giulio Tononi (2009): Qualia: the geometry of integrated information. PLoS Comput Biol 5(8), p. e1000462, doi:10.1371/journal.pcbi.1000462.
- 6 J S Bell (1990): Against ‘Measurement’. Physics World August, pp. 33–40.
- 7 Martin Gardner (1970): Mathematical Games - The Fantastic Combinations of John Conway’s New Solitaire Game, Life. Scientific American 223, pp. 120–123.
- 8 G ’t Hooft (1999): Quantum gravity as a dissipative deterministic system. Classical and Quantum Gravity 16(10).
- 9 JJ Hopfield (1982): Neural networks and physical systems with emergent collective computational abilities. Proc. Nat. Acad. Sci. 79, pp. 2554–2558.
- 10 Stuart Kauffman, Carsten Peterson, Björn Samuelsson & Carl Troein (2003): Random Boolean network models and the yeast transcriptional network. Proc Natl Acad Sci U S A 100(25), pp. 14796–9, doi:10.1073/pnas.2036429100.
- 11 S MacLane & Ieke Moerdijk (1992): Sheaves in Geometry and Logic: A First Introduction to Topos Theory. Springer.
| Title | 1. Introduction |
| --- | --- |
| Canonical name | 1Introduction |
| Date of creation | 2014-04-22 16:27:47 |
| Last modified on | 2014-04-22 16:27:47 |
| Owner | rspuzio (6075) |
| Last modified by | rspuzio (6075) |
| Numerical id | 11 |
| Author | rspuzio (6075) |
| Entry type | Feature |
| Classification | msc 94A17 |
| Classification | msc 60J20 |
| Classification | msc 81P15 |
| Classification | msc 18F20 |