3. Distributed dynamical systems
Probabilistic cellular automata provide useful toy models of a
wide range of physical and biological systems. A cellular
automaton consists of a collection
of cells, each equipped with
neighbors. Two important important examples are
Example 2 (Conway’s game of life).
The cellular automaton is a grid of deterministic cells with outputs $\{0,1\}$. A cell outputs $1$ at time $t$ iff: (i) exactly three of its neighbors outputted 1s at time $t-1$, or (ii) it and exactly two of its neighbors outputted 1s at $t-1$. Remarkably, a sufficiently large game of life grid can implement any deterministic computation [3].
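The update rule above can be sketched in a few lines. A minimal illustration; the NumPy encoding and toroidal boundary are choices not made in the text:

```python
import numpy as np

def life_step(grid):
    """One synchronous update of Conway's game of life: a cell is 1
    iff exactly three neighbors were 1, or it and exactly two
    neighbors were 1, at the previous step."""
    # Count the eight neighbors of every cell (toroidal boundary).
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))
    return ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

# A "blinker" oscillates with period 2.
g = np.zeros((5, 5), dtype=int)
g[2, 1:4] = 1
assert np.array_equal(life_step(life_step(g)), g)
```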
Example 3 (Hopfield networks).
These are probabilistic cellular automata [4, 1], again with outputs $\{0,1\}$. Cell $j$ fires with probability proportional to $\exp\big(\tfrac{1}{T}\sum_k J_{jk}\,s_k\big)$, where $s_k$ is the output of cell $k$. Temperature $T$ controls network stochasticity. Attractors $\{\xi^\mu\}$ are embedded into a network by setting the connectivity matrix as $J_{jk} = \sum_\mu \xi^\mu_j \xi^\mu_k$.
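A hedged sketch of these dynamics, using $\pm 1$ spins and single-cell (Glauber) updates for convenience rather than the $\{0,1\}$ outputs of the text; function names and the temperature convention are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def hebb(patterns):
    """Connectivity J_jk = sum_mu xi_j xi_k (Hebb rule), zero diagonal."""
    J = sum(np.outer(xi, xi) for xi in patterns).astype(float)
    np.fill_diagonal(J, 0.0)
    return J

def glauber_step(s, J, T):
    """Update one random cell: fire (+1) with probability given by a
    logistic function of field/T, i.e. proportional to exp(field/T)."""
    j = rng.integers(len(s))
    h = J[j] @ s
    p_fire = 1.0 / (1.0 + np.exp(-2.0 * h / T))
    s = s.copy()
    s[j] = 1 if rng.random() < p_fire else -1
    return s

xi = np.array([1, -1, 1, -1, 1, -1, 1, -1])
J = hebb([xi])
s = xi.copy()
for _ in range(100):
    s = glauber_step(s, J, T=0.1)
# At low temperature the stored pattern is a (near-)fixed point.
assert (s == xi).all()
```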
It is useful to take a finer perspective on cellular automata by decomposing them into spacetime coordinates, or occasions [2]. An occasion $v = (c, t)$ is a cell $c$ at a time point $t$. Two occasions $v_1$ and $v_2$ are linked $v_1 \to v_2$ if there is a connection from $v_1$'s cell to $v_2$'s (because they are neighbors or the same cell) and their time coordinates are $t$ and $t+1$ respectively for some $t$, so occasions form a directed graph. More generally:
Definition 4.
A distributed dynamical system $D$ consists of the following data:

1. Directed graph. A graph $G = (V_D, E_D)$ with a finite set $V_D$ of vertices, or occasions, and edges $E_D \subset V_D \times V_D$.
2. Alphabets. Each vertex $v$ has a finite alphabet $A_v$ of outputs and a finite alphabet $A^{src(v)} := \prod_{u \in src(v)} A_u$ of inputs, where $src(v) = \{u \mid (u,v) \in E_D\}$.
3. Mechanisms. Each vertex $v$ is equipped with a stochastic map $m_v \colon A^{src(v)} \to A_v$.
![](dds.png)
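Definition 4 translates directly into a data structure. The following sketch is one possible encoding (all names hypothetical), with each mechanism stored as a conditional-probability table mapping joint inputs to distributions over outputs:

```python
from dataclasses import dataclass

@dataclass
class DDS:
    """A distributed dynamical system (illustrative encoding).

    vertices:  occasions, e.g. (cell, time) pairs
    edges:     set of (source, target) pairs
    alphabet:  output alphabet A_v for each vertex
    mechanism: for each vertex, a dict mapping each joint input
               (tuple over sorted sources) to a distribution over A_v
    """
    vertices: list
    edges: set
    alphabet: dict
    mechanism: dict

    def src(self, v):
        return sorted(u for (u, w) in self.edges if w == v)

D = DDS(
    vertices=["a", "b"],
    edges={("a", "b")},
    alphabet={"a": [0, 1], "b": [0, 1]},
    mechanism={
        "a": {(): {0: 0.5, 1: 0.5}},            # noise source
        "b": {(0,): {0: 1.0}, (1,): {1: 1.0}},  # copies its input
    },
)
assert D.src("b") == ["a"]
# Every row of every mechanism is a probability distribution.
assert all(abs(sum(row.values()) - 1) < 1e-9
           for m in D.mechanism.values() for row in m.values())
```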
Taking any cellular automaton over a finite time interval and initializing the mechanisms at the first time point with fixed values (initial conditions) or probability distributions (noise sources) yields a distributed dynamical system, see Fig. 1. Each cell of the original automaton corresponds to a series of occasions in the distributed dynamical system, one per time step.
Cells with memory – i.e. whose outputs depend on their neighbors’ outputs over multiple time steps – receive inputs from occasions more than one time step in the past. If a cell’s mechanism changes (learns) over time, then different mechanisms are assigned to the cell’s occasions at different time points.
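The unrolling of a cellular automaton into occasions can be made concrete. A small sketch (names hypothetical) that builds the directed graph of occasions for a one-dimensional automaton whose cells remember one time step:

```python
def unroll(cells, neighbors, steps):
    """Unroll a cellular automaton into the directed graph of
    occasions (cell, t): each occasion at time t+1 receives edges
    from its cell's neighbors, and from itself, at time t."""
    vertices = [(c, t) for t in range(steps + 1) for c in cells]
    edges = {((u, t), (c, t + 1))
             for t in range(steps)
             for c in cells
             for u in neighbors[c] + [c]}
    return vertices, edges

# Three cells on a line, two time steps.
cells = [0, 1, 2]
nbrs = {0: [1], 1: [0, 2], 2: [1]}
V, E = unroll(cells, nbrs, steps=2)
assert len(V) == 9            # 3 cells x 3 time points
assert ((1, 0), (0, 1)) in E  # neighbor link
assert ((0, 0), (0, 1)) in E  # self link (one step of memory)
```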
The sections below investigate the compositional structure of measurements: how they are built out of submeasurements. Technology for tracking subsystems and submeasurements is therefore necessary. We introduce two closely related categories:
Definition 5.
The category of subsystems of $D$ is a Boolean lattice with objects given by sets $C \subseteq V_D \times V_D$ of ordered pairs of vertices and arrows given by inclusions $C_1 \subseteq C_2$. The initial and terminal objects are $\emptyset$ and $V_D \times V_D$.
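The Boolean-lattice structure is easy to exhibit concretely. A sketch with subsystems encoded as frozen sets of ordered pairs (an illustrative choice):

```python
from itertools import product

# A subsystem is any set of ordered pairs of vertices; the terminal
# object is V x V and the arrows of the lattice are inclusions.
V = ["a", "b", "c"]
top = frozenset(product(V, V))
bottom = frozenset()

C1 = frozenset({("a", "b")})
C2 = frozenset({("a", "b"), ("b", "c")})

assert bottom <= C1 <= C2 <= top  # inclusion arrows compose
# Boolean lattice: meet, join, complement stay inside the lattice.
assert (C1 & C2) == C1 and (C1 | C2) == C2
assert (top - C1) | C1 == top
assert len(top) == 9
```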
Remark 2.
Subsystems are defined as sets of ordered pairs of vertices, rather than as subgraphs of the directed graph of $D$. Pairs of occasions that are not connected by edges are ineffective; they do not contribute to the information-processing performed by the system. We include them in the formalism precisely to make their lack of contribution explicit, see Remark 3 (http://planetmath.org/4measurement#Thmrem3).
Let $src(C) = \{v \mid (v,w) \in C\}$ and similarly for $trg(C)$. Set the input alphabet of $C$ as the product of the output alphabets of its source occasions, $A^{C} := \prod_{v \in src(C)} A_v$, and similarly the output alphabet of $C$ as the product of the output alphabets of its target occasions, $A_{C} := \prod_{v \in trg(C)} A_v$.
Definition 6.
The category of measuring devices on $D$ has objects given by the stochastic maps $A_C \to A^{C}$ for subsystems $C$. For $C_1 \subseteq C_2$ define the arrow taking a measuring device on $C_2$ to one on $C_1$ by composing with the projections onto the smaller alphabets, where the projections are shorthands as in Example 1 (http://planetmath.org/2stochasticmaps#Thmeg1).
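The arrows just defined are marginalizations rather than true restrictions. A minimal sketch of such a marginalization, with distributions encoded as dictionaries over outcome tuples (an assumed encoding):

```python
def marginalize(p_joint, keep):
    """Push a distribution on tuples down to the coordinates in
    `keep` -- the 'restriction' arrows of the category of measuring
    devices act on each row of a device in this way."""
    out = {}
    for outcome, pr in p_joint.items():
        key = tuple(outcome[i] for i in keep)
        out[key] = out.get(key, 0.0) + pr
    return out

# Joint distribution on (x, y) with x, y in {0, 1}.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}
px = marginalize(p, keep=[0])
assert abs(px[(0,)] - 0.3) < 1e-9 and abs(px[(1,)] - 0.7) < 1e-9
```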
The reason for naming this the category of measuring devices will become clear in §4 (http://planetmath.org/4measurement) below. The two categories are connected by a contravariant functor $\mathcal{F}$:
Theorem 4 (structure presheaf).
The contravariant functor $\mathcal{F}$, taking each subsystem to the measuring devices on it and each inclusion to the corresponding marginalization arrow, is a presheaf satisfying the gluing axiom, but without unique descent.
Proof: Functor $\mathcal{F}$ is trivially a presheaf since it is contravariant. It is an interesting presheaf because the gluing axiom holds.
For gluing we need to show that, for any collection of subsystems $\{C_i\}$ and sections $s_i \in \mathcal{F}(C_i)$ such that $s_i|_{C_i \cap C_j} = s_j|_{C_i \cap C_j}$ for all $i, j$, there exists a section $s \in \mathcal{F}(\bigcup_i C_i)$ such that $s|_{C_i} = s_i$ for all $i$. This reduces to finding a conditional distribution that causes the diagram
![](information-theoretic-distributed-measurement-3.1.png)
in the category of measuring devices to commute. The vertices are conditional distributions and the arrows are marginalizations, so rewrite the diagram as
![](information-theoretic-distributed-measurement-3.2.png)
where the horizontal arrow marginalizes out the variables not in the smaller subsystem, and similarly for the vertical arrow. It is easy to see that the product of the given sections, suitably normalized, satisfies the requirement.
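In the simplest case, with two disjoint subsystems, the compatibility condition is vacuous and the glued section is literally the product distribution. A small sketch checking that the product does marginalize back to the pieces:

```python
def glue(px, py):
    """Glue two sections over disjoint subsystems by taking the
    product distribution, as in the proof of the gluing axiom."""
    return {(x, y): px[x] * py[y] for x in px for y in py}

px = {0: 0.25, 1: 0.75}
py = {0: 0.6, 1: 0.4}
p = glue(px, py)
# The glued section restricts (marginalizes) back to the pieces.
assert all(abs(sum(p[(x, y)] for y in py) - px[x]) < 1e-9 for x in px)
assert all(abs(sum(p[(x, y)] for x in px) - py[y]) < 1e-9 for y in py)
```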
For $\mathcal{F}$ to be a sheaf it would also have to satisfy unique descent: the section satisfying the gluing axiom must not only exist for any collection $\{s_i\}$ with compatible restrictions but must also be unique. Descent in $\mathcal{F}$ is not unique because there are many distributions satisfying the requirement above: strictly speaking, the restriction $s|_{C_i}$ is a marginalization operator rather than a restriction. For example, there are many distributions $p(x, y)$ that marginalize to give $p(x)$ and $p(y)$ besides the product distribution $p(x)\,p(y)$.
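Non-uniqueness of descent can be seen in the smallest possible example: a perfectly correlated distribution and the product distribution share the same marginals.

```python
# Uniform marginals on {0,1} x {0,1}: the product distribution and a
# perfectly correlated one marginalize to the same sections, so the
# glued section is not unique.
product_p = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
correlated = {(0, 0): 0.5, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.5}

def marg_x(p):
    return {x: sum(pr for (a, _), pr in p.items() if a == x) for x in (0, 1)}

def marg_y(p):
    return {y: sum(pr for (_, b), pr in p.items() if b == y) for y in (0, 1)}

assert marg_x(product_p) == marg_x(correlated) == {0: 0.5, 1: 0.5}
assert marg_y(product_p) == marg_y(correlated) == {0: 0.5, 1: 0.5}
assert product_p != correlated
```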
The structure presheaf depends only on the graph structure and alphabets of $D$; mechanisms play no role. We now construct a family of sections of $\mathcal{F}$ using the mechanisms of $D$'s occasions. Specifically, given a subsystem $C$, we show how to glue its occasions' mechanisms together to form a joint mechanism $m_C$. The mechanism of the entire system is recovered as the special case $C = V_D \times V_D$.
In general, subsystem $C$ is not isolated: it receives inputs along edges contained in $E_D$ but not in $C$. Inputs along these edges cannot be assigned a fixed value, since in general there is no preferred element of the corresponding alphabet. They also cannot be ignored, since each mechanism $m_v$ is defined as receiving inputs from all of its sources. Nevertheless, the mechanism of $C$ should depend on $C$ alone. We therefore treat edges not in $C$ as sources of extrinsic noise by marginalizing with respect to the uniform distribution, as in Corollary 3 (http://planetmath.org/2stochasticmaps#Thmthm3).
For each vertex $v$ let $src_C(v) = \{u \mid (u,v) \in C\}$. We then have a projection $\pi \colon A^{src(v)} \to A^{src_C(v)}$. Define

$$m^C_v(a \mid x) \;:=\; \frac{1}{\big|A^{src(v) \setminus src_C(v)}\big|} \sum_{y \in A^{src(v) \setminus src_C(v)}} m_v(a \mid x, y), \qquad (1)$$

the mechanism $m_v$ averaged over its inputs outside $C$ with respect to the uniform distribution.
It follows immediately that $C$ is itself a distributed dynamical system, defined by its graph, whose alphabets are inherited from $D$ and whose mechanisms are constructed by marginalizing as in (1).
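Equation (1) averages a mechanism over its external inputs with respect to the uniform distribution. A sketch of this averaging, with mechanisms encoded as conditional-probability tables over full input tuples (an assumed encoding):

```python
def marginalize_mechanism(mech, inputs, internal):
    """Average a mechanism over the inputs not retained in the
    subsystem, with respect to the uniform distribution (Eq. (1)).

    mech:     dict mapping each full input tuple to a dict over outputs
    inputs:   ordered list of source names
    internal: the sources retained in the subsystem C
    """
    keep = [i for i, u in enumerate(inputs) if u in internal]
    counts, out = {}, {}
    for full, dist in mech.items():
        key = tuple(full[i] for i in keep)
        row = out.setdefault(key, {})
        counts[key] = counts.get(key, 0) + 1
        for a, pr in dist.items():
            row[a] = row.get(a, 0.0) + pr
    # Divide by the number of external input settings averaged over.
    return {k: {a: pr / counts[k] for a, pr in row.items()}
            for k, row in out.items()}

# XOR of two inputs; keep only the first, treat the second as noise.
xor = {(x, y): {x ^ y: 1.0} for x in (0, 1) for y in (0, 1)}
m = marginalize_mechanism(xor, inputs=["u", "w"], internal={"u"})
# With w uniform, the output is uniform whatever u is.
assert m[(0,)] == {0: 0.5, 1: 0.5} and m[(1,)] == {1: 0.5, 0: 0.5}
```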
Next, we tensor the mechanisms of individual occasions and glue them together using the diagonal map $\Delta \colon A^{C} \to \prod_{v \in trg(C)} A^{src_C(v)}$. The diagonal map used here (which is surjective in the sense that all rows contain non-zero entries) generalizes $\Delta \colon X \to X \times X$ and removes redundancies in $\prod_{v \in trg(C)} A^{src_C(v)}$, which may, for example, include the same source alphabets many times in different factors.
Let mechanism $m_C$ be

$$m_C \;:=\; \Big( \bigotimes_{v \in trg(C)} m^C_v \Big) \circ \Delta. \qquad (2)$$
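Equation (2) tensors the marginalized mechanisms of $C$'s target occasions and precomposes with the diagonal map, which copies each shared input to every target that uses it. A sketch with mechanisms encoded as conditional-probability tables (names hypothetical):

```python
from itertools import product

def joint_mechanism(C, alphabet, mech):
    """Glue the per-occasion mechanisms of subsystem C into a joint
    mechanism A^C -> A_C: the diagonal copies the shared inputs to
    each target, whose mechanisms are then tensored (multiplied)."""
    sources = sorted({u for (u, _) in C})
    targets = sorted({w for (_, w) in C})
    src_C = {w: sorted(u for (u, x) in C if x == w) for w in targets}
    joint = {}
    for x in product(*(alphabet[u] for u in sources)):
        val = dict(zip(sources, x))       # diagonal: one copy per use
        row = {}
        for ys in product(*(alphabet[w] for w in targets)):
            pr = 1.0
            for w, y in zip(targets, ys):  # tensor over targets
                inp = tuple(val[u] for u in src_C[w])
                pr *= mech[w].get(inp, {}).get(y, 0.0)
            row[ys] = pr
        joint[x] = row
    return joint

# Occasion u feeds both v (identity) and w (negation).
C = {("u", "v"), ("u", "w")}
alphabet = {"u": [0, 1], "v": [0, 1], "w": [0, 1]}
mech = {"v": {(0,): {0: 1.0}, (1,): {1: 1.0}},
        "w": {(0,): {1: 1.0}, (1,): {0: 1.0}}}
m = joint_mechanism(C, alphabet, mech)
assert m[(0,)][(0, 1)] == 1.0 and m[(1,)][(1, 0)] == 1.0
```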
The dual of $m_C$ is the stochastic map

$$m_C^{\natural} \colon A_C \longrightarrow A^{C}, \qquad (3)$$

computed from $m_C$ via Bayes' rule with respect to the uniform distribution.
Finally, we find that we have constructed a family of sections of $\mathcal{F}$: the duals $m_C^{\natural}$ of the glued mechanisms. The construction used to glue together the mechanism of the entire system can also be used to construct the mechanism of any subsystem, which provides a window – the quale – into the compositional structure of distributed processes.
References
- 1 DJ Amit (1989): Modelling brain function: the world of attractor neural networks. Cambridge University Press.
- 2 David Balduzzi (2011): Detecting emergent processes in cellular automata with excess information. Preprint.
- 3 ER Berlekamp, JC Conway & RK Guy (1982): Winning Ways for your Mathematical Plays. 2, Academic Press.
- 4 JJ Hopfield (1982): Neural networks and physical systems with emergent collective computational abilities. Proc. Nat. Acad. Sci. 79, pp. 2554–2558.
Title: 3. Distributed dynamical systems
Canonical name: 3DistributedDynamicalSystems
Date of creation: 2014-04-30 18:41:46
Last modified on: 2014-04-30 18:41:46
Owner: rspuzio (6075)
Last modified by: rspuzio (6075)
Numerical id: 15
Author: rspuzio (6075)
Entry type: Feature
Classification: msc 60J20
Classification: msc 81P15
Classification: msc 18F20