Association schemes were introduced by statisticians in the 1950’s to analyze designs of statistical experiments. Today, association schemes are useful not only in experimental designs, but in other areas of mathematics such as combinatorics (coding theory) and group theory (permutation groups).
There are several equivalent ways to define an association scheme. Three useful ones are illustrated here.
Let $X$ be a non-empty set with $n$ elements, and let $s$ be a positive integer.
Definition 1. An association scheme on $X$ is a partition of $X \times X$ into $s+1$ non-empty sets $C_0, C_1, \ldots, C_s$, called associate classes, such that

$C_0$ is the diagonal relation $\{(x,x) \mid x \in X\}$, and each $C_i$ is symmetric: $(x,y) \in C_i$ iff $(y,x) \in C_i$;

for $i, j, k \in \{0, 1, \ldots, s\}$, there is an integer $p_{ij}^k$ such that, for any $(x,y) \in C_k$,

$$|\{z \in X \mid (x,z) \in C_i \text{ and } (z,y) \in C_j\}| = p_{ij}^k.$$
If we write $C_i \circ C_j$ for the set $\{(x,y) \mid (x,z) \in C_i \text{ and } (z,y) \in C_j \text{ for some } z \in X\}$, then the second condition says that for any $(x,y) \in C_k$, the value $|\{z \in X \mid (x,z) \in C_i \text{ and } (z,y) \in C_j\}|$ is a constant, depending only on $i$, $j$, and $k$, and not on the particular elements $(x,y)$ of $C_k$. This implies that, for any $i, j$, the relation $C_i \circ C_j$ is a union of (some of) the $C_k$'s.
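The intersection-number condition can be checked by brute force on a small example. The sketch below is an assumption of this entry's editor, not from the text: it uses the 5-cycle scheme on $X = \{0,1,2,3,4\}$, whose classes are given by cyclic distance, and verifies that each count is constant on each class.

```python
from itertools import product

# Hypothetical example: the 5-cycle scheme on X = Z/5.
# Class of a pair (x, y) is the cyclic distance 0, 1 or 2.
n = 5
def cls(x, y):
    d = (x - y) % n
    return min(d, n - d)

# C_0 (diagonal), C_1, C_2 partition X x X.
classes = {k: {(x, y) for x in range(n) for y in range(n) if cls(x, y) == k}
           for k in range(3)}

# Second condition of Definition 1: for each i, j, k the count of
# intermediate points z is the same for every pair (x, y) in C_k.
p = {}
for i, j, k in product(range(3), repeat=3):
    counts = {sum(1 for z in range(n)
                  if (x, z) in classes[i] and (z, y) in classes[j])
              for (x, y) in classes[k]}
    assert len(counts) == 1      # constant over the class C_k
    p[i, j, k] = counts.pop()

print(p[1, 1, 2])  # -> 1
```

The dictionary `p` collects exactly the intersection numbers $p_{ij}^k$ of this small scheme.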
The definition above can be restated in graph-theoretic terminology. First, think of $X$ as a set of vertices, and of two-element subsets of $X$ as edges. The complete graph on $X$ is just the set of all two-element subsets of $X$. We may color the edges of this graph. Say there are $s$ colors, labeled $1$ through $s$. For each color $i$, let $C_i$ be the set of edges with color $i$. Then each $C_i$ is just a symmetric relation on $X$, and all the $C_i$'s, together with the diagonal relation, partition the set $X \times X$. This is basically the first condition of the definition above. In this regard, we can redefine an association scheme graph-theoretically, as follows:
Definition 2. An association scheme on $X$ is an edge coloring of the complete graph on $X$ with $s$ colors, each color being used at least once, such that

for any $i, j, k \in \{1, \ldots, s\}$, there is an integer $p_{ij}^k$ such that if $\{x,y\} \in C_k$ (the edge $\{x,y\}$ has color $k$), then

$$|\{z \in X \mid \{x,z\} \in C_i \text{ and } \{z,y\} \in C_j\}| = p_{ij}^k.$$
In words, the definition says that, for any colors $i, j, k$, and any given edge $e$ with color $k$, the number of triangles (a triangle in a graph is a cycle consisting of three edges) with $e$ as an edge, and two other edges with colors $i$ and $j$ respectively, is $p_{ij}^k$.
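The triangle-counting condition can be sketched the same way. Again this is an assumed toy example, not from the text: the edges of the complete graph on five vertices are colored $1$ or $2$ by cyclic distance.

```python
from itertools import combinations

# Hypothetical example: color the edges of K_5 on {0,...,4} by cyclic distance.
n = 5
def color(e):
    x, y = tuple(e)
    d = (x - y) % n
    return min(d, n - d)          # colors 1 and 2 (never 0 on an edge)

edges = [frozenset(e) for e in combinations(range(n), 2)]

# For the edge {x, y}, count triangles whose other two edges
# have colors i and j respectively.
def triangles(x, y, i, j):
    return sum(1 for z in range(n)
               if z not in (x, y)
               and color(frozenset({x, z})) == i
               and color(frozenset({z, y})) == j)

# Definition 2: the count depends only on i, j and the color k of the edge.
for i, j, k in [(1, 1, 1), (1, 1, 2), (1, 2, 1), (2, 2, 1)]:
    counts = {triangles(*tuple(e), i, j) for e in edges if color(e) == k}
    assert len(counts) == 1

print(triangles(0, 1, 2, 2))  # -> 1
```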
The first definition can also be viewed in terms of matrices, and adjacency matrices more specifically. Given a finite set $X$ with $n$ elements, a binary relation $R$ on $X$ naturally corresponds to an $n \times n$ matrix $A(R)$, called the adjacency matrix of $R$. Entry $(i,j)$ of $A(R)$ is $1$ if the $i$-th element and the $j$-th element of $X$ are related by $R$, and $0$ otherwise. If $R$ is reflexive, then $A(R)$ has all $1$'s in its diagonal, and if $R$ is symmetric, then so is $A(R)$. Also, it is easy to see that entry $(i,j)$ of the product $A(R)A(S)$ counts the elements $z$ such that the $i$-th element is related to $z$ by $R$ and $z$ is related to the $j$-th element by $S$; in particular, this entry is non-zero exactly when the corresponding entry of the adjacency matrix of the composition $R \circ S$ is $1$. The comment in the paragraph after the first definition is then the same as saying that the product $A(C_i)A(C_j)$ is a linear combination of the adjacency matrices $A(C_0), A(C_1), \ldots, A(C_s)$. This gives us the third definition below:
Definition 3. An association scheme is a finite set of non-zero $n \times n$ matrices $A_0, A_1, \ldots, A_s$ whose entries are $0$'s and $1$'s, such that

each $A_i$ is symmetric, with $A_0 = I$, the identity matrix,

$A_0 + A_1 + \cdots + A_s = J$, the matrix whose entries are all $1$'s, and

for any $i, j \in \{0, 1, \ldots, s\}$, the product $A_iA_j$ is a linear combination of $A_0, A_1, \ldots, A_s$.
By the definitions of the matrices $A_i$ and the second condition, for every pair $(x,y) \in X \times X$, exactly one of the matrices $A_0, \ldots, A_s$ has a $1$ in cell $(x,y)$, and all the others have a $0$ in the corresponding cell. As a result, the matrices $A_0, \ldots, A_s$ are linearly independent.
Also, in view of the discussion above, it is easy to see that

$$A_iA_j = \sum_{k=0}^{s} p_{ij}^k A_k.$$
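This identity can be verified directly on a small scheme. The sketch below (plain Python, again using the assumed 5-cycle scheme with classes given by cyclic distance, not an example from the text) checks one instance of it.

```python
# Hypothetical example: adjacency matrices (lists of lists) of the
# 5-cycle scheme on Z/5, classes indexed by cyclic distance 0, 1, 2.
n = 5
def cls(x, y):
    d = (x - y) % n
    return min(d, n - d)

A = [[[int(cls(x, y) == k) for y in range(n)] for x in range(n)]
     for k in range(3)]

def matmul(M, N):
    return [[sum(M[x][z] * N[z][y] for z in range(n)) for y in range(n)]
            for x in range(n)]

def lincomb(c0, c1, c2):
    return [[c0 * A[0][x][y] + c1 * A[1][x][y] + c2 * A[2][x][y]
             for y in range(n)] for x in range(n)]

# A_1 A_1 = 2*A_0 + 0*A_1 + 1*A_2, i.e. for this scheme
# p_{11}^0 = 2, p_{11}^1 = 0, p_{11}^2 = 1.
assert matmul(A[1], A[1]) == lincomb(2, 0, 1)
```

The coefficients appearing in the linear combination are precisely the intersection numbers $p_{11}^k$ of the scheme.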
Some terminology. The number $s+1$ of associate classes is called the rank of the association scheme. Given any $i \in \{0, 1, \ldots, s\}$, an element $y \in X$ is said to be an $i$-associate of $x \in X$ if $(x,y) \in C_i$. For each $x \in X$, define

$$C_i(x) := \{y \in X \mid (x,y) \in C_i\}.$$
So $C_i(x)$ is the set of all $i$-associates of $x$. Then $|C_i(x)| = p_{ii}^0$ for every $x \in X$; in particular, the number of $i$-associates of $x$ does not depend on $x$.
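A quick sketch (again on the assumed 5-cycle scheme on $\mathbb{Z}/5$, a toy example not from the text) confirming that $|C_i(x)|$ is independent of $x$:

```python
# Hypothetical example: i-associates in the 5-cycle scheme on Z/5.
n = 5
def cls(x, y):
    d = (x - y) % n
    return min(d, n - d)

# C_i(x): the set of i-associates of x.
def associates(i, x):
    return {y for y in range(n) if cls(x, y) == i}

# |C_i(x)| is the same for every x in X.
for i in range(3):
    assert len({len(associates(i, x)) for x in range(n)}) == 1

print([len(associates(i, 0)) for i in range(3)])  # -> [1, 2, 2]
```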
Some basic properties of the intersection numbers $p_{ij}^k$:

$p_{ij}^k = p_{ji}^k$ for all $i, j, k \in \{0, 1, \ldots, s\}$.
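Such identities among the intersection numbers can be confirmed by brute force; the sketch below checks the symmetry $p_{ij}^k = p_{ji}^k$ on the assumed 5-cycle scheme (a toy example, not from the text).

```python
from itertools import product

# Hypothetical example: the 5-cycle scheme on Z/5.
n = 5
def cls(x, y):
    d = (x - y) % n
    return min(d, n - d)

def p(i, j, k):
    # The count is the same for every pair in C_k, so any one pair will do.
    x, y = next((x, y) for x in range(n) for y in range(n) if cls(x, y) == k)
    return sum(1 for z in range(n) if cls(x, z) == i and cls(z, y) == j)

# p_{ij}^k = p_{ji}^k for all i, j, k.
assert all(p(i, j, k) == p(j, i, k)
           for i, j, k in product(range(3), repeat=3))
```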
Date of creation: 2013-03-22 19:15:03
Last modified on: 2013-03-22 19:15:03
Last modified by: CWoo (3771)