# topological entropy

Let $(X,d)$ be a compact metric space and $f\colon X\to X$ a continuous map. For each $n\geq 0$, we define a new metric $d_{n}$ by

$d_{n}(x,y)=\max\{d(f^{i}(x),f^{i}(y)):0\leq i<n\}.$
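As a concrete illustration, the metric $d_{n}$ can be computed directly for a simple example. The sketch below (names and the choice of example are illustrative, not from the source) uses the doubling map $f(x)=2x \bmod 1$ on the circle $[0,1)$ with the arc-length metric:

```python
# A minimal sketch of the Bowen metric d_n, using the doubling map
# f(x) = 2x mod 1 on the circle [0, 1) as an example system.

def f(x):
    """Doubling map on the unit circle."""
    return (2.0 * x) % 1.0

def d(x, y):
    """Arc-length metric on the circle [0, 1)."""
    t = abs(x - y)
    return min(t, 1.0 - t)

def d_n(x, y, n):
    """Bowen metric: max distance over the first n iterates (i = 0..n-1)."""
    best = 0.0
    for _ in range(n):
        best = max(best, d(x, y))
        x, y = f(x), f(y)
    return best
```

Note how $d_{n}$ magnifies small initial separations: two nearby points drift apart under iteration of $f$, so $d_{n}(x,y)$ grows with $n$ until the orbits are macroscopically far apart.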

Two points are $\epsilon$-close with respect to this metric if their first $n$ iterates are $\epsilon$-close. For $\epsilon>0$ and $n\geq 0$ we say that $F\subset X$ is an $(n,\epsilon)$-separated set if for each pair $x,y$ of distinct points of $F$ we have $d_{n}(x,y)>\epsilon$. Denote by $N(n,\epsilon)$ the maximum cardinality of an $(n,\epsilon)$-separated set (which is finite, because $X$ is compact). Roughly, $N(n,\epsilon)$ represents the number of “distinguishable” orbit segments of length $n$, assuming we cannot distinguish points that are less than $\epsilon$ apart. The topological entropy of $f$ is defined by

$h_{\mathrm{top}}(f)=\lim_{\epsilon\to 0}\left(\limsup_{n\to\infty}\frac{1}{n}\log N(n,\epsilon)\right).$

It is easy to see that this limit always exists, although it may be infinite. Roughly, this number measures the average exponential growth rate of the number of distinguishable orbit segments; speaking loosely again, the higher the topological entropy, the more essentially different orbits the system has.
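The growth rate of $N(n,\epsilon)$ can be observed numerically in a case where the answer is known. The sketch below (an illustrative choice, not part of the source) works on the full 2-shift, whose topological entropy is $\log 2$: points are binary sequences (represented here by finite words), the map is the left shift, and $N(n,\epsilon)$ is estimated by greedily building a maximal $(n,\epsilon)$-separated set. For $\epsilon=1/2$ two points are $(n,\epsilon)$-separated exactly when their first $n$ symbols differ, so the count comes out to $2^{n}$ and $\frac{1}{n}\log N(n,\epsilon)=\log 2$:

```python
# Estimating N(n, eps) for the full 2-shift, whose topological
# entropy is log 2.  Points are binary words; the map is the left shift.
import math
from itertools import product

def d(x, y):
    """Metric on the 2-shift: 2^(-k), where k is the first index at
    which the sequences x and y disagree (0 if they agree throughout)."""
    for k, (a, b) in enumerate(zip(x, y)):
        if a != b:
            return 2.0 ** (-k)
    return 0.0

def shift(x):
    """Left shift: drop the first symbol."""
    return x[1:]

def d_n(x, y, n):
    """Bowen metric for the shift map."""
    best = 0.0
    for _ in range(n):
        best = max(best, d(x, y))
        x, y = shift(x), shift(y)
    return best

def N(n, eps, word_len=12):
    """Greedily build a maximal (n, eps)-separated set among all binary
    words of length word_len, and return its cardinality.  A maximal
    separated set lower-bounds the maximum cardinality N(n, eps)."""
    sep = []
    for w in product((0, 1), repeat=word_len):
        if all(d_n(w, v, n) > eps for v in sep):
            sep.append(w)
    return len(sep)
```

With `eps = 0.5`, the quantity `math.log(N(n, 0.5)) / n` equals $\log 2$ for each $n$, matching the known entropy of the 2-shift. Shrinking `eps` multiplies $N(n,\epsilon)$ by a constant but leaves the exponential growth rate, and hence the entropy, unchanged.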

Topological entropy was first introduced in 1965 by Adler, Konheim and McAndrew, with a different (but equivalent) definition to the one presented here. The definition we give here is due to Bowen and Dinaburg.

## Mathematics Subject Classification

37B40


## Comments

## entropy of ergodic and mixing processes

Very interesting definition. If I grok it correctly, it seems to be saying that ergodic processes will have a low entropy in general, which is surprising and counter-intuitive. The only systems I can think of that would have high entropy would be dissipative systems, where large portions of the space ''X'' are abandoned on iteration and never visited again. So ... is this entropy in fact a measure of dissipation? Can any other intuitive interpretations be added?

--linas

## Re: entropy of ergodic and mixing processes

Never mind. I've (once again) got the definition exactly upside-down.

## Re: entropy of ergodic and mixing processes

So ... is there any way to simply erase/retract/edit/modify this line of posts? I clearly misread the definition (again), so my earlier comment is nonsense.