# proof of PTAH inequality

To prove the PTAH inequality we need two lemmas. The first is quite general and does not depend on the specific $P$ and $Q$ defined for the PTAH inequality.

The setup for the first lemma is as follows:

We still have a measure space $X$ with measure $m$, a subset $\Lambda\subseteq\mathbb{R}^{n}$, and a function $p:X\times\Lambda\to\mathbb{R}$ that is positive and integrable in $x$ for every $\lambda\in\Lambda$. We also assume that $p(x,\lambda)\log p(x,\lambda^{\prime})$ is integrable in $x$ for each pair $\lambda,\lambda^{\prime}\in\Lambda$.

Define $P:\Lambda\to\mathbb{R}$ by

 $P(\lambda)=\int p(x,\lambda)dm(x)$

and $Q:\Lambda\times\Lambda\to\mathbb{R}$

by

 $Q(\lambda,\lambda^{\prime})=\int p(x,\lambda)\log p(x,\lambda^{\prime})dm(x).$

Lemma 1. (1) $P(\lambda)\log\frac{P(\lambda^{\prime})}{P(\lambda)}\geq Q(\lambda,\lambda^{\prime})-Q(\lambda,\lambda)$.
(2) If $Q(\lambda,\lambda^{\prime})\geq Q(\lambda,\lambda)$, then $P(\lambda^{\prime})\geq P(\lambda)$. If equality holds, then $p(x,\lambda)=p(x,\lambda^{\prime})$ a.e. $[m]$.

Proof. It is clear that (2) follows from (1), so we only need to prove (1). Define a measure $d\nu(x)=\frac{p(x,\lambda)\,dm(x)}{P(\lambda)}$. Then

 $\int d\nu(x)=1$

so we can use Jensen’s inequality for the logarithm.

 $\begin{aligned} Q(\lambda,\lambda^{\prime})-Q(\lambda,\lambda) &= \int p(x,\lambda)[\log p(x,\lambda^{\prime})-\log p(x,\lambda)]\,dm(x)\\ &= \int p(x,\lambda)\log\frac{p(x,\lambda^{\prime})}{p(x,\lambda)}\,dm(x)\\ &= P(\lambda)\int\log\frac{p(x,\lambda^{\prime})}{p(x,\lambda)}\,d\nu(x)\\ &\leq P(\lambda)\log\int\frac{p(x,\lambda^{\prime})}{p(x,\lambda)}\,d\nu(x)\\ &= P(\lambda)\log\int\frac{p(x,\lambda^{\prime})}{P(\lambda)}\,dm(x)\\ &= P(\lambda)\log\frac{P(\lambda^{\prime})}{P(\lambda)}. \end{aligned}$
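Lemma 1 is easy to test numerically. The sketch below uses a hypothetical three-point measure space and an arbitrary positive kernel $p$, both chosen purely for illustration, and verifies inequality (1) of the lemma on a small grid of parameter pairs.

```python
import math

# Hypothetical finite measure space X = {0, 1, 2} with weights m (illustrative)
m = [0.5, 1.0, 2.0]

# An arbitrary positive kernel p(x, lam); the specific form is illustrative only
def p(x, lam):
    return math.exp(-(x - lam) ** 2) + 0.1

def P(lam):
    # P(lam) = integral of p(x, lam) dm(x), here a finite sum
    return sum(m[x] * p(x, lam) for x in range(3))

def Q(lam, lam2):
    # Q(lam, lam2) = integral of p(x, lam) * log p(x, lam2) dm(x)
    return sum(m[x] * p(x, lam) * math.log(p(x, lam2)) for x in range(3))

# Lemma 1(1): P(a) * log(P(b)/P(a)) >= Q(a, b) - Q(a, a)
for a in [0.0, 0.7, 1.5]:
    for b in [0.2, 1.0, 2.5]:
        lhs = P(a) * math.log(P(b) / P(a))
        rhs = Q(a, b) - Q(a, a)
        assert lhs >= rhs - 1e-12
```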

The next lemma uses the notation of the parent entry.

Lemma 2. Suppose $r_{i}\geq 0$ for $i=1,\ldots,n$ and $\theta=(\theta_{1},\ldots,\theta_{n})\in\sigma$. If $\sum_{j}r_{j}>0$ then

 $\prod_{i=1}^{n}{\theta_{i}}^{r_{i}}\leq\prod_{i}\Big(\frac{r_{i}}{\sum_{j}r_{j}}\Big)^{r_{i}}.$

Proof. Let $\lambda=(\lambda_{i})\in\sigma$. By the concavity of the $\log$ function we have

 $\sum_{i}\lambda_{i}\log x_{i}\leq\log\sum_{i}\lambda_{i}x_{i}$

where $x_{i}>0$ for $i=1,\ldots,n$, so that

 $\prod_{i}{x_{i}}^{\lambda_{i}}\leq\sum_{i}\lambda_{i}x_{i}=\prod_{i}\Big(\sum_{j}\lambda_{j}x_{j}\Big)^{\lambda_{i}}.$ (1)
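Inequality (1) is the weighted AM–GM inequality; a quick numerical check, with illustrative weights on the simplex and positive values $x_{i}$:

```python
import math

# Illustrative weights lambda in the simplex and positive x_i
lam = [0.2, 0.3, 0.5]
xs = [1.0, 4.0, 0.25]

geo = math.prod(x ** l for x, l in zip(xs, lam))  # prod x_i^{lambda_i}
ari = sum(l * x for l, x in zip(lam, xs))         # sum lambda_i x_i
assert geo <= ari + 1e-12                         # weighted AM-GM
```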

It is enough to prove the lemma in the case where $r_{i}>0$ for all $i$, since the factors with $r_{i}=0$ equal $1$ on both sides and do not change $\sum_{j}r_{j}$. We can also assume $\theta_{i}>0$ for all $i$; otherwise the left-hand side is $0$ and the result is trivial.

Let $\rho=\sum_{j}r_{j}>0$ and $\lambda_{i}=\frac{r_{i}}{\rho}$ so that $\rho\lambda_{i}=r_{i}$.

Raise each side of (1) to the $\rho$ power:

 $\prod_{i}{x_{i}}^{r_{i}}\leq\prod_{i}(\sum_{j}\lambda_{j}x_{j})^{r_{i}}$ (2)

so that

 $\prod_{i}(\frac{x_{i}}{\sum_{j}\lambda_{j}x_{j}})^{r_{i}}\leq 1$ (3)

Multiply (3) by $\prod_{i}(\frac{r_{i}}{\rho})^{r_{i}}$ to get:

 $\prod_{i}\Big(\frac{r_{i}x_{i}}{\sum_{j}r_{j}x_{j}}\Big)^{r_{i}}\leq\prod_{i}(r_{i}/\rho)^{r_{i}}.$ (4)

Claim: There exist $x_{i}>0$, $i=1,\ldots,n$ such that

 $\theta_{i}=\frac{r_{i}x_{i}}{\sum_{j}r_{j}x_{j}}.$ (5)

If so, then substituting into (4)

 $\prod_{i}{\theta_{i}}^{r_{i}}\leq\prod_{i}\Big(\frac{r_{i}}{\rho}\Big)^{r_{i}}=\prod_{i}\Big(\frac{r_{i}}{\sum_{j}r_{j}}\Big)^{r_{i}}$

So it remains to prove the claim. We have to solve the system of equations $\theta_{i}\sum_{j}r_{j}x_{j}=r_{i}x_{i}$, $i=1,\ldots,n$, for $x_{i}$. Rewriting this in matrix form, let $A=(a_{ij})$, $R=\mathrm{diag}(r_{1},\ldots,r_{n})$, and $x=(x_{1},\ldots,x_{n})^{T}$, where $a_{ii}=\theta_{i}-1$ and $a_{ij}=\theta_{i}$ if $i\neq j$, $i,j=1,\ldots,n$. The column sums of $A$ are $0$, since $\theta\in\sigma$. Hence $A$ is singular and the homogeneous system $ARx=0$ has a nonzero solution, say $x$. Since $R$ is nonsingular, it follows that $Rx\neq 0$, so $r_{i}x_{i}\neq 0$ for some $i$ and therefore $\sum_{j}r_{j}x_{j}\neq 0$. If necessary, we can replace $x$ by $-x$ so that $\sum_{j}r_{j}x_{j}>0$. From (5) it then follows that $x_{j}>0$ for all $j$.
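Both Lemma 2 and the claim can be sanity-checked numerically. In fact, when all $r_{i}>0$ and $\theta_{i}>0$, one convenient positive solution of (5) is $x_{i}=\theta_{i}/r_{i}$, since then $\sum_{j}r_{j}x_{j}=\sum_{j}\theta_{j}=1$. The sketch below (with illustrative numbers) verifies both facts.

```python
import math

r = [1.0, 2.0, 3.5]        # r_i >= 0 with positive sum (illustrative)
theta = [0.2, 0.5, 0.3]    # theta in the simplex sigma (illustrative)
rho = sum(r)

# Lemma 2: prod theta_i^{r_i} <= prod (r_i / rho)^{r_i}
lhs = math.prod(t ** ri for t, ri in zip(theta, r))
rhs = math.prod((ri / rho) ** ri for ri in r)
assert lhs <= rhs + 1e-12

# The claim (5): x_i = theta_i / r_i is one positive solution
x = [t / ri for t, ri in zip(theta, r)]
s = sum(ri * xi for ri, xi in zip(r, x))  # equals sum theta_i = 1
for t, ri, xi in zip(theta, r, x):
    assert abs(t - ri * xi / s) < 1e-12   # theta_i = r_i x_i / sum_j r_j x_j
```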

Now we can prove the PTAH inequality. Let $r_{i}(\lambda)=\int a_{i}(x)\prod_{j}{\lambda_{j}}^{a_{j}(x)}dm(x)$.

We calculate $\frac{\partial P}{\partial\lambda_{i}}$ by differentiating under the integral sign. If $\lambda_{i}>0$ then

 $\frac{\partial P}{\partial\lambda_{i}}=r_{i}(\lambda)/\lambda_{i}.$

Thus

 $\lambda_{i}\frac{\partial P}{\partial\lambda_{i}}=r_{i}(\lambda).$ (6)

If $\lambda_{i}=0$ then by writing

 $r_{i}(\lambda)=\int_{E}a_{i}(x)\ldots dm(x)+\int_{E^{c}}{\lambda_{i}}^{a_{i}(x)}\ldots dm(x)$

where $E=\{x\in X\mid a_{i}(x)=0\}$. On $E$ the factor $a_{i}(x)$ vanishes, and on $E^{c}$ we have ${\lambda_{i}}^{a_{i}(x)}=0$ because $a_{i}(x)>0$, so each integral is $0$ and $r_{i}(\lambda)=0$. So again (6) holds. Therefore,

 $\frac{r_{i}(\lambda)}{\sum_{j}r_{j}(\lambda)}=\frac{\lambda_{i}\,\partial P/\partial\lambda_{i}}{\sum_{j}\lambda_{j}\,\partial P/\partial\lambda_{j}}=\overline{\lambda_{i}}.$
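Identity (6) can be checked numerically by finite differences. The sketch below uses a hypothetical two-point space $X$ and exponents $a_{j}(x)$ chosen only for illustration, with $p(x,\lambda)=\prod_{j}{\lambda_{j}}^{a_{j}(x)}$ as in the parent entry.

```python
import math

# Illustrative finite setup: X = {0, 1}, weights m, exponents a[x][j] >= 0
m = [1.0, 0.5]
a = [[1.0, 2.0], [3.0, 1.0]]

def P(lam):
    # P(lam) = sum_x m_x * prod_j lam_j^{a_j(x)}
    return sum(m[x] * math.prod(lam[j] ** a[x][j] for j in range(2))
               for x in range(2))

def r(i, lam):
    # r_i(lam) = sum_x m_x * a_i(x) * prod_j lam_j^{a_j(x)}
    return sum(m[x] * a[x][i] * math.prod(lam[j] ** a[x][j] for j in range(2))
               for x in range(2))

lam = [0.4, 0.6]
h = 1e-6
for i in range(2):
    lp, lmn = list(lam), list(lam)
    lp[i] += h
    lmn[i] -= h
    dP = (P(lp) - P(lmn)) / (2 * h)           # central-difference dP/dlambda_i
    assert abs(lam[i] * dP - r(i, lam)) < 1e-5  # identity (6)
```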

Then

 $\begin{aligned} Q(\lambda,\lambda^{\prime}) &= \int\prod_{j}{\lambda_{j}}^{a_{j}(x)}\log\prod_{i}({\lambda_{i}}^{\prime})^{a_{i}(x)}\,dm(x)\\ &= \sum_{i}\log{\lambda_{i}}^{\prime}\int a_{i}(x)\prod_{j}{\lambda_{j}}^{a_{j}(x)}\,dm(x)\\ &= \sum_{i}r_{i}(\lambda)\log{\lambda_{i}}^{\prime}\\ &= \log\prod_{i}({\lambda_{i}}^{\prime})^{r_{i}(\lambda)}\\ &\leq \log\prod_{i}\Big(\frac{r_{i}(\lambda)}{\sum_{j}r_{j}(\lambda)}\Big)^{r_{i}(\lambda)}\\ &= \log\prod_{i}({\overline{\lambda_{i}}})^{r_{i}(\lambda)}\\ &= Q(\lambda,\overline{\lambda}). \end{aligned}$

In particular, taking $\lambda^{\prime}=\lambda$ gives $Q(\lambda,\lambda)\leq Q(\lambda,\overline{\lambda})$, so by Lemma 1(2), with $\lambda^{\prime}=\overline{\lambda}$, we get $P(\overline{\lambda})\geq P(\lambda)$.
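The conclusion $P(\overline{\lambda})\geq P(\lambda)$ can also be observed numerically by iterating the update $\lambda\mapsto\overline{\lambda}$, again in a hypothetical two-point setup chosen only for illustration.

```python
import math

# Illustrative finite setup: X = {0, 1}, weights m, exponents a[x][j]
m = [1.0, 0.5]
a = [[1.0, 2.0], [3.0, 1.0]]

def P(lam):
    return sum(m[x] * math.prod(lam[j] ** a[x][j] for j in range(2))
               for x in range(2))

def update(lam):
    # lambda_bar_i = r_i(lambda) / sum_j r_j(lambda)
    r = [sum(m[x] * a[x][i] * math.prod(lam[j] ** a[x][j] for j in range(2))
             for x in range(2)) for i in range(2)]
    s = sum(r)
    return [ri / s for ri in r]

lam = [0.3, 0.7]
for _ in range(5):
    new = update(lam)
    assert P(new) >= P(lam) - 1e-12  # PTAH: the update never decreases P
    lam = new
```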

Title: proof of PTAH inequality. Canonical name: ProofOfPTAHInequality. Date: 2013-03-22. Author: Mathprof (13753). Type: Proof. MSC: 26D15