Kullback divergence vs chi-square divergence
If the probability measures $P$ and $Q$ are mutually absolutely continuous, the Kullback divergence is defined by $K(P,Q)=\int \log\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)\mathrm{d}P$ and the chi-square divergence by $\chi^2(Q,P) = \int \left( \frac{\mathrm{d}P}{\mathrm{d}Q}-1\right)^2 \mathrm{d}P$. How can one prove that
$$ K(P,Q) \leqslant \frac{1}{2}\chi^2(Q,P)$$
1 Answer
In addition to assuming that $P$ and $Q$ are mutually absolutely continuous, you also need to require that the expectations defining the Kullback and $\chi^2$ divergences exist.
The inequality as stated does not hold. Indeed, let $P$ be the measure corresponding to a $\operatorname{Beta}(2,1)$ random variable and $Q$ the one corresponding to a uniform random variable, i.e. $$ \mathrm{d}P = 2 x \mathbf{1}_{(0,1)}(x) \mathrm{d} x, \qquad \mathrm{d}Q = \mathbf{1}_{(0,1)}(x) \mathrm{d} x. $$ Then it is easy to compute that $$ K(P,Q) = \int_0^1 \log(2x) \cdot 2 x \,\mathrm{d} x = \log(2) - \frac{1}{2} \approx 0.19314718\ldots, $$ whereas $$ \chi^2(Q,P) = \int_0^1 \left( 2x-1 \right)^2 \cdot 2 x \,\mathrm{d} x = \frac{1}{3}. $$ Clearly $K(P,Q) > \frac{1}{2} \chi^2(Q,P) = \frac{1}{6}$.
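As a quick numerical sanity check of the two integrals above, here is a sketch using a hand-rolled composite midpoint rule (`midpoint` is a hypothetical helper written for this check, not a library function):

```python
import math

def midpoint(f, a, b, n=200_000):
    """Composite midpoint rule for the integral of f over (a, b)."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# K(P,Q) = ∫_0^1 log(2x)·2x dx  with  dP = 2x dx (Beta(2,1)), dQ = dx (U(0,1))
K = midpoint(lambda x: math.log(2 * x) * 2 * x, 0.0, 1.0)

# χ²(Q,P) = ∫_0^1 (2x − 1)²·2x dx
chi2 = midpoint(lambda x: (2 * x - 1) ** 2 * 2 * x, 0.0, 1.0)

print(K)             # ≈ log(2) − 1/2 ≈ 0.19315
print(chi2)          # ≈ 1/3
print(K > chi2 / 2)  # True: the claimed bound fails for this pair
```

The midpoint rule avoids evaluating the $K$ integrand at $x=0$, where $\log(2x)$ is singular (the integrand itself tends to $0$ there).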
N.B. If $\operatorname{Beta}(2,2)$ is used for $Q$ and $\mathcal{U}(0,1)$ for $P$, then the integral defining $\chi^2(Q,P)$ is easily seen to diverge, since the integrand grows like $(6x)^{-2}$ as $x \to 0$.
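The divergence in this $\operatorname{Beta}(2,2)$ example can also be seen numerically: truncate the domain to $(\varepsilon, 1-\varepsilon)$ and watch the integral blow up as $\varepsilon \to 0$. A sketch with the same hypothetical midpoint-rule helper, using $\mathrm{d}P/\mathrm{d}Q = 1/(6x(1-x))$ and $\mathrm{d}P = \mathrm{d}x$:

```python
def midpoint(f, a, b, n=400_000):
    """Composite midpoint rule for the integral of f over (a, b)."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# Q = Beta(2,2): dQ = 6x(1−x) dx;  P = U(0,1): dP = dx,  so dP/dQ = 1/(6x(1−x))
integrand = lambda x: (1.0 / (6.0 * x * (1.0 - x)) - 1.0) ** 2

# Truncated integrals over (eps, 1−eps): each tenfold decrease in eps
# increases the value roughly tenfold (leading behaviour ~ 1/(18·eps)).
values = [midpoint(integrand, eps, 1.0 - eps) for eps in (1e-2, 1e-3, 1e-4)]
print(values)
```

The roughly tenfold growth per step confirms there is no finite limit, i.e. the untruncated integral diverges.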