1 and 2 norm inequality
While looking over my notes, I found the following inequality stated by my lecturer: $$\|x\|_2 \leq \|x\|_1 \leq \sqrt{n}\,\|x\|_2$$ where $x \in \mathbb{R}^n$. No proof was given, and I've been trying to prove it for a while now. I know the definitions of the $1$- and $2$-norms, and numerically the inequality seems obvious, but I don't know where to start rigorously.
Thank you.
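(Not a proof, but as a quick numerical sanity check before the answers below: a minimal sketch in Python, using only the standard library, that verifies the claimed inequality on random vectors. The helper names `norm1` and `norm2` are my own, not from any library.)

```python
import math
import random

def norm1(x):
    """||x||_1: sum of absolute values of the components."""
    return sum(abs(t) for t in x)

def norm2(x):
    """||x||_2: Euclidean norm."""
    return math.sqrt(sum(t * t for t in x))

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 10)
    x = [random.uniform(-100, 100) for _ in range(n)]
    eps = 1e-9  # tolerance for floating-point rounding
    assert norm2(x) <= norm1(x) + eps
    assert norm1(x) <= math.sqrt(n) * norm2(x) + eps
```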
5 Answers
We will show a more general statement: $\|\cdot\|_1$, $\|\cdot\|_2$, and $\|\cdot\|_{\infty}$ are all equivalent on $\mathbb{R}^{n}$, and in fact$$\|x\|_{\infty} \leq \|x\|_{2} \leq \|x\|_{1} \leq n\,\|x\|_{\infty}.$$
Write $x = ( x_1 , x_2 , \dots , x_n ) \in \mathbb{R}^{n}$. Then$$\|x\|_{\infty} = \max_{1\leq i \leq n} | x_i | = \max_{1\leq i \leq n} \sqrt{ | x_i |^{2} } \leq \sqrt{ \sum_{i=1}^{n} | x_ i |^{2} } = \|x\|_2 .$$Additionally,$$ \|x\|_2 = \sqrt{ \sum_{i=1}^{n} | x_i |^{2} } \leq \sum_{i=1}^{n} \sqrt{ | x_ i |^{2} } = \sum_{i=1}^{n} |x_i| = \|x\|_1 ,$$where the middle step uses $\sqrt{a+b} \leq \sqrt{a} + \sqrt{b}$ for $a, b \geq 0$. Finally,$$ \|x\|_1 = \sum_{i=1}^{n} |x_i| \leq \sum_{i=1}^{n} \max_{1 \leq j \leq n} |x_j| = n \max_{1 \leq j \leq n} | x_j | = n\,\|x\|_{\infty},$$showing the chain of inequalities as desired. Since each of these norms bounds the others up to a constant factor depending only on $n$, a sequence converges in one of them exactly when it converges in the others; that is, the three norms are equivalent. (In fact, all norms on $\mathbb{R}^{n}$ are equivalent.)
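The full chain above can be spot-checked numerically; here is a minimal sketch in Python (standard library only; the helper `norms` is my own name, not a library function):

```python
import math
import random

def norms(x):
    """Return (||x||_1, ||x||_2, ||x||_inf) for a nonempty vector x."""
    n1 = sum(abs(t) for t in x)
    n2 = math.sqrt(sum(t * t for t in x))
    ninf = max(abs(t) for t in x)
    return n1, n2, ninf

random.seed(1)
eps = 1e-9  # tolerance for floating-point rounding
for _ in range(1000):
    n = random.randint(1, 8)
    x = [random.gauss(0, 10) for _ in range(n)]
    n1, n2, ninf = norms(x)
    assert ninf <= n2 + eps      # ||x||_inf <= ||x||_2
    assert n2 <= n1 + eps        # ||x||_2   <= ||x||_1
    assert n1 <= n * ninf + eps  # ||x||_1   <= n * ||x||_inf
```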
The inequality $\|x\|_1 \leq \sqrt{n}\,\|x\|_2$ is a consequence of the Cauchy-Schwarz inequality, applied to the vectors $(1, 1, \dots, 1)$ and $(|x_1|, \dots, |x_n|)$:
$$\|x\|_1 = \sum_{i} 1 \cdot |x_i| \leq \sqrt{1+1+\cdots+1}\,\sqrt{\sum_{i} x_i^2} = \sqrt{n}\,\|x\|_2 .$$
For the first inequality, note that the function $f(t)=\sqrt{t}$ is concave and $f(0)=0$, hence $f$ is subadditive.
Therefore $$\|x\|_2 = f\Big(\sum_{i} x_i^2\Big) \leq \sum_{i} f(x_i^2) = \sum_{i} |x_i| = \|x\|_1 .$$
Another way to show that $\|x\|_1 \leq \sqrt{n}\,\|x\|_2$ is as follows:
\begin{align*} \begin{array}{c} \displaystyle 0\leq \sum_{i<j}\big(|x_i|-|x_j|\big)^2 = (n-1)\sum_{i=1}^{n} |x_i|^2 - 2\sum_{i<j}|x_i||x_j| \\ \displaystyle\Rightarrow\quad 2\sum_{i<j}|x_i||x_j|\leq (n-1)\sum_{i=1}^{n} |x_i|^2 \\ \displaystyle\Rightarrow\quad \sum_{i=1}^{n} |x_i|^2 + 2\sum_{i<j}|x_i||x_j|\leq n\sum_{i=1}^{n} |x_i|^2 \\ \displaystyle\Rightarrow\quad \left( \sum_{i=1}^n |x_i| \right)^2 \leq n\sum_{i=1}^{n} |x_i|^2 \\ \displaystyle\Rightarrow \quad \|x\|_1\leq \sqrt{n}\,\|x\|_2 \end{array} \end{align*}
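The key expansion in the third implication, $\left(\sum_i |x_i|\right)^2 = \sum_i |x_i|^2 + 2\sum_{i<j}|x_i||x_j|$, is easy to confirm numerically; a small sketch in Python (standard library only):

```python
import random

random.seed(2)
x = [random.uniform(-5, 5) for _ in range(6)]
a = [abs(t) for t in x]
n = len(a)

# (sum |x_i|)^2 == sum |x_i|^2 + 2 * sum_{i<j} |x_i||x_j|
lhs = sum(a) ** 2
cross = sum(a[i] * a[j] for i in range(n) for j in range(i + 1, n))
rhs = sum(t * t for t in a) + 2 * cross
assert abs(lhs - rhs) < 1e-9

# ...and the resulting inequality (sum |x_i|)^2 <= n * sum |x_i|^2
assert lhs <= n * sum(t * t for t in a) + 1e-9
```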
$\lVert x \rVert_2 \le \lVert x \rVert_1$ is equivalent to $\lVert x \rVert_2^2 \le \lVert x \rVert_1^2$ (both sides are non-negative), which can be shown in an elementary way:
$$\lVert x \rVert_2^2 = \sum_i \lvert x_i\rvert^2 \le \left( \sum_i \lvert x_i \rvert \right)^2 = \lVert x \rVert_1^2$$
This follows by expanding the square $\left(\sum_i z_i \right)^2 = \sum_i z_i^2 + \sum_{i \neq j} z_i z_j$ with $z_i = \lvert x_i \rvert$: the sum of cross-terms is $\ge 0$ since all the $z_i$ are $\ge 0$.
Intuition for the inequalities: if $x$ has one component $x_0$ much larger (in magnitude) than the rest, the other components become negligible and $\lVert x \rVert_2 \approx \sqrt{x_0^2} = \lvert x_0 \rvert \approx \lVert x \rVert_1 $. On the other hand, if the components of $x$ are about equal (in magnitude), $\lVert x \rVert_2 \approx \sqrt{n x_i^2} = \sqrt n \lvert x_i \rvert$, while $\lVert x \rVert_1 \approx n \lvert x_i \rvert$.
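These two extreme cases are exactly where the inequalities become equalities, which a short Python sketch can confirm (standard library only; the vector names `spike` and `flat` are illustrative):

```python
import math

def norm1(x):
    """||x||_1: sum of absolute values."""
    return sum(abs(t) for t in x)

def norm2(x):
    """||x||_2: Euclidean norm."""
    return math.sqrt(sum(t * t for t in x))

n = 5
spike = [7.0] + [0.0] * (n - 1)  # one dominant component
flat = [3.0] * n                 # all components equal in magnitude

# spike: ||x||_2 = ||x||_1, so the left inequality is tight
assert math.isclose(norm2(spike), norm1(spike))
# flat: ||x||_1 = sqrt(n) * ||x||_2, so the right inequality is tight
assert math.isclose(norm1(flat), math.sqrt(n) * norm2(flat))
```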
Hint: Bound $\left \lVert x\right \rVert_2$ by $\left \lVert x'\right \rVert_2$, where $x'$ is the vector all of whose components equal the maximum (in absolute value) of the components of $x$.