Markov's Inequality
(If we do not know the distribution of a random variable but only its mean, we can still bound its tail probabilities with Markov's Inequality.)
If \mathsf{X \geq 0} and E(X) is finite, then for any positive constant c, \mathsf {P(X\geq c)\leq \frac{E(X)}{c}}
Proof
\mathsf{ P(X\geq c) = \sum_{x:x\geq c} P(X=x)\leq\sum_{x:x\geq c} \frac{x}{c}P(X=x)} (the inequality holds because \mathsf{\frac{x}{c}\geq 1} whenever \mathsf{x\geq c})
\mathsf {=\frac{1}{c}\sum_{x:x\geq c}x \cdot P(X=x)\leq \frac{1}{c} \sum_{x}x \cdot P(X=x)= \frac{E(X)}{c} }
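The bound is easy to sanity-check numerically. Below is a minimal Monte Carlo sketch (not part of the original argument; the Exponential distribution and the values of c are assumptions chosen for illustration) comparing the empirical tail probability against the Markov bound E(X)/c:

import numpy as np

rng = np.random.default_rng(0)
# A non-negative random variable; Exponential with mean 2 is an arbitrary choice.
samples = rng.exponential(scale=2.0, size=1_000_000)
mean = samples.mean()
for c in [1.0, 2.0, 5.0, 10.0]:
    empirical = (samples >= c).mean()  # Monte Carlo estimate of P(X >= c)
    print(f"c={c:5.1f}  P(X>=c)~{empirical:.4f}  E(X)/c={mean / c:.4f}")

For every c the empirical tail probability stays below E(X)/c, though the bound is loose unless c is close to the mean.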
Note)
Chebyshev's Inequality below is a special case of Markov's Inequality, obtained by applying it to \mathsf{Y=\frac{(X-\mu)^2}{\sigma ^2}}
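Spelling the Note out: \mathsf{Y \geq 0} with \mathsf{E(Y)=\frac{E((X-\mu)^2)}{\sigma^2}=1}, so Markov's Inequality with constant \mathsf{k^2} gives
\mathsf{P(|X-\mu|\geq k\sigma)=P(Y\geq k^2)\leq \frac{E(Y)}{k^2}=\frac{1}{k^2}}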
Chebyshev's Inequality
Let X be a random variable with mean \mu and variance \sigma ^2,
then for any positive constant k, \mathsf {P(|X-\mu|\geq k \sigma) \leq \frac{1}{k^2} }
Proof (1)
\mathsf {P(|X-\mu|\geq k \sigma) = \sum_{x:|x-\mu|\geq k\sigma}P(X=x) \leq \sum_{x:|x-\mu|\geq k\sigma}(\frac{|x-\mu|}{k \sigma})^2 \cdot P(X=x) }
\mathsf {\leq \frac{1}{k^2\sigma^2}E((X-\mu)^2) = \frac{\sigma^2}{k^2 \sigma^2} =\frac{1}{k^2}}
Proof (2)
\mathsf {P(|X-\mu|\geq k \sigma) = P((X-\mu)^2\geq k^2 \sigma^2) \leq \frac{E((X-\mu)^2)}{k^2 \sigma^2}=\frac{1}{k^2}}, where the middle inequality is Markov's Inequality applied to \mathsf{(X-\mu)^2}
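As with Markov's Inequality, a quick Monte Carlo sketch makes the bound concrete (the Gamma distribution below is an assumed example, chosen only because its mean and variance are easy to set):

import numpy as np

rng = np.random.default_rng(1)
# Gamma(shape=4, scale=0.5) has mean 2 and variance 1; any distribution works.
samples = rng.gamma(shape=4.0, scale=0.5, size=1_000_000)
mu, sigma = samples.mean(), samples.std()
for k in [1.5, 2.0, 3.0]:
    empirical = (np.abs(samples - mu) >= k * sigma).mean()
    print(f"k={k:.1f}  P(|X-mu|>=k*sigma)~{empirical:.4f}  1/k^2={1 / k**2:.4f}")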
Example (Chebyshev's Inequality)
Let X be a random variable with mean 3 and variance 1. What is the maximum possible value of \mathsf{P(X\geq 5)}?
Solution) \mathsf{P(X\geq 5)=P(X-3 \geq 2) \leq P(|X-3| \geq 2\cdot 1) \leq \frac{1}{2^2} = \frac{1}{4}}
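To see how conservative the bound is, here is a small numerical check (the two distributions are assumptions; both have mean 3 and variance 1):

import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
candidates = {
    "Normal(3, 1)": rng.normal(loc=3.0, scale=1.0, size=n),
    "2 + Exponential(1)": 2.0 + rng.exponential(scale=1.0, size=n),
}
for name, x in candidates.items():
    # Both estimates stay below the Chebyshev bound of 1/4.
    print(f"{name}: P(X>=5)~{(x >= 5).mean():.4f}  (bound: 0.25)")

The Normal case gives about 0.023 and the shifted Exponential about 0.135, both well under 1/4, reflecting that Chebyshev's Inequality is a worst-case bound.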
Cauchy-Schwarz Inequality
Let X, Y be two random variables. Then \mathsf {(E(XY))^2 \leq E(X^2)E(Y^2) }, where equality holds iff \mathsf{P(aX=bY)=1} for some \mathsf{a,b \in \mathbb{R}}, not both zero
Proof
Let Z=aX-bY, \mathsf { 0 \leq E(Z^2) = a^2E(X^2)-2abE(XY)+b^2E(Y^2) }
The right-hand side is a quadratic in the variable a that is non-negative for every real a, so it has at most one real root and its discriminant must be non-positive: \mathsf{4b^2(E(XY))^2 - 4b^2E(X^2)E(Y^2) \leq 0}. Taking \mathsf{b \neq 0} (say b = 1) gives \mathsf{(E(XY))^2 \leq E(X^2)E(Y^2)}
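The moment inequality can also be checked on sample moments (a minimal sketch; the correlated construction of X and Y is an assumption for illustration):

import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=1_000_000)
y = 0.5 * x + rng.normal(size=1_000_000)  # correlated with x by construction
lhs = np.mean(x * y) ** 2            # (E(XY))^2 from sample moments
rhs = np.mean(x**2) * np.mean(y**2)  # E(X^2) E(Y^2)
print(f"(E(XY))^2 ~ {lhs:.4f}  <=  E(X^2)E(Y^2) ~ {rhs:.4f}")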