Probability
-
Always True:
-
\(P(x,y) = P(x|y) \cdot P(y) = P(y|x) \cdot P(x) \)
-
Chain Rule:
\(
P(X_4, X_3, X_2, X_1) \\
= P(X_4 | X_3, X_2, X_1) \cdot P(X_3, X_2, X_1) \\
= P(X_4 | X_3, X_2, X_1) \cdot P(X_3 | X_2, X_1) \cdot P(X_2, X_1) \\
= P(X_4 | X_3, X_2, X_1) \cdot P(X_3 | X_2, X_1) \cdot P(X_2 | X_1) \cdot P(X_1)
\)
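The same telescoping works for any number of variables; the general form is
\(\qquad P(X_n, \ldots, X_1) = \prod\limits_{i=1}^{n} P(X_i | X_{i-1}, \ldots, X_1)\)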
-
Bayes' Rule:
\(P(y|x) = \frac{P(x|y) \cdot P(y)}{P(x)} \)
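A quick worked example with made-up numbers: if \(P(x|y) = 0.9\), \(P(y) = 0.01\), and \(P(x) = 0.05\), then
\(\qquad P(y|x) = \frac{0.9 \cdot 0.01}{0.05} = 0.18\)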
-
Union bound: \(P(A \vee B) \leq P(A) + P(B) \)
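More generally, for any events (not necessarily disjoint or independent): \(P\left(\bigvee_i A_i\right) \leq \sum\limits_i P(A_i)\)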
-
Linearity of Expectation: \(E[X + Y] = E[X] + E[Y]\) (holds even when \(X\) and \(Y\) are dependent)
-
Independence
-
\(x\) and \(y\) are independent iff:
\(\qquad \forall x, y; P(x, y) = P(x) \cdot P(y)\)
or, equivalently (when \(P(y) > 0\)):
\(\qquad \forall x, y; P(x|y) = P(x) \)
-
If \(X, Y\) are independent, \(E[XY] = E[X]E[Y]\)
-
If \(X_1, X_2, \ldots\) are pairwise independent:
\(Var[X_1 + X_2 + \ldots] = Var[X_1] + Var[X_2] + \ldots \)
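This follows by expanding the variance of the sum: the cross terms are covariances, and by the product rule for expectations above, pairwise independence makes each one vanish:
\(\qquad Var\left[\sum\limits_i X_i\right] = \sum\limits_i Var[X_i] + \sum\limits_{i \neq j} Cov[X_i, X_j]\), where \(Cov[X_i, X_j] = E[X_i X_j] - E[X_i] E[X_j] = 0\)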
-
Conditional Independence
-
\(x\) and \(y\) are independent given \(z\) iff:
\(\qquad P(x,y|z) = P(x|z) \cdot P(y|z)\)
or, equivalently (when \(P(y, z) > 0\)):
\(\qquad P(x|z, y) = P(x|z) \)
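Note: conditional independence given \(z\) neither implies nor is implied by (unconditional) independence of \(x\) and \(y\).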
-
Bounds
-
Theorem: Chernoff/Hoeffding Bound
Suppose \(X_1, \ldots, X_t\) are i.i.d. in \(\{0, 1\}\), \(p = E[X_i]\), and \(\epsilon > 0\). Then
\(\operatorname{Pr}\left[\left|\frac{1}{t} \sum\limits_{i=1}^{t} X_{i} - p\right| \geq \epsilon\right] \leq 2 e^{-2 \epsilon^{2} t}\)
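Rearranging gives the usual sample-complexity form: to make the failure probability at most \(\delta\), i.e. \(2 e^{-2 \epsilon^2 t} \leq \delta\), it suffices to take
\(\qquad t \geq \frac{1}{2 \epsilon^2} \ln\frac{2}{\delta}\)
For example, plugging in \(\epsilon = 0.1\) and \(\delta = 0.05\): \(t \geq \frac{\ln 40}{0.02} \approx 185\) samples.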
-
Markov's Inequality:
if \(X \geq 0\) is a random variable, then \(\forall t > 0,\ P(X \geq t) \leq \frac{E[X]}{t}\)
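For example, taking \(t = k \cdot E[X]\) (with \(E[X] > 0\)): \(P(X \geq k \cdot E[X]) \leq \frac{1}{k}\)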
-
Markov's inequality applied to variance (Chebyshev's inequality):
since \((X - E[X])^2 \geq 0\), Markov's inequality with threshold \(t^2\) gives, for \(t > 0\):
\(P(|X - E[X]| \geq t) = P((X - E[X])^2 \geq t^2) \leq \frac{E[(X - E[X])^2]}{t^2} = \frac{Var[X]}{t^2}\)
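For example, taking \(t = k \sigma\) with \(\sigma = \sqrt{Var[X]}\): \(P(|X - E[X]| \geq k \sigma) \leq \frac{1}{k^2}\)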