Chapter 2 - Transformations and Expectations
2.1 Distributions of Functions of a Random Variable
- The covariance of two random variables $X$ and $Y$ is defined as
$$
Cov(X,Y) = E[(X-E[X])(Y-E[Y])]
$$
  By the Cauchy-Schwarz inequality, it is bounded as
$$
-\sqrt{Var(X)\cdot Var(Y)}\leq Cov(X,Y) \leq \sqrt{Var(X)\cdot Var(Y)}
$$
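As a quick numerical check of the bound, a minimal Python sketch of the sample analogues of covariance and variance (the data values are made up for illustration):

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    # Sample analogue of Cov(X, Y) = E[(X - E[X]) (Y - E[Y])]
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def var(xs):
    # Var(X) = Cov(X, X)
    return cov(xs, xs)

# Arbitrary illustrative data
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 1.0, 4.0, 3.0, 6.0]

c = cov(xs, ys)
bound = math.sqrt(var(xs) * var(ys))
print(c, bound)  # the covariance lies within the +/- bound
```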
- Bernoulli trial - a single experiment with two outcomes: success with probability $p$ and failure with probability $1-p$.
- Binomial distribution - the number of successes in $n$ independent Bernoulli trials - $f_X(x) = P(X = x) = {n \choose x}p^x(1-p)^{n-x}$.
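A small sketch of this pmf using the standard-library `math.comb`, checking that it sums to 1 over $x = 0, \dots, n$ (the parameter values are arbitrary):

```python
from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3  # illustrative parameter choices
total = sum(binom_pmf(x, n, p) for x in range(n + 1))
print(total)    # should be 1 up to floating-point error
```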
- Exponential distribution with parameter $\lambda > 0$: $f_X(x) = \lambda e^{-\lambda x}$ for $x \geq 0$, and $f_X(x) = 0$ for $x < 0$.
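As a sanity check that this density integrates to 1, a midpoint-rule Riemann sum (the rate $\lambda = 1.5$ and truncation point $T = 20$ are arbitrary):

```python
from math import exp

lam = 1.5  # arbitrary rate parameter

def exp_pdf(x):
    # f_X(x) = lam * e^(-lam * x) for x >= 0, else 0
    return lam * exp(-lam * x) if x >= 0 else 0.0

# Midpoint-rule approximation of the integral of f_X over [0, T];
# the tail mass beyond T = 20 is e^(-30), which is negligible here.
n_steps, T = 100_000, 20.0
dx = T / n_steps
total = sum(exp_pdf((i + 0.5) * dx) for i in range(n_steps)) * dx
print(total)  # close to 1
```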
$$
\lim_{n \to \infty} \left(1+\frac{a_n}{n}\right)^n = e^a \quad \text{whenever } a_n \to a
$$
- The above limit is useful in showing that the MGF of the binomial tends to the MGF of the Poisson as $n \to \infty$ (take $p = \lambda/n$).
2.3 Moments and Moment Generating Functions
- The moment generating function of a random variable $X$ is
$$
M_X(t) = E\left[e^{tX}\right] = \int_{-\infty}^{\infty}e^{tx}f_X(x)\,dx \quad \text{(continuous case)}
$$
- The $n$th moment of $X$ is the $n$th derivative of the moment generating function evaluated at $t = 0$:
$$
m_n=E[X^n] = M_X^{(n)}(0) = \left. \frac{d^n}{dt^n} M_X(t) \right|_{t=0}
$$
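For instance, the exponential distribution with rate $\lambda$ has MGF $M_X(t) = \lambda/(\lambda - t)$ for $t < \lambda$, so $M_X'(0) = 1/\lambda$. A central-difference sketch recovers this first moment numerically (the choice of $\lambda$ and step size $h$ is arbitrary):

```python
lam = 2.0  # arbitrary rate parameter

def mgf(t):
    # MGF of Exponential(lam): lam / (lam - t), valid for t < lam
    return lam / (lam - t)

h = 1e-5
m1 = (mgf(h) - mgf(-h)) / (2 * h)  # central difference ~ M'(0) = E[X]
print(m1)  # close to 1 / lam = 0.5
```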
- If a sequence of moment generating functions converges (for all $t$ in a neighborhood of $0$) to a moment generating function, then the corresponding sequence of CDFs converges to the CDF of that limiting MGF.
- If the moments satisfy
$$
\sum_{k=1}^{\infty} \frac{m_k s^k}{k!} < \infty \quad \text{for some } s>0,
$$
  then the MGF exists in a neighborhood of $0$ and the moments uniquely determine the distribution.
- For constants $a$ and $b$, the MGF of $aX + b$ is
$$
M_{aX+b}(t) = e^{bt}M_X(at) \tag{2.3.15}
$$
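A Monte Carlo sketch of (2.3.15) for an exponential $X$ with $M_X(t) = \lambda/(\lambda - t)$; the parameter values are arbitrary, but note $at < \lambda$ is needed so that $M_X(at)$ is finite:

```python
import random
from math import exp

random.seed(0)
lam, a, b, t = 3.0, 2.0, 1.0, 0.4  # need a * t < lam for M_X(at) to be finite
xs = [random.expovariate(lam) for _ in range(200_000)]

lhs = sum(exp(t * (a * x + b)) for x in xs) / len(xs)  # sample E[e^{t(aX+b)}]
rhs = exp(b * t) * lam / (lam - a * t)                 # e^{bt} M_X(at)
print(lhs, rhs)  # close for a large sample
```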
- The MGF of $X$ is said to exist if $M_X(t)$ is finite for all $-h \leq t \leq h$, for some $h > 0$.