Mathematical Statistics


Moments of Random Variables

Random variable $x$ with support $[ \underline{x}, \bar{x} ]$

$n$-th moment: $\mathbb{E}[x^n]$; for $k$-vectors, moments are defined componentwise.

Moment generating function: $M_x(t) = \mathbb{E}[e^{tx}]$. For the standard normal, $M_z(t) = e^{t^2/2}$.
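The standard-normal MGF can be sanity-checked by simulation. This is a sketch of mine (seed, sample size, and the two test values of $t$ are arbitrary), not part of the original notes:

```python
import numpy as np

# Monte Carlo check that the standard-normal MGF E[exp(t z)] equals exp(t^2 / 2).
rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)

for t in (0.5, 1.0):
    mgf_hat = np.mean(np.exp(t * z))  # sample estimate of E[exp(t z)]
    mgf = np.exp(t**2 / 2)            # closed form
    print(f"t={t}: simulated {mgf_hat:.4f}, exact {mgf:.4f}")
```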

Correlation coefficient: $\rho_{XY} = \frac{\mathrm{Cov}(X,Y)}{\sigma_X \sigma_Y}$, with $-1 \leq \rho_{XY} \leq 1$
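A minimal sketch (my own example data, not from the notes): the correlation coefficient is the covariance scaled by the two standard deviations, which can be checked against NumPy's built-in:

```python
import numpy as np

# y = 0.6 x + 0.8 e with x, e independent standard normal, so Var(y) = 1
# and the population correlation is exactly 0.6.
rng = np.random.default_rng(1)
x = rng.standard_normal(10_000)
y = 0.6 * x + 0.8 * rng.standard_normal(10_000)

cov_xy = np.cov(x, y)[0, 1]
rho = cov_xy / (np.std(x, ddof=1) * np.std(y, ddof=1))
print(rho, np.corrcoef(x, y)[0, 1])  # the two agree; both near 0.6
```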

Bias-variance tradeoff: $\mathrm{MSE}(\hat{\theta}) = \mathbb{E}[(\hat{\theta}-\theta)^2] = \mathrm{Bias}(\hat{\theta})^2 + \mathrm{Var}(\hat{\theta})$
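The decomposition $\mathrm{MSE} = \mathrm{Bias}^2 + \mathrm{Var}$ can be seen numerically. A sketch of mine, using a deliberately biased estimator (the sample mean shrunk toward zero); the shrinkage factor and data-generating process are arbitrary choices:

```python
import numpy as np

# Shrinking the sample mean of N(theta, 1) data toward 0 introduces bias
# but lowers variance; MSE splits exactly into bias^2 + variance.
rng = np.random.default_rng(2)
theta, n, reps = 2.0, 20, 200_000

xbar = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
theta_hat = 0.9 * xbar  # shrinkage estimator, biased: E[theta_hat] = 0.9 theta

mse = np.mean((theta_hat - theta) ** 2)
bias2 = (np.mean(theta_hat) - theta) ** 2
var = np.var(theta_hat)
print(mse, bias2 + var)  # the two sides match
```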

Hypothesis testing

Two-sided confidence interval: $\hat{\theta} \pm z_{1-\alpha/2}\,\omega$; test statistic $s = (\hat{\theta}-\theta_0)/\omega$, where $\omega = \sqrt{\hat{\sigma}^2/n}$ and $\hat{\sigma}^2 = \frac{1}{n-k} \sum \hat{u}_i^2$

P-value = $2[1-\Phi(|s|)]$
Power = Pr(reject $H_0 \mid H_1$ is true):
$\pi(\theta) = P[|z+s| > z_{1-\alpha/2}]$
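The two-sided p-value and power formulas above can be computed with the standard library alone, writing $\Phi(x) = \frac{1}{2}(1 + \mathrm{erf}(x/\sqrt{2}))$. A sketch of mine; the function names and the choice of a 5% level are my own:

```python
from math import erf, sqrt

def Phi(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def p_value_two_sided(s: float) -> float:
    """Two-sided p-value 2[1 - Phi(|s|)] for a z statistic."""
    return 2.0 * (1.0 - Phi(abs(s)))

def power(shift: float) -> float:
    """P[|z + shift| > z_{0.975}], z ~ N(0,1): power of the 5%-level
    two-sided z-test when the standardized shift under H1 is `shift`."""
    z_crit = 1.959963984540054  # z_{0.975}
    return (1.0 - Phi(z_crit - shift)) + Phi(-z_crit - shift)

print(p_value_two_sided(1.96))  # just under 0.05
print(power(0.0))               # equals the size, about 0.05
print(power(2.8))               # about 0.80, a common design target
```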

Cauchy-Schwarz Inequality: $\mathrm{Cov}(X,Y)^2 \leq \sigma_X^2\,\sigma_Y^2$

Markov Inequality: $P[|X|\geq\epsilon]\leq \mathbb{E}[|X|^r]/\epsilon^r$ for any $\epsilon>0$, $r>0$
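A quick Monte Carlo illustration of mine for the Markov bound (the Exponential(1) distribution, $\epsilon = 3$, and $r = 2$ are arbitrary choices):

```python
import numpy as np

# Empirical tail probability vs. the moment bound E[|X|^r] / eps^r.
rng = np.random.default_rng(3)
x = rng.exponential(1.0, size=1_000_000)

eps, r = 3.0, 2
lhs = np.mean(np.abs(x) >= eps)           # P[|X| >= eps], estimated
rhs = np.mean(np.abs(x) ** r) / eps**r    # E[|X|^r] / eps^r, estimated
print(lhs, rhs)  # lhs <= rhs, as the inequality requires
```

The bound is loose here: the true tail is $e^{-3} \approx 0.05$ while the $r=2$ bound is $2/9 \approx 0.22$.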


  • Standard normal: $z \sim N(0,1)$; $\mu = 0$, $\sigma^2 = 1$
  • Chi-squared: $\chi^2_n = \sum_{i=1}^n z_i^2$; $\mu = n$, $\sigma^2 = 2n$
  • t: let $x\sim\chi^2_n$ be independent of $z \sim N(0,1)$; $Y = z/\sqrt{x/n} \Rightarrow Y \sim t_n$
  • F: let $x_1\sim \chi^2_{n_1}$ and $x_2\sim \chi^2_{n_2}$ be independent; $Y = \frac{x_1/n_1}{x_2/n_2} \Rightarrow Y \sim F_{n_1,n_2}$
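The constructions above can be replicated by simulation. A sketch of mine (degrees of freedom and seed are arbitrary):

```python
import numpy as np

# chi^2_n as a sum of squared standard normals; t_n and F_{n1,n2} built
# from independent pieces exactly as in the definitions above.
rng = np.random.default_rng(4)
n, n1, n2, reps = 5, 4, 8, 500_000

z = rng.standard_normal(reps)
chi2 = (rng.standard_normal((reps, n)) ** 2).sum(axis=1)    # chi^2_n
t = z / np.sqrt(chi2 / n)                                   # t_n
chi2_1 = (rng.standard_normal((reps, n1)) ** 2).sum(axis=1)
chi2_2 = (rng.standard_normal((reps, n2)) ** 2).sum(axis=1)
f = (chi2_1 / n1) / (chi2_2 / n2)                           # F_{n1,n2}

print(chi2.mean(), chi2.var())  # near n = 5 and 2n = 10
print(f.mean())                 # near n2/(n2 - 2) = 4/3
```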


$\rightarrow_p$ : Convergence in Probability; $\rightarrow_d$ : Convergence in Distribution

  • Law of Large Numbers: $X_1, \ldots,X_n$ are IID; $\mathbb{E}|X_1|<\infty \Rightarrow n^{-1}\sum_{i=1}^n X_i \rightarrow_p \mathbb{E}X_1 \text{ as } n \rightarrow \infty$
  • Cramer Convergence: $X_n \rightarrow_d X$, $Y_n \rightarrow_p c\Rightarrow$
    • $X_n+Y_n \rightarrow_d X+c$
    • $X_nY_n \rightarrow_d cX$
    • $X_n/Y_n\rightarrow_d X/c$ (provided $c \neq 0$)
  • Slutsky’s Theorem: $X_n \rightarrow_p X$; $h(\cdot)$ is continuous $\Rightarrow h(X_n) \rightarrow_p h(X)$
  • Central Limit Theorem: $X_1, \ldots,X_n$ are IID; $\mathbb{E}[X_1]=0$; $0<\mathbb{E}X_1^2 < \infty \Rightarrow \frac{1}{\sqrt{n}}\sum_{i=1}^n X_i \rightarrow_d N(0,\mathbb{E}X_1^2) \text{ as } n \rightarrow \infty$
  • Continuous Mapping Theorem: $X_n \rightarrow_d X$; $h(\cdot)$ is continuous $\Rightarrow h(X_n) \rightarrow_d h(X)$
  • Delta Method: $\sqrt{n}(\hat{\theta}-\theta) \rightarrow_d Y$; $h(\cdot)$ continuously differentiable $\Rightarrow \sqrt{n}(h(\hat{\theta})-h(\theta)) \rightarrow_d[\partial h(\theta)/\partial\theta']Y$
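The CLT and the delta method can be watched working in a simulation. A sketch of mine: $X_i$ iid Exponential(1), so $\mathbb{E}X_1 = \mathrm{Var}(X_1) = 1$; taking $h(x) = x^2$ gives $h'(1) = 2$, so the delta method predicts $\sqrt{n}(h(\bar{X}) - h(1)) \rightarrow_d N(0, 4)$. The distribution, $n$, and $h$ are my own choices:

```python
import numpy as np

# CLT: sqrt(n)(Xbar - 1) ->_d N(0, 1) for Exponential(1) data.
# Delta method with h(x) = x^2: sqrt(n)(Xbar^2 - 1) ->_d N(0, 4).
rng = np.random.default_rng(5)
n, reps = 400, 50_000

xbar = rng.exponential(1.0, size=(reps, n)).mean(axis=1)
clt = np.sqrt(n) * (xbar - 1.0)        # ->_d N(0, 1)
delta = np.sqrt(n) * (xbar**2 - 1.0)   # ->_d N(0, 2^2 * 1)

print(clt.std(), delta.std())  # near 1 and 2
```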