Estimation

Bias and Variance

Defines the bias, variance, and mean squared error of an estimator, and introduces relative efficiency as a way to compare two estimators.
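As a concrete illustration, the following simulation sketch compares two estimators of a normal population's variance: the unbiased one (divide by n-1) and the biased MLE (divide by n). The true variance, sample size, and trial count are illustrative choices, not from the notes. It shows the decomposition MSE = bias² + variance in action: the biased estimator can still win on MSE.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0            # true population variance (illustrative)
n, trials = 10, 100_000

# Draw many samples and compute both variance estimators on each.
samples = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))
s2_unbiased = samples.var(axis=1, ddof=1)   # divide by n-1
s2_mle = samples.var(axis=1, ddof=0)        # divide by n

for name, est in [("unbiased (n-1)", s2_unbiased), ("MLE (n)", s2_mle)]:
    bias = est.mean() - sigma2
    var = est.var()
    mse = ((est - sigma2) ** 2).mean()
    print(f"{name}: bias={bias:+.3f}, var={var:.3f}, MSE={mse:.3f}")
```

For the normal distribution, the MLE is biased downward yet has the smaller MSE, a first hint that unbiasedness alone is not the whole story.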

Consistency

Introducing consistency, a concept concerning the convergence of estimators. We start from the convergence of non-random number sequences, move to convergence in probability, and then to the consistency of estimators and its properties.
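A quick simulation sketch of what consistency means in practice, using the sample mean of Bernoulli draws (the population mean mu and tolerance eps are illustrative): by the weak law of large numbers, the probability that the sample mean lands farther than eps from mu shrinks toward zero as n grows.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, eps = 0.7, 0.05   # true Bernoulli mean and tolerance (illustrative)

# Consistency of the sample mean: P(|Xbar_n - mu| > eps) -> 0 as n grows.
p_far = {}
for n in [10, 100, 1000, 10_000]:
    xbar = rng.binomial(1, mu, size=(5_000, n)).mean(axis=1)
    p_far[n] = (np.abs(xbar - mu) > eps).mean()
    print(f"n={n:>6}: P(|Xbar - mu| > {eps}) ~ {p_far[n]:.3f}")
```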

The Method of Moments

A fairly simple method of constructing estimators, though it is rarely used in modern practice.
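A minimal sketch of the method in action for a Gamma(k, theta) sample (the parameter values are illustrative): equate the first two sample moments to the population moments E[X] = k·theta and Var[X] = k·theta², then solve for the parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
k_true, theta_true = 3.0, 2.0   # illustrative Gamma(k, theta) parameters
x = rng.gamma(k_true, theta_true, size=50_000)

# Match the first two moments: E[X] = k*theta, Var[X] = k*theta^2.
xbar, s2 = x.mean(), x.var()
theta_hat = s2 / xbar           # theta = Var[X] / E[X]
k_hat = xbar / theta_hat        # k = E[X] / theta
print(f"k_hat={k_hat:.3f}, theta_hat={theta_hat:.3f}")
```

The appeal is that no likelihood needs to be written down; the drawback, as later sections show, is that the resulting estimators are generally less efficient than maximum likelihood.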

Maximum Likelihood Estimator

Under a parametric family of distributions, there is a much better way of constructing estimators: the maximum likelihood estimator.
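A small sketch for the exponential distribution with rate lambda (the true rate is illustrative): setting the derivative of the log-likelihood n·log(lambda) - lambda·sum(x) to zero gives the closed form lambda_hat = 1/xbar, which a grid search over the log-likelihood confirms.

```python
import numpy as np

rng = np.random.default_rng(2)
lam_true = 1.5                          # illustrative true rate
x = rng.exponential(1 / lam_true, size=20_000)

# Log-likelihood of Exponential(lam): n*log(lam) - lam*sum(x).
def log_lik(lam):
    return len(x) * np.log(lam) - lam * x.sum()

# Closed form: d/dlam log-likelihood = 0 gives lam_hat = 1 / xbar.
lam_hat = 1 / x.mean()

# Sanity check: the grid maximizer of the log-likelihood agrees.
grid = np.linspace(0.5, 3.0, 1000)
lam_grid = grid[np.argmax([log_lik(l) for l in grid])]
print(f"lam_hat={lam_hat:.3f}, grid max={lam_grid:.3f}")
```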

Sufficiency

Introducing sufficient statistics for parameter inference. The factorization theorem comes in handy!
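A small numerical illustration of the idea (the two samples are hypothetical): for Bernoulli(p), the likelihood p^S (1-p)^(n-S) depends on the data only through S = sum(x), so two samples with the same S yield identical likelihood functions in p.

```python
import numpy as np

# Two hypothetical Bernoulli samples with the same sum S = 4.
x1 = np.array([1, 1, 0, 0, 1, 0, 1, 0])
x2 = np.array([0, 0, 1, 1, 0, 1, 0, 1])

def likelihood(p, x):
    # Bernoulli likelihood: product of p^x_i * (1-p)^(1-x_i).
    return np.prod(p ** x * (1 - p) ** (1 - x))

ps = np.linspace(0.01, 0.99, 99)
L1 = np.array([likelihood(p, x1) for p in ps])
L2 = np.array([likelihood(p, x2) for p in ps])
print("identical likelihoods:", np.allclose(L1, L2))  # prints True
```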

Optimal Unbiased Estimator

Introducing the Minimum Variance Unbiased Estimator (MVUE) and the procedure for deriving it.
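A simulation sketch of why this matters, using the classic Uniform(0, theta) example (parameter values are illustrative): both 2·mean and (n+1)/n·max are unbiased for theta, but the one built on the sufficient statistic max(x) has far smaller variance.

```python
import numpy as np

rng = np.random.default_rng(5)
theta = 10.0            # illustrative Uniform(0, theta) parameter
n, trials = 20, 100_000
x = rng.uniform(0, theta, size=(trials, n))

# Two unbiased estimators of theta:
est_mom = 2 * x.mean(axis=1)               # method-of-moments estimator
est_mvue = (n + 1) / n * x.max(axis=1)     # built on the sufficient statistic max(x)

for name, est in [("2*mean", est_mom), ("(n+1)/n*max", est_mvue)]:
    print(f"{name}: mean={est.mean():.3f}, var={est.var():.4f}")
```

Both estimators are centered at theta, but the variance of the max-based one is smaller by roughly a factor of n, previewing why the search for the MVUE runs through sufficient statistics.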