The bias, variance, and mean squared error of an estimator. Efficiency is used to compare two estimators.
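For reference, with $\hat{\theta}$ an estimator of a parameter $\theta$ (notation assumed here, not fixed by this summary), these quantities are

$$
\operatorname{Bias}(\hat{\theta}) = \mathbb{E}[\hat{\theta}] - \theta, \qquad
\operatorname{MSE}(\hat{\theta}) = \mathbb{E}\big[(\hat{\theta} - \theta)^2\big] = \operatorname{Var}(\hat{\theta}) + \operatorname{Bias}(\hat{\theta})^2,
$$

and one common convention compares two unbiased estimators through the ratio of their variances (their relative efficiency).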
Introducing consistency, a concept about the convergence of estimators. We start from the convergence of non-random number sequences, move to convergence in probability, and then to the consistency of estimators and its properties.
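As a quick sketch of the definition (usual notation assumed), an estimator $\hat{\theta}_n$ built from a sample of size $n$ is consistent for $\theta$ if it converges to $\theta$ in probability:

$$
\hat{\theta}_n \xrightarrow{\;p\;} \theta
\quad\Longleftrightarrow\quad
\lim_{n \to \infty} P\big(|\hat{\theta}_n - \theta| > \varepsilon\big) = 0 \quad \text{for every } \varepsilon > 0.
$$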
A fairly simple method of constructing estimators that's not often used now.
For parametric families of distributions, there's a much better way of constructing estimators: the maximum likelihood estimator.
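For concreteness, the standard statement (notation assumed): given i.i.d. observations $x_1, \dots, x_n$ from a density $f(x \mid \theta)$, the maximum likelihood estimator maximizes the likelihood, or equivalently the log-likelihood:

$$
\hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta} \prod_{i=1}^{n} f(x_i \mid \theta)
= \arg\max_{\theta} \sum_{i=1}^{n} \log f(x_i \mid \theta).
$$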
Introducing sufficient statistics for parameter inference. The factorization theorem comes in handy!
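The factorization theorem, in its usual form (symbols assumed): a statistic $T(X)$ is sufficient for $\theta$ if and only if the joint density factors as

$$
f(x \mid \theta) = g\big(T(x), \theta\big)\, h(x),
$$

where $g$ depends on the data only through $T(x)$ and $h$ does not depend on $\theta$.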
Introducing the Minimum Variance Unbiased Estimator and the procedure for deriving it.
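The derivation procedure is presumably the usual Lehmann–Scheffé route (an assumption on my part, since the summary doesn't name it): if $T$ is a complete sufficient statistic and $g(T)$ is unbiased for $\tau(\theta)$, i.e.

$$
\mathbb{E}_{\theta}\big[g(T)\big] = \tau(\theta) \quad \text{for all } \theta,
$$

then $g(T)$ is the Minimum Variance Unbiased Estimator of $\tau(\theta)$.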