PH140 732 - Statistical Theory II

Description
Introduces modern statistical theory; establishes principles of inference based on decision theory and likelihood (evidence) theory; derives the likelihood function from design and model assumptions; derives the complete class theorem relating Bayes and admissible estimators; derives minimal sufficient statistics as a necessary and sufficient reduction of the data for accurate inference in parametric models; derives the minimal sufficient statistics in exponential families; introduces maximum likelihood and unbiased estimators; defines information and derives the Cramér-Rao variance bounds in parametric models; introduces empirical Bayes (shrinkage) estimators and compares them to maximum likelihood in small-sample problems.
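For orientation, two of the objects named in the description, the likelihood function and the Cramér-Rao variance bound, can be stated compactly. The notation below is a standard illustrative sketch, not taken from the course materials:

For an i.i.d. sample $x_1, \dots, x_n$ from a density $f(x;\theta)$,
$$L(\theta) = \prod_{i=1}^{n} f(x_i;\theta), \qquad \ell(\theta) = \log L(\theta),$$
and, under the usual regularity conditions, any unbiased estimator $\hat{\theta}$ satisfies the Cramér-Rao bound
$$\operatorname{Var}_\theta\!\left(\hat{\theta}\right) \ge \frac{1}{I(\theta)}, \qquad I(\theta) = \mathbb{E}_\theta\!\left[\left(\frac{\partial \ell(\theta)}{\partial \theta}\right)^{2}\right].$$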
Credits
4
Recent Professors
Recent Semesters
Fall 2019, Fall 2018
Offered
MW
Avg. Sections
1