STAT40320 Mathematical Statistics

Academic Year 2021/2022

An overview of mathematical statistics based on the likelihood principle will be presented and placed in the context of modern statistical problems. Both single-parameter and multiparameter methods will be examined. The introduction will cover maximum likelihood, the likelihood ratio and Fisher information. Large-sample results will then be derived heuristically: the distributions of the maximum likelihood estimator (MLE), the score statistic and the likelihood ratio statistic. Profile, marginal and conditional likelihood will be explored. The bias and variability of point estimates will be examined, including the method of moments, the bootstrap method and the MLE, together with a proof of the Cramér-Rao lower bound. Multiple testing and the Benjamini-Hochberg approach will be covered. Confidence intervals and coverage probability will be discussed.
Bayesian estimators/models will also be discussed briefly.
The topics will be illustrated throughout via numerical examples and their application to modern statistical problems; a sketch of one such numerical illustration is given below. Typewritten notes will be provided on Brightspace.
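For instance, a minimal sketch of this kind of numerical illustration (the simulated data, the exponential model and the hypothesised rate of 1 are assumptions chosen purely for illustration, not taken from the module notes) might look as follows:

# Illustrative sketch (not module code): MLE, observed Fisher information and
# the likelihood ratio statistic for an exponential(rate) sample, H0: rate = 1.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.exponential(scale=1 / 1.5, size=50)      # simulated data, true rate 1.5
n = len(x)

def loglik(rate):
    # log-likelihood of an exponential sample with the given rate
    return n * np.log(rate) - rate * x.sum()

rate_mle = n / x.sum()                  # closed-form MLE: 1 / sample mean
obs_info = n / rate_mle**2              # observed Fisher information at the MLE
se = 1 / np.sqrt(obs_info)              # large-sample standard error of the MLE

lr_stat = 2 * (loglik(rate_mle) - loglik(1.0))   # likelihood ratio statistic for H0: rate = 1
p_value = stats.chi2.sf(lr_stat, df=1)           # chi-squared(1) approximation

print(f"MLE {rate_mle:.3f}, s.e. {se:.3f}, LR {lr_stat:.3f}, p {p_value:.3f}")

Here the chi-squared(1) approximation to the likelihood ratio statistic is exactly the kind of large-sample result derived heuristically in the module.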


Curricular information is subject to change

Learning Outcomes:

On successful completion of this module students should be able to demonstrate knowledge of the basic parametric models used in statistics and of the methods of maximum likelihood and the likelihood ratio. They should be able to derive large-sample results and identify when they are appropriate. They should demonstrate knowledge of the concepts of profile, marginal and conditional likelihood and why they are necessary. They should be able to explain the delta method, the jackknife, the bootstrap and results relating to variability. They should be able to carry out multiple testing procedures, including that of Benjamini and Hochberg. They should be able to use the basic tools of classical statistics and demonstrate knowledge of how to use them in modern statistical problems. They should have a basic understanding of Bayesian models and their role in modern statistics.
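As an illustration of the kind of variability calculation intended here, the following sketch applies the nonparametric bootstrap to the standard error of a sample median; the gamma-distributed simulated data, the choice of the median as the estimator and the 2000 resamples are all assumptions made for illustration only.

# Illustrative sketch (not module code): nonparametric bootstrap standard error.
import numpy as np

rng = np.random.default_rng(2)
x = rng.gamma(shape=2.0, scale=1.0, size=40)     # simulated skewed data

def bootstrap_se(data, estimator, n_boot=2000):
    # standard error of an estimator via resampling with replacement
    boot = [estimator(rng.choice(data, size=len(data), replace=True))
            for _ in range(n_boot)]
    return np.std(boot, ddof=1)

print(f"sample median {np.median(x):.3f}, bootstrap s.e. {bootstrap_se(x, np.median):.3f}")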

Indicative Module Content:

Maximum likelihood; invariance; score statistic; Fisher information; Cramér-Rao lower bound; large-sample results: central limit theorem, consistency and asymptotic normality of the MLE; likelihood ratio statistics and their asymptotic distribution; delta method, jackknife, bootstrap, method of moments, Bayesian estimation; conditional, profile and marginal likelihoods; Fisher's exact test; confidence intervals and coverage probability; multiple testing: Bonferroni, Holm, Benjamini-Hochberg.
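A sketch of the Benjamini-Hochberg step-up procedure listed above, with made-up p-values and a false discovery rate level of 0.05 chosen purely for illustration:

# Illustrative sketch (not module code): Benjamini-Hochberg at FDR level alpha.
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    # returns a boolean array: True where the corresponding hypothesis is rejected
    p = np.asarray(pvals)
    m = len(p)
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, m + 1) / m     # k/m * alpha for k = 1..m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])             # largest index meeting its threshold
        reject[order[: k + 1]] = True                # reject the k+1 smallest p-values
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.38]
print(benjamini_hochberg(pvals, alpha=0.05))         # rejects the two smallest p-values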

Student Effort Hours:

Student Effort Type            Hours
Lectures                          24
Tutorial                          11
Specified Learning Activities     48
Autonomous Student Learning       90
Total                            173

Approaches to Teaching and Learning:
Active/task-based and problem-based learning via assignments; lectures; reflective learning based on the lectures. 
Requirements, Exclusions and Recommendations
Learning Requirements:

A knowledge of probability and statistical inference to the level of the Probability Theory (STAT20110) and Inferential Statistics (STAT20100) courses is required. A knowledge of linear models to the level of STAT30240 is required. A good knowledge of calculus and linear algebra is required.


Module Requisites and Incompatibles
Not applicable to this module.
 
Assessment Strategy

Examination: 2 hour exam
  Timing: 2 hour End of Trimester Exam
  Open Book Exam: No
  Component Scale: Standard conversion grade scale 40%
  Must Pass Component: No
  % of Final Grade: 70

Assignment: Approximately 8 assignments
  Timing: Varies over the Trimester
  Open Book Exam: n/a
  Component Scale: Standard conversion grade scale 40%
  Must Pass Component: No
  % of Final Grade: 30


Carry forward of passed components: No

Resit In Terminal Exam: Summer, Yes - 2 Hour
Please see Student Jargon Buster for more information about remediation types and timing. 
Feedback Strategy/Strategies

• Feedback individually to students, post-assessment

How will my Feedback be Delivered?

Each assignment will be corrected, graded and returned to the student. Solutions to the assignments will then be covered in the tutorials.

Recommended Reading:

Azzalini, A. (1996) Statistical Inference Based on the Likelihood.
Casella, G. and Berger, R. (2001) Statistical Inference.
Cox, D.R. and Hinkley, D. (1974) Theoretical Statistics.
Edwards, A.W.F. (1992) Likelihood.
Lindsey, J.K. (1996) Parametric Statistical Inference.
Pawitan, Y. (2001) In All Likelihood.
Rice, J. (1995) Mathematical Statistics and Data Analysis.