Date: 13 August 2024, Tuesday
Location: S16-06-118, Seminar Room
Time: 2pm, Singapore
Maximum likelihood estimation is widely used for statistical inference. This paper aims to extend the classical likelihood theory to embrace a broader class of statistical models with additional random unknowns. We reformulate Lee and Nelder’s (1996) h-likelihood so that the maximum h-likelihood estimator resembles the maximum likelihood estimator of classical likelihood theory. Maximizing the new h-likelihood yields asymptotically optimal estimators for both fixed and random parameters, achieving the generalized Cramér-Rao lower bound. Furthermore, the Hessian matrix of the h-likelihood gives consistent variance estimators. We further investigate small-sample properties, where maximum h-likelihood estimators and their variance estimators may not be consistent, leading to difficulties in likelihood inference. We propose the h-confidence, which yields valid interval estimators that exactly maintain the coverage probability for both fixed and random unknowns in small samples. We also study various approximation methods for the h-likelihood and h-confidence when they are not explicitly available. We show that classical likelihood and confidence theories can thus be extended to a broad class of models with additional random unknowns.
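To make the idea of joint maximization over fixed and random unknowns concrete, the following is a minimal sketch, not taken from the paper: it assumes a balanced one-way random-effects model y_ij = mu + u_i + e_ij with known variances, where the h-likelihood (in Lee and Nelder's original sense, the joint log-density of the data and the random effects) has a closed-form joint maximizer. All variable names and the specific model are illustrative assumptions.

```python
import math

# Illustrative toy model (an assumption, not the talk's general setting):
#   y_ij = mu + u_i + e_ij,  u_i ~ N(0, lam),  e_ij ~ N(0, phi),
# with variances phi and lam treated as known, and a balanced design
# (every group i has the same number of observations n).
# The h-likelihood is the joint log-density of (y, u):
#   h(mu, u) = sum_ij log N(y_ij; mu + u_i, phi) + sum_i log N(u_i; 0, lam)

def h_lik(y, mu, u, phi, lam):
    """Evaluate the h-likelihood at (mu, u) for grouped data y."""
    h = 0.0
    for i, row in enumerate(y):
        for y_ij in row:
            h += -0.5 * math.log(2 * math.pi * phi) \
                 - (y_ij - mu - u[i]) ** 2 / (2 * phi)
        h += -0.5 * math.log(2 * math.pi * lam) - u[i] ** 2 / (2 * lam)
    return h

def joint_mhle(y, phi, lam):
    """Joint maximizer of h over (mu, u) for a balanced design.

    Setting the partial derivatives of h to zero gives
      mu_hat = grand mean of the group means, and
      u_hat_i = shrink * (group mean_i - grand mean),
    where shrink = lam * n / (phi + lam * n) pulls each random
    effect toward 0 (its prior mean).
    """
    n = len(y[0])
    group_means = [sum(row) / n for row in y]
    grand = sum(group_means) / len(y)
    shrink = lam * n / (phi + lam * n)
    u_hat = [shrink * (m - grand) for m in group_means]
    return grand, u_hat

# Usage on a tiny illustrative dataset: two groups, two observations each.
y = [[1.0, 2.0], [3.0, 5.0]]
mu_hat, u_hat = joint_mhle(y, phi=1.0, lam=2.0)
```

The shrinkage factor is the familiar best-linear-unbiased-predictor weight for the random effects; the point of the sketch is only that a single criterion, the h-likelihood, is maximized simultaneously in the fixed parameter and the random unknowns.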