Ioannis Kosmidis (University of Warwick)

will speak on

Improved estimation in partially specified models

Time: 3:00PM
Date: Thu 23rd January 2020
Location: Seminar Room SCN 1.25

Abstract: Many popular methods that reduce estimation bias rely on an approximation of the estimator's bias under the assumption that the model is correct and fully specified. Other bias reduction methods, such as the bootstrap, the jackknife and indirect inference, require fewer assumptions to operate but are typically computationally intensive, requiring repeated optimizations.
We present current research on a new framework for reducing estimation bias that:
i) can deliver estimators with smaller bias than reference estimators even for partially specified models, as long as estimation is through unbiased estimating functions;
ii) always results in closed-form bias-reducing penalties to the objective function if estimation is through the maximisation of one, such as maximum likelihood and maximum composite likelihood;
iii) relies only on the estimating functions and/or the objective and their derivatives, greatly facilitating implementation for general modelling frameworks through numerical or automatic differentiation techniques and standard numerical optimisation routines.
The bias-reducing penalized objectives are found to relate closely to established information criteria for model selection based on the Kullback-Leibler divergence, establishing, for the first time, a strong link between reduction of estimation bias and model selection. We also discuss the asymptotic efficiency properties of the new estimators, inference and model selection, and illustrate the new bias reduction method in widely used, important modelling settings of varying complexity, including quasi-likelihoods and Gaussian max-stable processes.
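To give a flavour of bias reduction through penalized objectives, the sketch below shows a classical special case (Firth-style penalization with a Jeffreys-type penalty, applied to the rate of an exponential distribution), not the new partially specified framework of the talk. Here the penalty has a closed form built from the Fisher information, and the penalized maximum can be found with a standard numerical optimisation routine; the data and true rate are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustration only: Firth-type bias reduction for the rate lambda of an
# exponential distribution, via a penalized log-likelihood. This is a
# well-known classical case, used here purely to show the idea of adding
# a closed-form bias-reducing penalty to the objective function.
rng = np.random.default_rng(42)
x = rng.exponential(scale=1 / 2.0, size=20)  # simulated data, true rate 2.0
n, s = len(x), x.sum()

def neg_penalized_loglik(lam):
    # Log-likelihood: n*log(lam) - lam*sum(x)
    # Jeffreys-type penalty: 0.5*log(Fisher information) = 0.5*log(n/lam^2)
    return -(n * np.log(lam) - lam * s + 0.5 * np.log(n / lam**2))

res = minimize_scalar(neg_penalized_loglik, bounds=(1e-8, 100.0),
                      method="bounded")

mle = n / s   # ordinary maximum likelihood estimate, biased upwards
br = res.x    # bias-reduced estimate from the penalized objective
print(mle, br)
```

In this toy example the penalized maximum works out analytically to (n - 1) / sum(x), which is exactly unbiased for the exponential rate, so the numerical optimiser can be checked against that closed form. The talk's framework generalises this idea to partially specified models and general unbiased estimating functions.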

(This talk is part of the Statistics and Actuarial Science series.)
