02.10.2025 (Thursday)

Andrew Yiu (Southampton)
02 Oct at 14:00 - 15:00
Strand - S3.32

Suppose we wish to estimate a finite-dimensional parameter but we don't want to restrict ourselves to a finite-dimensional model. This is called semiparametric inference. An exciting aspect of this paradigm is that we might be able to leverage state-of-the-art machine learning algorithms to estimate our high-dimensional nuisance parameters and still obtain statistical guarantees (e.g. a 95% confidence interval). This approach has been especially popular in the field of causal inference in recent years. To attain these nice inferential properties, however, we will generally need to carefully tailor our inference to the target estimand. This can be problematic for nonparametric Bayesian inference, which focuses on good performance for the whole data-generating distribution, possibly at the expense of low-dimensional parameters of interest. To remedy this, we introduce a simple, computationally efficient procedure that corrects the marginal posterior of our target estimand, yielding a debiased and calibrated one-step posterior.
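The one-step correction referenced at the end of the abstract follows a standard pattern in semiparametric statistics, sketched here in generic notation (illustrative only; the notation \(\psi\), \(\hat\eta\), \(\varphi\) is an assumption, not taken from the talk):

```latex
% Classical one-step (debiased) estimator: an initial plug-in estimate
% \psi(\hat\eta) of the target parameter is corrected by the sample
% average of the efficient influence function \varphi evaluated at the
% estimated nuisance \hat\eta:
\tilde\psi \;=\; \psi(\hat\eta) \;+\; \frac{1}{n}\sum_{i=1}^{n} \varphi(Z_i;\hat\eta).
```

In the Bayesian version described in the abstract, the analogous idea is (roughly) to apply such a debiasing shift to the marginal posterior of the target estimand rather than to a point estimate, so that the corrected posterior is centred at an efficient estimate and has calibrated spread.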

Posted by yu.luo@kcl.ac.uk