Hamiltonian Monte Carlo In Bayesian Empirical Likelihood Computation

Assoc. Professor Sanjay Chaudhuri, National University of Singapore

Date: 11 October 2017, Wednesday

Location: S16-06-118, DSAP Seminar Room

Time: 3:00pm - 4:00pm


We consider Bayesian empirical likelihood estimation and develop an efficient Hamiltonian Monte Carlo method for sampling from the posterior distribution of the parameters of interest. The proposed method uses hitherto unknown properties of the gradient of the underlying log-empirical likelihood function.
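
For readers unfamiliar with the setup, the standard formulation (sketched here in generic notation that may differ from the talk's) defines the empirical likelihood at a parameter value theta through a constrained convex optimisation over multinomial weights on the observations X_1, ..., X_n, with estimating functions g satisfying the unbiasedness condition E[g(X_i, \theta)] = 0:

L_{EL}(\theta) = \max\Big\{ \prod_{i=1}^{n} w_i \;:\; w_i \ge 0,\ \sum_{i=1}^{n} w_i = 1,\ \sum_{i=1}^{n} w_i\, g(X_i, \theta) = 0 \Big\}.

The Bayesian empirical likelihood posterior is then proportional to \pi(\theta)\, L_{EL}(\theta) for a prior density \pi(\theta), and it is the gradient of \log L_{EL}(\theta) that the proposed Hamiltonian Monte Carlo scheme exploits.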

These properties are shown to hold under minimal assumptions on the parameter space, the prior density, and the functions used in the estimating equations that determine the empirical likelihood. We overcome the major challenges posed by the complex, non-convex support boundaries routinely observed for empirical likelihood, which prevent efficient implementation of traditional Markov chain Monte Carlo methods such as random walk Metropolis-Hastings, with or without parallel tempering. Our method employs a finite number of estimating equations and observations, yet produces valid semi-parametric inference for a large class of statistical models, including mixed effects models, generalised linear models, and hierarchical Bayes models. A simulation study confirms that the proposed method converges quickly and draws samples from the posterior support efficiently. We further illustrate its utility through an analysis of a discrete data set in small area estimation.
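
To fix ideas, the following is a minimal, generic Hamiltonian Monte Carlo sketch in Python. It only illustrates how gradient information of a log-posterior drives the leapfrog proposals; it is not the speaker's algorithm (which, in particular, must cope with the non-convex support boundary of the empirical likelihood posterior), and all function names, tuning values, and the toy target below are hypothetical.

# Generic HMC sketch (illustrative only; not the method presented in the talk).
import numpy as np

def hmc_step(theta, log_post, grad_log_post, step_size=0.05, n_leapfrog=20, rng=None):
    """One HMC transition targeting exp(log_post) via leapfrog integration."""
    rng = np.random.default_rng() if rng is None else rng
    p0 = rng.standard_normal(theta.shape)            # auxiliary Gaussian momentum
    theta_new, p = theta.copy(), p0.copy()

    # Leapfrog integration of the Hamiltonian dynamics.
    p = p + 0.5 * step_size * grad_log_post(theta_new)
    for _ in range(n_leapfrog - 1):
        theta_new = theta_new + step_size * p
        p = p + step_size * grad_log_post(theta_new)
    theta_new = theta_new + step_size * p
    p = p + 0.5 * step_size * grad_log_post(theta_new)

    # Metropolis correction based on the change in total energy.
    current_H = -log_post(theta) + 0.5 * p0 @ p0
    proposed_H = -log_post(theta_new) + 0.5 * p @ p
    if np.log(rng.uniform()) < current_H - proposed_H:
        return theta_new
    return theta

# Usage: sample a standard bivariate normal as a stand-in for a posterior.
log_post = lambda th: -0.5 * th @ th
grad_log_post = lambda th: -th
theta = np.zeros(2)
draws = []
for _ in range(1000):
    theta = hmc_step(theta, log_post, grad_log_post)
    draws.append(theta)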

Keywords: Constrained convex optimisation; Empirical likelihood; Generalised linear models; Hamiltonian Monte Carlo; Mixed effect models; Score equations; Small area estimation; Unbiased estimating equations.

This is joint work with Debashis Mondal (Oregon State University) and Yin Teng (E&Y, Singapore).