Inference in Differentiable Generative Models

Mr Matt Graham, Department of Statistics and Applied Probability

Date: 08 November 2017, Wednesday

Location: S16-06-118, DSAP Seminar Room

Time: 03:00pm - 04:00pm

Standard approximate inference methods such as Markov chain Monte Carlo (MCMC) and variational inference require access to a density function for the probability distribution of interest. Many interesting probabilistic models, however, are specified by a generative process, with the density function on the model variables defined only implicitly. The approximate Bayesian computation (ABC) framework allows inference in this class of models by relaxing the constraint that model variables exactly match observed data values, requiring instead only that simulated values lie within some distance of the observations. Although ABC methods have been successfully applied to complex problems in many fields, they tend to scale poorly with the dimensionality of the observations, so they usually require a further approximation: conditioning only on reduced-dimensionality summary statistics of the observed data.
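
As a rough illustration of the ABC idea described above, the following is a minimal sketch of ABC rejection sampling for a toy Gaussian model. The simulator, the prior, the sample-mean summary statistic, and the tolerance `epsilon` are all illustrative assumptions for this sketch, not the models or settings considered in the talk.

```python
import jax
import jax.numpy as jnp

def simulator(theta, key, n=50):
    # Toy generative process: Gaussian with unknown mean theta, unit variance.
    return theta + jax.random.normal(key, (n,))

def abc_rejection(x_obs, n_samples, epsilon, key):
    # Prior: theta ~ N(0, 10^2). Accept a draw when the simulated summary
    # statistic (here the sample mean) is within epsilon of the observed one.
    s_obs = x_obs.mean()
    accepted = []
    while len(accepted) < n_samples:
        key, key_prior, key_sim = jax.random.split(key, 3)
        theta = 10.0 * jax.random.normal(key_prior)
        s_sim = simulator(theta, key_sim).mean()
        if jnp.abs(s_sim - s_obs) < epsilon:
            accepted.append(theta)
    return jnp.stack(accepted)

key = jax.random.PRNGKey(0)
key, key_obs = jax.random.split(key)
x_obs = simulator(2.0, key_obs)
posterior_draws = abc_rejection(x_obs, 200, 0.1, key)
```

Note how the acceptance step conditions on a one-dimensional summary rather than the full 50-dimensional observation: with a distance computed on the raw data, the acceptance rate would collapse, which is the scaling problem the abstract refers to.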

We show how viewing a generative model as a deterministic mapping from random inputs (draws from a random number generator) to simulated outputs allows efficient MCMC methods such as slice sampling and Hamiltonian Monte Carlo (HMC) to be applied to inference in implicit generative models. In some cases this makes ABC inference tractable when conditioning on the full set of observed data, in settings where standard ABC methods yield useful results only after reducing to lower-dimensional summaries. We also show that, for a restricted class of differentiable generative models, a constrained variant of HMC makes it possible to perform inference conditioned on the model variables being arbitrarily close to the observed values while maintaining computational tractability.
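
To make the reparameterisation concrete, here is a minimal sketch of HMC run in the space of random inputs of a toy differentiable generator, with a Gaussian ABC kernel standing in for the exact-match constraint. The generator, kernel, step size, and trajectory length are illustrative assumptions, and the constrained HMC variant mentioned above is not shown; the point is only that automatic differentiation through the generator yields the gradients HMC needs.

```python
import jax
import jax.numpy as jnp

def generator(u):
    # Deterministic differentiable map from random inputs u to simulated data:
    # u[0] generates the parameter (theta = 10 * u[0]), u[1:] the noise.
    theta = 10.0 * u[0]
    return theta + u[1:]

def log_target(u, x_obs, epsilon):
    # Standard-normal density on the inputs plus a Gaussian ABC kernel that
    # softly penalises distance between simulated and observed data.
    log_input_density = -0.5 * jnp.sum(u ** 2)
    log_kernel = -0.5 * jnp.sum((generator(u) - x_obs) ** 2) / epsilon ** 2
    return log_input_density + log_kernel

def hmc_step(u, key, x_obs, epsilon, step_size=0.01, n_leapfrog=20):
    # One HMC transition in the input space; the leapfrog dynamics are driven
    # by gradients obtained by differentiating through the generator.
    logp = lambda v: log_target(v, x_obs, epsilon)
    grad_logp = jax.grad(logp)
    key_mom, key_acc = jax.random.split(key)
    p0 = jax.random.normal(key_mom, u.shape)
    u_new, p = u, p0 + 0.5 * step_size * grad_logp(u)
    for _ in range(n_leapfrog - 1):
        u_new = u_new + step_size * p
        p = p + step_size * grad_logp(u_new)
    u_new = u_new + step_size * p
    p = p + 0.5 * step_size * grad_logp(u_new)
    # Metropolis correction for the leapfrog discretisation error.
    log_accept = (logp(u_new) - logp(u)
                  - 0.5 * jnp.sum(p ** 2) + 0.5 * jnp.sum(p0 ** 2))
    accept = jnp.log(jax.random.uniform(key_acc)) < log_accept
    return jnp.where(accept, u_new, u)

# Illustrative usage with placeholder observed data.
key = jax.random.PRNGKey(1)
x_obs = jnp.full(50, 2.0)
u = jax.random.normal(key, (51,))  # 1 parameter input + 50 noise inputs
for _ in range(100):
    key, key_step = jax.random.split(key)
    u = hmc_step(u, key_step, x_obs, epsilon=0.1)
```

Unlike the rejection sketch above, this chain conditions on all 50 observed values at once: the ABC kernel enters the target density directly, so no simulated dataset is ever discarded.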

Joint work with Amos Storkey.