
P-SGLD: Stochastic Gradient Langevin Dynamics with control variates

by Andrea Bruzzone




Institution: Linköping University
Department:
Year: 2017
Keywords: Big Data; Bayesian Inference; MCMC; SGLD; Estimated Gradient; Logistic Regression; Probability Theory and Statistics; Sannolikhetsteori och statistik
Posted: 02/01/2018
Record ID: 2168801
Full text PDF: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-140121


Abstract

Year after year, the amount of data that we continuously generate keeps increasing. When this trend began, the main challenge was to find a way to store the huge quantity of information. Nowadays, with the increasing availability of storage facilities, this problem is solved, but it leaves us with a new issue to deal with: finding tools that allow us to learn from these large data sets. In this thesis, a framework for Bayesian learning that scales to large data sets is studied. We present the Stochastic Gradient Langevin Dynamics (SGLD) framework and show that in some cases its approximation of the posterior distribution is quite poor. One possible reason is that SGLD estimates the gradient of the log-likelihood with high variability due to naive subsampling. Our approach combines accurate proxies for the gradient of the log-likelihood with SGLD. We show that it produces better results, in terms of convergence to the correct posterior distribution, than standard SGLD, since accurate proxies dramatically reduce the variance of the gradient estimator. Moreover, we demonstrate that this approach is more efficient than the standard Markov Chain Monte Carlo (MCMC) method and that it outperforms other variance reduction techniques proposed in the literature, such as the SAGA-LD algorithm. SAGA-LD also uses control variates to improve SGLD, which makes the comparison with our approach straightforward. We apply the method to the Logistic Regression model.
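
The abstract does not give the update equations, but a minimal sketch of SGLD with a control-variate (proxy) gradient estimator for Logistic Regression may help fix ideas. It assumes the common construction in which the proxy is the per-observation gradient evaluated at an approximate posterior mode; all names (sgld_cv, theta_hat, etc.) are illustrative and not taken from the thesis.

```python
# Illustrative sketch only: SGLD whose minibatch gradient is corrected by
# control variates centred at an approximate posterior mode theta_hat.
# This is an assumed formulation, not the thesis's own code.
import numpy as np

def log_lik_grads(theta, X, y):
    """Per-observation gradients of the Bernoulli log-likelihood
    for logistic regression; returns an (n_obs, dim) array."""
    p = 1.0 / (1.0 + np.exp(-X @ theta))
    return (y - p)[:, None] * X

def sgld_cv(X, y, theta_hat, n_iter=5000, batch=50, eps=1e-4, prior_var=10.0):
    N, d = X.shape
    # Full-data log-likelihood gradient at the mode, computed once (the control variate).
    g_hat_full = log_lik_grads(theta_hat, X, y).sum(axis=0)
    theta = theta_hat.copy()
    samples = []
    for _ in range(n_iter):
        idx = np.random.choice(N, batch, replace=False)
        g_batch = log_lik_grads(theta, X[idx], y[idx]).sum(axis=0)
        g_batch_hat = log_lik_grads(theta_hat, X[idx], y[idx]).sum(axis=0)
        # Control-variate estimator of the full log-likelihood gradient:
        # the minibatch only has to estimate the (small) difference term,
        # which is what reduces the variance relative to plain SGLD.
        g_lik = g_hat_full + (N / batch) * (g_batch - g_batch_hat)
        g_prior = -theta / prior_var  # Gaussian prior N(0, prior_var * I)
        # Langevin update: gradient step plus Gaussian noise with variance eps.
        theta = theta + 0.5 * eps * (g_prior + g_lik) \
                + np.sqrt(eps) * np.random.randn(d)
        samples.append(theta.copy())
    return np.array(samples)
```

In this sketch theta_hat would come from a preliminary optimisation run (e.g. a MAP estimate); plain SGLD is recovered by dropping g_hat_full and g_batch_hat and rescaling g_batch alone, which is exactly the higher-variance estimator the abstract criticises.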