
BBVI

A collection of Black Box Variational Inference algorithms implemented in an object-oriented Python framework using Autograd.

bbvi.py provides a class BaseBBVIModel which is a base class for a general Bayesian inference problem to be solved by variational methods. Specifically, if we have a posterior p(z|x), BaseBBVIModel provides the framework and machinery to approximate p(z|x) by a distribution q(z|lambda), where lambda is the variational parameter (or parameters) that determines the variational family. This is based on the excellent examples found at https://github.com/HIPS/autograd/blob/master/examples/.
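The quantity being maximized is the ELBO, E_q[log p(x,z) - log q(z|lambda)]. The following is an illustrative NumPy sketch of its Monte Carlo estimate (not this repo's code): the target is a standard normal and q(z|mu) = N(mu, 1), so the exact ELBO is -mu^2/2 up to a constant.

```python
import numpy as np

def log_p(z):
    return -0.5 * z ** 2          # unnormalized log-posterior: N(0, 1)

def log_q(z, mu):
    return -0.5 * (z - mu) ** 2   # unnormalized log-density of N(mu, 1)

def elbo_estimate(mu, n_samples, rng):
    # Monte Carlo ELBO: average log p(z) - log q(z | mu) over samples from q.
    z = mu + rng.standard_normal(n_samples)
    return np.mean(log_p(z) - log_q(z, mu))

rng = np.random.default_rng(0)
# The estimate is largest near mu = 0, where q matches the target exactly.
```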

How to use

  1. Derive a model class from BaseBBVIModel which requires that you implement the log-posterior, the log-variational approximation, a sampler of the variational approximation, and a parameter-handler.
  2. Choose an ELBO gradient estimator (more on this below).
  3. Call the method run_VI with the appropriate initial parameters.
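Step 1 might look like the sketch below. The method names (log_post, log_q, sample_q, unpack_params) are illustrative assumptions for a toy Gaussian target; the actual required signatures are defined in bbvi.py, and the real base class supplies the optimization loop (run_VI).

```python
import numpy as np

# Hypothetical stand-in for a BaseBBVIModel subclass: only the four
# pieces the README asks for are shown, with assumed method names.
class GaussianToyModel:
    def log_post(self, z):
        # Log-posterior (up to a constant): standard normal target.
        return -0.5 * np.sum(z ** 2, axis=-1)

    def unpack_params(self, params):
        # Parameter handler: split the flat vector into (mean, log_std).
        dim = params.size // 2
        return params[:dim], params[dim:]

    def log_q(self, z, params):
        # Log-density of the diagonal-Gaussian variational approximation.
        mu, log_std = self.unpack_params(params)
        return -0.5 * np.sum(((z - mu) / np.exp(log_std)) ** 2
                             + 2 * log_std + np.log(2 * np.pi), axis=-1)

    def sample_q(self, params, n_samples, rng):
        # Sampler of the variational approximation q(z | params).
        mu, log_std = self.unpack_params(params)
        eps = rng.standard_normal((n_samples, mu.size))
        return mu + np.exp(log_std) * eps
```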

Available ELBO Estimators

This framework is designed to facilitate rapid experimentation with different BBVI methods. We have provided three ELBO gradient estimators, but it is simple to add others. The estimators broadly fall into two categories: "stochastic search" and "reparameterization":

Stochastic Search

These follow the original approach of Ranganath et al. (https://arxiv.org/abs/1401.0118) and use the score function (the gradient of the log of the variational distribution) to estimate the ELBO gradient.
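A minimal NumPy sketch of the score-function estimator (the repo's versions use Autograd instead of the hand-written score below). With q(z|mu) = N(mu, 1), the score is d/dmu log q(z|mu) = z - mu, and the target is taken to be N(2, 1), so the exact ELBO gradient is 2 - mu.

```python
import numpy as np

def score_grad(mu, n_samples, rng):
    z = mu + rng.standard_normal(n_samples)   # z ~ q(z | mu)
    log_p = -0.5 * (z - 2.0) ** 2             # unnormalized target: N(2, 1)
    log_q = -0.5 * (z - mu) ** 2
    score = z - mu                            # d/dmu log q(z | mu)
    # grad ELBO ~= (1/S) * sum_s score(z_s) * (log p(z_s) - log q(z_s))
    return np.mean(score * (log_p - log_q))
```

The estimate is unbiased but noisy, which is why control variates are commonly paired with this estimator.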

Reparameterization

These follow the approach taken in Kingma et al. (https://arxiv.org/abs/1506.02557) by reparameterizing the samples as a deterministic transform of noise and differentiating through the transform via backprop.
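A NumPy sketch of the same toy problem with the reparameterization trick (the repo uses Autograd's backprop rather than the hand-written derivative below). Writing z = mu + eps with eps ~ N(0, 1) moves mu inside the sample, so the gradient flows through z; since the entropy of N(mu, 1) does not depend on mu, only the pathwise term remains.

```python
import numpy as np

def reparam_grad(mu, n_samples, rng):
    eps = rng.standard_normal(n_samples)
    z = mu + eps                   # reparameterized sample, dz/dmu = 1
    dlogp_dz = -(z - 2.0)          # gradient of log N(2, 1) target wrt z
    # grad ELBO ~= (1/S) * sum_s dlogp/dz(z_s) * dz/dmu
    return np.mean(dlogp_dz)
```

On this toy problem the estimate concentrates around the exact gradient 2 - mu with far fewer samples than the score-function version, which is the usual argument for reparameterization when it is available.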

We should also note that various enhancements to these estimators (Rao-Blackwellization, control variates) were considered but omitted, because neither is easily applied to a fully black-box model. The estimators could be specialized to include these enhancements on a case-by-case basis.

Examples

Three examples are provided: BBVI_test1.py and BBVI_test2.py use a multivariate Gaussian to approximate a highly non-normal distribution, while BBVI_test3.py shows how to do inference on the weights of an MLP. It is worth noting that the variational approximation and much of the code are essentially the same in each example; only the details of the specific problem need to be adjusted.

Contributors

jamesvuc
