

viabel: Variational Inference Approximation Bounds that are Efficient and Lightweight

Description

This package computes bounds on the errors of the mean, standard deviation, and variance estimates produced by a continuous approximation to an (unnormalized) distribution. A canonical application is a variational approximation to a Bayesian posterior distribution. In particular, using samples from the approximation Q and evaluations of the (possibly unnormalized) log densities of Q and the target distribution P, the package provides functionality to compute bounds on:

  • the α-divergence between P and Q
  • the p-Wasserstein distance between P and Q
  • the differences between the means, standard deviations, and variances of P and Q
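The key point is that each of these quantities can be bounded using only draws from Q and pointwise log-density evaluations. As a minimal sketch of that idea (this is not viabel's API; the densities and names here are purely illustrative), the Rényi α-divergence D_α(P ∥ Q) = (α − 1)⁻¹ log E_Q[(p/q)^α] can be estimated by simple Monte Carlo over Q-samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: target P = N(0, 1) and an overdispersed approximation
# Q = N(0.2, 1.5^2), both with tractable log densities.
def log_p(x):
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_q(x):
    return -0.5 * ((x - 0.2) / 1.5)**2 - np.log(1.5) - 0.5 * np.log(2 * np.pi)

# Monte Carlo estimate of D_alpha(P || Q) = log E_Q[(p/q)^alpha] / (alpha - 1)
# using only samples from Q and the two log densities, as described above.
alpha = 2.0
x = rng.normal(0.2, 1.5, size=200_000)      # draws from Q
log_w = log_p(x) - log_q(x)                 # log importance ratios
d_alpha = np.log(np.mean(np.exp(alpha * log_w))) / (alpha - 1)
print(f"estimated D_2(P || Q) = {d_alpha:.3f}")   # analytic value is about 0.196
```

Note that Q here has heavier spread than P: for α = 2 the expectation E_Q[(p/q)²] is only finite when Q is sufficiently overdispersed relative to P, which is one reason heavier-tailed variational families are useful.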

There is also optional variational Bayes functionality (viabel.vb), which supports both standard KL-based variational inference (KLVI) and chi-squared variational inference (CHIVI). Models are provided as autograd-compatible log densities or can be constructed from pystan fit objects. The variational objective is optimized using a windowed version of AdaGrad and unbiased reparameterization gradients. By default there is support for mean-field Gaussian, mean-field Student's t, and full-rank Student's t variational families.
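To make the optimization scheme concrete, here is a self-contained sketch of KLVI with reparameterization gradients and AdaGrad-style steps for a mean-field Gaussian family. It is not viabel.vb's actual interface (which works with autograd log densities and a windowed AdaGrad variant); the target and all names are illustrative, with gradients written out by hand:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: an independent Gaussian "posterior" with known moments,
# so the fitted approximation can be checked against the truth.
m = np.array([1.0, -2.0])          # target means
s = np.array([0.5, 2.0])           # target standard deviations

def grad_log_p(z):
    return -(z - m) / s**2

mu, log_sigma = np.zeros(2), np.zeros(2)     # mean-field Gaussian parameters
acc_mu, acc_ls = np.zeros(2), np.zeros(2)    # squared-gradient accumulators
lr, stab = 0.2, 1e-8

for t in range(3000):
    sigma = np.exp(log_sigma)
    eps = rng.normal(size=(32, 2))
    z = mu + sigma * eps                     # reparameterized draws from Q
    g = grad_log_p(z)
    grad_mu = g.mean(axis=0)                 # unbiased ELBO gradient wrt mu
    grad_ls = (g * sigma * eps).mean(axis=0) + 1.0   # + entropy term d/dlog(sigma)
    acc_mu += grad_mu**2                     # AdaGrad: full-history accumulation
    acc_ls += grad_ls**2                     # (viabel uses a windowed variant)
    mu += lr * grad_mu / (np.sqrt(acc_mu) + stab)
    log_sigma += lr * grad_ls / (np.sqrt(acc_ls) + stab)

print(mu, np.exp(log_sigma))                 # should land near m and s
```

Writing z = μ + σε with ε ~ N(0, I) is what makes the gradient estimates unbiased: the expectation over ε no longer depends on the variational parameters, so differentiation can be pushed inside it.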

If you use this package, please cite:

Practical posterior error bounds from variational objectives. Jonathan H. Huggins, Mikołaj Kasprzak, Trevor Campbell, Tamara Broderick. In Proc. of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS), Palermo, Italy. PMLR: Volume 108, 2020.

Compilation and testing

After cloning the repository, installation and testing are easy. If you just want to compute bounds, you can install using the command

pip install .

The only dependency is numpy. If you want to use the basic variational Bayes functionality, use the command

pip install .[vb]

This will install some additional dependencies. If in addition to the above, you want to run all of the example notebooks, use the command

pip install .[examples]

This will install even more dependencies.

To test the package:

nosetests tests/

Currently there is only coverage for viabel.bounds.

Usage Examples

The normal mixture notebook provides basic usage examples of the bounds.

The robust regression example demonstrates how to use the variational Bayes functionality and then compute bounds.

Running Comparison Experiments

The notebooks/experiments.py module contains additional functionality for running experiments and computing PSIS-corrected posterior estimates. The robust regression example uses some of this functionality. A simple funnel distribution example demonstrates how to use the high-level run_experiment function. The eight schools example is more involved and realistic.

