
Kernel Stein Discrepancy descent

Project description


Sampling by optimization of the Kernel Stein Discrepancy

The paper is available at arxiv.org/abs/2105.09994.

The code uses PyTorch, and a NumPy backend is available for SVGD (Stein Variational Gradient Descent).
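For context, SVGD moves each particle x_i by phi(x_i) = (1/n) * sum_j [k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i)], combining an attraction term driven by the score with a kernel repulsion term. A minimal NumPy sketch of this update (an illustration of the algorithm only, not the package's implementation; the RBF bandwidth and step size are arbitrary choices):

```python
import numpy as np

def rbf_kernel(x, h):
    # Pairwise RBF kernel K[j, i] = exp(-||x_j - x_i||^2 / (2 h^2))
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * h ** 2))
    # grad_K[j, i] = gradient of K[j, i] with respect to x_j
    grad_K = -(x[:, None, :] - x[None, :, :]) / h ** 2 * K[:, :, None]
    return K, grad_K

def svgd_step(x, score, h=0.5, step=0.1):
    # One SVGD update: attraction via the score, repulsion via the kernel.
    K, grad_K = rbf_kernel(x, h)
    n = x.shape[0]
    # phi[i] = (1/n) * sum_j (K[j, i] * score(x_j) + grad_K[j, i])
    phi = (K.T @ score(x) + grad_K.sum(axis=0)) / n
    return x + step * phi

rng = np.random.default_rng(0)
x = rng.uniform(size=(50, 2))        # start from a uniform distribution
score = lambda x: -x                 # score of a standard Gaussian target
for _ in range(200):
    x = svgd_step(x, score)
```

After the loop, the particles approximate a standard Gaussian: the score term pulls them toward the mode while the kernel gradient keeps them spread out.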


Install

The package is available on PyPI:

$ pip install ksddescent

Documentation

The documentation is at pierreablin.github.io/ksddescent/.

Example

The main function is ksdd_lbfgs, which uses the fast L-BFGS algorithm to converge quickly. It takes as input the initial positions of the particles and the score function. For instance, to sample from a standard Gaussian (whose score is x ↦ -x), you can use these simple lines of code:

>>> import torch
>>> from ksddescent import ksdd_lbfgs
>>> n, p = 50, 2
>>> x0 = torch.rand(n, p)  # start from uniform distribution
>>> score = lambda x: -x  # score of a standard Gaussian
>>> x = ksdd_lbfgs(x0, score)  # run the algorithm
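The score function is the gradient of the log-density, ∇ log p(x); for a Gaussian N(μ, Σ) it is -Σ⁻¹(x - μ), and for any differentiable log-density it can be obtained with PyTorch autograd. A hedged sketch checking the closed-form score against autograd (the diagonal-Gaussian target here is an illustrative choice, not part of the package's API):

```python
import torch

# Hypothetical target: Gaussian with mean mu and diagonal covariance diag(sigma2).
mu = torch.tensor([1.0, -1.0])
sigma2 = torch.tensor([0.5, 2.0])

def score(x):
    # Closed-form score of N(mu, diag(sigma2)): -Sigma^{-1} (x - mu)
    return -(x - mu) / sigma2

def score_autograd(x):
    # Same score computed by differentiating the log-density (up to a constant).
    x = x.detach().requires_grad_(True)
    logp = -0.5 * ((x - mu) ** 2 / sigma2).sum()
    return torch.autograd.grad(logp, x)[0]

x = torch.randn(4, 2)
assert torch.allclose(score(x), score_autograd(x), atol=1e-5)
```

A score defined this way can be passed to ksdd_lbfgs(x0, score) in place of the standard-Gaussian score above.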

Reference

If you use this code in your project, please cite:

Anna Korba, Pierre-Cyril Aubin-Frankowski, Simon Majewski, Pierre Ablin
Kernel Stein Discrepancy Descent
International Conference on Machine Learning, 2021

Bug reports

Use the GitHub issue tracker to report bugs.

Download files

Download the file for your platform.

Source distribution: ksddescent-0.3.tar.gz (13.2 kB)

Built distribution: ksddescent-0.3-py3-none-any.whl (8.4 kB, Python 3)
