Features
Setup
using AbstractGPs, Random
rng = MersenneTwister(0)
# Construct a zero-mean Gaussian process with a Matérn-3/2 kernel.
f = GP(Matern32Kernel())
# Specify some input and target locations.
x = randn(rng, 10)
y = randn(rng, 10)
Finite-dimensional projection
Look at the finite-dimensional projection of f at x, under zero-mean observation noise with variance 0.1.
fx = f(x, 0.1)
Sample from the GP prior at x under noise.
y_sampled = rand(rng, fx)
Compute the log marginal probability of y.
logpdf(fx, y)
Construct the posterior process implied by conditioning f at x on y.
f_posterior = posterior(fx, y)
A posterior process follows the AbstractGP interface, so the same functions that work on the prior also work on the posterior.
rand(rng, f_posterior(x))
logpdf(f_posterior(x), y)
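For instance, a sketch using a few more of the exported interface functions:
mean(f_posterior(x))       # posterior predictive mean at x
var(f_posterior(x))        # marginal predictive variances at x
marginals(f_posterior(x))  # vector of Normal marginal distributions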
Compute the VFE approximation to the log marginal probability of y.
Here, z is a set of pseudo-points.
z = randn(rng, 4)
Evidence Lower Bound (ELBO)
We provide a ready-made implementation of the ELBO. Variational inference can be performed by maximizing the ELBO with respect to the pseudo-points z and any kernel parameters; a minimal optimization sketch follows the code below. For more information, see the examples.
elbo(VFE(f(z)), fx, y)
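As a minimal sketch of this optimization, assuming Optim.jl is available (the objective helper is a name introduced here, not part of AbstractGPs); gradient-free Nelder-Mead keeps the example short:
using Optim

# Negative ELBO as a function of the pseudo-point locations.
objective(z) = -elbo(VFE(f(z)), fx, y)

# Optimize the pseudo-point locations; kernel parameters could be
# optimized jointly in the same way.
result = optimize(objective, randn(rng, 4), NelderMead())
z_opt = Optim.minimizer(result)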
Construct the approximate posterior process implied by the VFE approximation.
The optimal pseudo-points obtained above can be used to create an approximate/sparse posterior, which can be used like a regular posterior in many cases.
f_approx_posterior = posterior(VFE(f(z)), fx, y)
An approximate posterior process is yet another AbstractGP, so you can do things with it such as:
marginals(f_approx_posterior(x))
Sequential Conditioning
Sequential conditioning allows you to compute your posterior in an online fashion. This is done efficiently by updating the Cholesky factorisation of the covariance matrix, avoiding recomputation from the original covariance matrix.
# Define GP prior
f = GP(SqExponentialKernel())
Exact Posterior
Generate a posterior with the first batch of data by conditioning the prior on it:
p_fx = posterior(f(x[1:3], 0.1), y[1:3])
Generate a posterior with the second batch of data, using the previous posterior p_fx as the prior:
p_p_fx = posterior(p_fx(x[4:10], 0.1), y[4:10])
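As a quick sanity check (a sketch; p_fx_batch is a name introduced here), sequential conditioning should agree, up to numerical error, with conditioning on all of the data at once:
# Condition on all ten observations in one go and compare predictive means.
p_fx_batch = posterior(f(x, 0.1), y)
mean(p_p_fx(x)) ≈ mean(p_fx_batch(x))  # true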
Approximate Posterior
Adding observations in a sequential fashion
Z1 = rand(rng, 4)
Z2 = rand(rng, 3)
Z = vcat(Z1, Z2)
p_fx1 = posterior(VFE(f(Z)), f(x[1:7], 0.1), y[1:7])
u_p_fx1 = update_posterior(p_fx1, f(x[8:10], 0.1), y[8:10])
Adding pseudo-points in a sequential fashion
p_fx2 = posterior(VFE(f(Z1)), f(x, 0.1), y)
u_p_fx2 = update_posterior(p_fx2, f(Z2))
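As a hedged consistency check (a sketch; vfe_batch is a name introduced here): with the full set of pseudo-points Z = vcat(Z1, Z2), both sequentially updated approximate posteriors should closely match the batch VFE posterior:
# Batch VFE posterior with all pseudo-points and all observations.
vfe_batch = posterior(VFE(f(Z)), f(x, 0.1), y)
mean(u_p_fx1(x)) ≈ mean(vfe_batch(x))
mean(u_p_fx2(x)) ≈ mean(vfe_batch(x))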
Plotting
Plots.jl
We provide functions for plotting samples and predictions of Gaussian processes with Plots.jl. You can see some examples in the One-dimensional regression tutorial.
RecipesBase.plot
— Method
plot(x::AbstractVector, f::FiniteGP; ribbon_scale=1, kwargs...)
plot!([plot, ]x::AbstractVector, f::FiniteGP; ribbon_scale=1, kwargs...)
Plot the predictive mean for the projection f of a Gaussian process, and a ribbon of ribbon_scale standard deviations around it, versus x.
Make sure to load Plots.jl before you use this function.
Examples
Plot the mean and a ribbon of 3 standard deviations:
using Plots
gp = GP(SqExponentialKernel())
plot(gp(rand(5)); ribbon_scale=3)
RecipesBase.plot
— Method
plot(f::FiniteGP; kwargs...)
plot!([plot, ]f::FiniteGP; kwargs...)
Plot the predictive mean and a ribbon around it for the projection f of a Gaussian process versus f.x.
RecipesBase.plot
— Method
plot(x::AbstractVector, gp::AbstractGP; kwargs...)
plot!([plot, ]x::AbstractVector, gp::AbstractGP; kwargs...)
Plot the predictive mean and a ribbon around it for the projection gp(x) of the Gaussian process gp.
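A minimal usage sketch covering these two methods, along the lines of the example above:
using Plots
gp = GP(SqExponentialKernel())
x = rand(5)
plot(gp(x, 0.1))  # mean and ribbon versus f.x
plot(x, gp)       # mean and ribbon for gp(x) versus x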
AbstractGPs.sampleplot
— Function
sampleplot([x::AbstractVector=f.x, ]f::FiniteGP; samples=1, kwargs...)
Plot samples from the projection f of a Gaussian process versus x.
Make sure to load Plots.jl before you use this function.
When plotting multiple samples, these are treated as a single series (i.e., only a single entry will be added to the legend when providing a label).
Example
using Plots
gp = GP(SqExponentialKernel())
sampleplot(gp(rand(5)); samples=10, linealpha=1.0)
This example plots 10 samples from the projection of the GP gp; linealpha is changed from its default of 0.35 to 1.
sampleplot(x::AbstractVector, gp::AbstractGP; samples=1, kwargs...)
Plot samples from the finite projection gp(x, 1e-9) versus x.
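A minimal usage sketch for this method:
using Plots
gp = GP(SqExponentialKernel())
sampleplot(0:0.01:1, gp; samples=5)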
Makie.jl
You can use the Julia package AbstractGPsMakie.jl to plot Gaussian processes with Makie.jl.