Welcome to GPJax!#

GPJax is a didactic Gaussian process library that supports GPU acceleration and just-in-time compilation. We aim to provide a flexible API that mirrors how the underlying mathematics is written on paper, enabling researchers to rapidly prototype and develop new ideas.
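Because GPJax is built on JAX, kernel computations can be vectorised with `vmap` and compiled with `jit` to run on CPU, GPU, or TPU. The sketch below illustrates this machinery with a hand-rolled squared-exponential kernel; the `rbf` and `gram` functions here are illustrative stand-ins, not GPJax's own implementation.

```python
import jax
import jax.numpy as jnp


def rbf(x1, x2, lengthscale=1.0):
    # Squared-exponential (RBF) kernel between two scalar inputs.
    return jnp.exp(-0.5 * ((x1 - x2) / lengthscale) ** 2)


@jax.jit
def gram(x):
    # Pairwise kernel evaluations K[i, j] = k(x[i], x[j]),
    # vectorised over both arguments and JIT-compiled.
    return jax.vmap(lambda xi: jax.vmap(lambda xj: rbf(xi, xj))(x))(x)


x = jnp.linspace(0.0, 1.0, 5)
K = gram(x)  # a (5, 5) symmetric kernel matrix with unit diagonal
```

The same pattern underlies GPJax's kernels: pure functions that JAX transforms for speed and hardware acceleration.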

[Figure: a Gaussian process posterior.]

You can view the source code for GPJax on GitHub.

‘Hello World’ example#

Defining a Gaussian process posterior is as simple as typing the maths we would write on paper. To see this, consider the following example.

import gpjax as gpx

# A radial basis function (squared-exponential) kernel.
kernel = gpx.kernels.RBF()

# A Gaussian process prior over functions with this kernel.
prior = gpx.gps.Prior(kernel=kernel)

# A Gaussian observation model for 123 training points.
likelihood = gpx.likelihoods.Gaussian(num_datapoints=123)

# Combine prior and likelihood to form the posterior.
posterior = prior * likelihood
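The `*` syntax mirrors Bayes' theorem, under which the posterior over the latent function $f$ is proportional to the likelihood times the prior:

$$p(f \mid \mathbf{y}) \propto p(\mathbf{y} \mid f)\, p(f),$$

so multiplying the prior object by the likelihood object constructs the corresponding posterior object, just as the maths multiplies the two densities.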


If you’re new to Gaussian processes and want a gentle introduction, we have put together an introduction to GPs notebook that starts from Bayes’ theorem and univariate Gaussian random variables.

See also

To learn more, check out the regression notebook.


