Scalable exact inference in multi-output Gaussian processes

Multi-output Gaussian processes (MOGPs) leverage the flexibility and interpretability of GPs while capturing structure across outputs, which is desirable, for example, in spatio-temporal modelling. The key problem with MOGPs is their computational scaling O(n³p³), which is cubic in the number of both inputs n (e.g., time points or locations) and outputs p. For this reason, a popular class of MOGPs assumes that the data live around a low-dimensional linear subspace, reducing the complexity to O(n³m³). However, this cost is still cubic in the dimensionality of the subspace m, which remains prohibitively expensive for many applications. We propose the use of a sufficient statistic of the data to accelerate inference and learning in MOGPs with orthogonal bases. The method achieves linear scaling in m in practice, allowing these models to scale to large m without sacrificing significant expressivity or requiring approximation. This advance opens up a wide range of real-world tasks and can be combined with existing GP approximations in a plug-and-play way. We demonstrate the efficacy of the method on various synthetic and real-world data sets.
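
The projection at the heart of this idea can be illustrated with a minimal sketch. Assuming, purely for illustration, that the mixing matrix factorises as H = U diag(s)^(1/2) with orthonormal columns U and i.i.d. Gaussian observation noise (this structure, the kernel choice, and all names below are assumptions of the sketch, not the authors' code), projecting the p-dimensional observations through T = diag(s)^(-1/2) Uᵀ decouples the model into m independent single-output GP problems, so exact inference costs m·O(n³) rather than O(n³m³):

```python
# Illustrative sketch only: orthogonal mixing plus projection gives
# m decoupled single-output GP problems. All names are assumptions.
import numpy as np

def rbf_kernel(x, lengthscale=1.0):
    # Squared-exponential kernel on a 1-D input grid.
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
n, p, m = 200, 50, 5  # inputs, outputs, latent processes

# Orthogonal basis U (p x m, U.T @ U = I) and per-latent scales s,
# giving a mixing matrix H = U @ diag(sqrt(s)).
U, _ = np.linalg.qr(rng.standard_normal((p, m)))
s = rng.uniform(0.5, 2.0, size=m)
H = U * np.sqrt(s)

# Synthetic data: m latent GP draws mixed into p noisy outputs.
x = np.linspace(0.0, 10.0, n)
K = rbf_kernel(x)
chol = np.linalg.cholesky(K + 1e-8 * np.eye(n))
latents = chol @ rng.standard_normal((n, m))
noise_var = 0.1
Y = latents @ H.T + np.sqrt(noise_var) * rng.standard_normal((n, p))

# Sufficient statistic: project the p outputs onto the m-dimensional
# subspace. T @ H = I, so each column of Y_proj is one latent process
# plus independent noise of variance noise_var / s[i].
T = (U / np.sqrt(s)).T        # (m, p)
Y_proj = Y @ T.T              # (n, m)

# Exact inference now decomposes into m independent single-output GP
# computations, each O(n^3): the total cost is linear in m.
for i in range(m):
    Ki = K + (noise_var / s[i]) * np.eye(n)
    L = np.linalg.cholesky(Ki)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y_proj[:, i]))
    lml = (-0.5 * Y_proj[:, i] @ alpha
           - np.log(np.diag(L)).sum()
           - 0.5 * n * np.log(2.0 * np.pi))
    print(f"latent {i}: log marginal likelihood = {lml:.1f}")
```

Because the projected data act as a sufficient statistic under such a model, no information is discarded by the projection, which is what keeps the decoupled inference exact rather than approximate.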

Details

Publication status:
Published
Author(s):
Bruinsma, Wessel P., Perim, Eric, Tebbutt, Will, Hosking, J. Scott, Solin, Arno, Turner, Richard E.

Editors: Daumé III, Hal, Singh, Aarti

Date:
1 July 2020
Journal/Source:
In: Daumé III, Hal, Singh, Aarti (eds.). Proceedings of the 37th International Conference on Machine Learning, PMLR, 1190-1201.
Page(s):
1190-1201
Link to published article:
https://doi.org/