Merge pull request #785 from stan-dev/gp_computation_cost_warning
warn about the computation cost of GPs
bob-carpenter authored May 28, 2024
2 parents 0078953 + 5eedb5e commit f2a561b
Showing 2 changed files with 19 additions and 0 deletions.
9 changes: 9 additions & 0 deletions src/bibtex/all.bib
@@ -1816,3 +1816,12 @@ @article{zhang_pathfinder:2022
url = {http://jmlr.org/papers/v23/21-0889.html}
}

@article{Riutort-Mayol:2023:HSGP,
title={Practical {Hilbert} space approximate {Bayesian} {Gaussian} processes for probabilistic programming},
author={Riutort-Mayol, Gabriel and B{\"u}rkner, Paul-Christian and Andersen, Michael R and Solin, Arno and Vehtari, Aki},
journal={Statistics and Computing},
volume={33},
number={1},
pages={17},
year={2023}
}
10 changes: 10 additions & 0 deletions src/stan-users-guide/gaussian-processes.qmd
@@ -44,6 +44,16 @@ Gaussian processes are general, and by necessity this chapter
only touches on some basic models. For more information, see
@RasmussenWilliams:2006.

Note that fitting Gaussian processes as described below using exact
inference, by computing the Cholesky decomposition of the covariance
matrix, scales cubically with the size of the data. Because of how
Stan's automatic differentiation is implemented, Stan is also slower
than specialized Gaussian process software. Gaussian processes fit by
exact inference via the Cholesky decomposition of the covariance
matrix are likely too slow for practical purposes in Stan when
$N > 1000$. There are many approximations that speed up Gaussian
process computation, of which the basis function approaches for one-
to three-dimensional $x$ are the easiest to implement in Stan
(see, e.g., @Riutort-Mayol:2023:HSGP).
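
To make the cost concrete, here is a minimal sketch of exact-inference
Gaussian process regression in Stan, assuming a squared exponential
kernel and illustrative priors (the variable names and prior choices
are only for illustration). The Cholesky factorization of the
$N \times N$ covariance matrix is the step that scales as $O(N^3)$.

```stan
data {
  int<lower=1> N;
  array[N] real x;
  vector[N] y;
}
parameters {
  real<lower=0> rho;    // length scale
  real<lower=0> alpha;  // marginal standard deviation
  real<lower=0> sigma;  // observation noise
}
model {
  // dense N x N covariance matrix (O(N^2) storage)
  matrix[N, N] K = gp_exp_quad_cov(x, alpha, rho)
                   + diag_matrix(rep_vector(square(sigma), N));
  // Cholesky factorization is the O(N^3) bottleneck
  matrix[N, N] L_K = cholesky_decompose(K);

  rho ~ inv_gamma(5, 5);
  alpha ~ std_normal();
  sigma ~ std_normal();
  y ~ multi_normal_cholesky(rep_vector(0, N), L_K);
}
```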

## Gaussian process regression

