Reorganise introductory docs #520

Merged
merged 5 commits on Sep 12, 2024
Changes from 4 commits
23 changes: 12 additions & 11 deletions _quarto.yml
@@ -50,16 +50,15 @@ website:
- text: documentation
collapse-level: 1
contents:
- section: "Documentation"
- section: "Users"
# href: tutorials/index.qmd, This page will be added later so keep this line commented
contents:
- section: "Using Turing - Modelling Syntax and Interface"
- tutorials/docs-00-getting-started/index.qmd
- tutorials/docs-12-using-turing-guide/index.qmd

- section: "Usage Tips"
collapse-level: 1
contents:
- tutorials/docs-00-getting-started/index.qmd
- text: "Quick Start"
href: tutorials/docs-14-using-turing-quick-start/index.qmd
- tutorials/docs-12-using-turing-guide/index.qmd
- text: "Mode Estimation"
href: tutorials/docs-17-mode-estimation/index.qmd
- tutorials/docs-09-using-turing-advanced/index.qmd
@@ -70,7 +69,7 @@ website:
- text: "External Samplers"
href: tutorials/docs-16-using-turing-external-samplers/index.qmd

- section: "Using Turing - Tutorials"
- section: "Worked Tutorials"
contents:
- tutorials/00-introduction/index.qmd
- text: Gaussian Mixture Models
@@ -97,25 +96,27 @@ website:
- text: "Gaussian Process Latent Variable Models"
href: tutorials/12-gplvm/index.qmd

- section: "Developers: Contributing"
- section: "Developers"
contents:
- section: "Contributing"
collapse-level: 1
contents:
- text: "How to Contribute"
href: tutorials/docs-01-contributing-guide/index.qmd

- section: "Developers: PPL"
- section: "DynamicPPL in Depth"
collapse-level: 1
contents:
- tutorials/docs-05-for-developers-compiler/index.qmd
- text: "A Mini Turing Implementation I: Compiler"
href: tutorials/14-minituring/index.qmd
- text: "A Mini Turing Implementation II: Contexts"
href: tutorials/16-contexts/index.qmd
- tutorials/docs-06-for-developers-interface/index.qmd

- section: "Developers: Inference"
- section: "Inference (note: outdated)"
collapse-level: 1
contents:
- tutorials/docs-06-for-developers-interface/index.qmd
- tutorials/docs-04-for-developers-abstractmcmc-turing/index.qmd
- tutorials/docs-07-for-developers-variational-inference/index.qmd
- text: "Implementing Samplers"
35 changes: 15 additions & 20 deletions tutorials/00-introduction/index.qmd
@@ -1,5 +1,5 @@
---
title: Introduction to Turing
title: "Introduction: Coin Flipping"
engine: julia
aliases:
- ../
@@ -12,23 +12,12 @@ using Pkg;
Pkg.instantiate();
```

### Introduction
This is the first of a series of guided tutorials on the Turing language.
In this tutorial, we will use Bayesian inference to estimate the probability that a coin flip will result in heads, given a series of observations.

This is the first of a series of tutorials on the universal probabilistic programming language **Turing**.
### Setup

Turing is a probabilistic programming system written entirely in Julia. It has an intuitive modelling syntax and supports a wide range of sampling-based inference algorithms.

Familiarity with Julia is assumed throughout this tutorial. If you are new to Julia, [Learning Julia](https://julialang.org/learning/) is a good starting point.

For users new to Bayesian machine learning, please consider more thorough introductions to the field such as [Pattern Recognition and Machine Learning](https://www.springer.com/us/book/9780387310732). This tutorial tries to provide an intuition for Bayesian inference and gives a simple example on how to use Turing. Note that this is not a comprehensive introduction to Bayesian machine learning.

### Coin Flipping Without Turing

The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.

Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" means, we will visualize the probability of heads in a coin flip after each new observation.

First, let us load some packages that we need to simulate a coin flip
First, let us load some packages that we need to simulate a coin flip:

```{julia}
using Distributions
@@ -43,8 +32,7 @@ and to visualize our results.
using StatsPlots
```

Note that Turing is not loaded here — we do not use it in this example. If you are already familiar with posterior updates, you can proceed to the next step.

Note that Turing is not loaded here — we do not use it in this example.
Next, we configure the data-generating model. Let us set the true probability that a coin flip turns up heads:

```{julia}
@@ -63,13 +51,20 @@ We simulate `N` coin flips by drawing N random samples from the Bernoulli distribution
data = rand(Bernoulli(p_true), N);
```

Here is what the first five coin flips look like:
Here are the first five coin flips:

```{julia}
data[1:5]
```

Next, we specify a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.

### Coin Flipping Without Turing

The following example illustrates the effect of updating our beliefs with every piece of new evidence we observe.

Assume that we are unsure about the probability of heads in a coin flip. To get an intuitive understanding of what "updating our beliefs" means, we will visualize the probability of heads in a coin flip after each new observation.
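
As a brief preview of the machinery this section visualizes: the Beta prior used below is conjugate to the Bernoulli likelihood, so each belief update has a closed form. A minimal sketch, assuming the `data` vector simulated above and the uniform `Beta(1, 1)` prior introduced next:

```{julia}
#| eval: false
# A Beta(a, b) prior plus k heads in n flips gives a Beta(a + k, b + n - k) posterior.
n = 10
k = sum(data[1:n])                  # number of heads among the first n flips
posterior = Beta(1 + k, 1 + n - k)  # updated belief about the probability of heads
```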

We begin by specifying a prior belief about the distribution of heads and tails in a coin toss. Here we choose a [Beta](https://en.wikipedia.org/wiki/Beta_distribution) distribution as prior distribution for the probability of heads. Before any coin flip is observed, we assume a uniform distribution $\operatorname{U}(0, 1) = \operatorname{Beta}(1, 1)$ of the probability of heads. I.e., every probability is equally likely initially.

```{julia}
prior_belief = Beta(1, 1);
83 changes: 29 additions & 54 deletions tutorials/docs-00-getting-started/index.qmd
@@ -16,96 +16,71 @@ Pkg.instantiate();

To use Turing, you need to install Julia first and then install Turing.

### Install Julia
You will need to install Julia 1.7 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).

You will need to install Julia 1.3 or greater, which you can get from [the official Julia website](http://julialang.org/downloads/).

### Install Turing.jl

Turing is an officially registered Julia package, so you can install a stable version of Turing by running the following in the Julia REPL:
Turing is officially registered in the [Julia General package registry](https://github.com/JuliaRegistries/General), which means that you can install a stable version of Turing by running the following in the Julia REPL:

```{julia}
#| eval: false
#| output: false
using Pkg
Pkg.add("Turing")
```

You can check that all tests pass by running `Pkg.test("Turing")` (this might take a long time).
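
As a rough sketch, running the test suite from the REPL (assuming Turing has already been installed as above) looks like this:

```{julia}
#| eval: false
using Pkg
Pkg.test("Turing")  # runs Turing's full test suite; this can take a long time
```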

### Example

Here's a simple example showing Turing in action.
### Example usage

First, we can load the Turing and StatsPlots modules
First, we load the Turing and StatsPlots modules.
The latter is required for visualising the results.

```{julia}
using Turing
using StatsPlots
```

Then, we define a simple Normal model with unknown mean and variance
We then specify our model, which is a simple Gaussian model with unknown mean and variance.
Models are defined as ordinary Julia functions, prefixed with the `@model` macro.
Each statement inside closely resembles how the model would be defined with mathematical notation.
Here, both `x` and `y` are observed values, and are therefore passed as function parameters.
`m` and `s²` are the parameters to be inferred.

```{julia}
@model function gdemo(x, y)
s² ~ InverseGamma(2, 3)
m ~ Normal(0, sqrt(s²))
x ~ Normal(m, sqrt(s²))
return y ~ Normal(m, sqrt(s²))
y ~ Normal(m, sqrt(s²))
end
```

Then we can run a sampler to collect results. In this case, it is a Hamiltonian Monte Carlo sampler

```{julia}
chn = sample(gdemo(1.5, 2), NUTS(), 1000, progress=false)
```

We can plot the results
Suppose we observe `x = 1.5` and `y = 2`, and want to infer the mean and variance.
We can pass these data as arguments to the `gdemo` function, and run a sampler to collect the results.
Here, we collect 1000 samples using the No U-Turn Sampler (NUTS) algorithm.

```{julia}
plot(chn)
chain = sample(gdemo(1.5, 2), NUTS(), 1000, progress=false)
```

In this case, because we use the normal-inverse gamma distribution as a conjugate prior, we can compute its updated mean as follows:
We can plot the results:

```{julia}
s² = InverseGamma(2, 3)
m = Normal(0, 1)
data = [1.5, 2]
x_bar = mean(data)
N = length(data)
mean_exp = (m.σ * m.μ + N * x_bar) / (m.σ + N)
plot(chain)
```

We can also compute the updated variance
and obtain summary statistics by indexing the chain:

```{julia}
updated_alpha = shape(s²) + (N / 2)
updated_beta =
scale(s²) +
(1 / 2) * sum((data[n] - x_bar)^2 for n in 1:N) +
(N * m.σ) / (N + m.σ) * ((x_bar)^2) / 2
variance_exp = updated_beta / (updated_alpha - 1)
mean(chain[:m]), mean(chain[:s²])
```
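
For a fuller summary of every parameter (means, standard deviations, quantiles, and so on), the chain can also be summarised directly. As a sketch, assuming the MCMCChains interface that Turing re-exports:

```{julia}
#| eval: false
describe(chain)  # summary statistics and quantiles for all parameters in the chain
```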

Finally, we can check if these expectations align with our HMC approximations from earlier. We can compute samples from a normal-inverse gamma following the equations given [here](https://en.wikipedia.org/wiki/Normal-inverse-gamma_distribution#Generating_normal-inverse-gamma_random_variates).
### Where to go next

```{julia}
function sample_posterior(alpha, beta, mean, lambda, iterations)
samples = []
for i in 1:iterations
sample_variance = rand(InverseGamma(alpha, beta), 1)
sample_x = rand(Normal(mean, sqrt(sample_variance[1]) / lambda), 1)
samples = append!(samples, sample_x)
end
return samples
end
::: {.callout-note title="Note on prerequisites"}
Familiarity with Julia is assumed throughout the Turing documentation.
If you are new to Julia, [Learning Julia](https://julialang.org/learning/) is a good starting point.

analytical_samples = sample_posterior(updated_alpha, updated_beta, mean_exp, 2, 1000);
```
The underlying theory of Bayesian machine learning is not explained in detail in this documentation.
A thorough introduction to the field is [*Pattern Recognition and Machine Learning*](https://www.springer.com/us/book/9780387310732) (Bishop, 2006); an online version is available [here (PDF, 18.1 MB)](https://www.microsoft.com/en-us/research/uploads/prod/2006/01/Bishop-Pattern-Recognition-and-Machine-Learning-2006.pdf).
:::

```{julia}
density(analytical_samples; label="Posterior (Analytical)")
density!(chn[:m]; label="Posterior (HMC)")
```
The next page on [Turing's core functionality](../../tutorials/docs-12-using-turing-guide/) explains the basic features of the Turing language.
From there, you can either look at [worked examples of how different models are implemented in Turing](../../tutorials/00-introduction/), or [specific tips and tricks that can help you get the most out of Turing](../../tutorials/docs-17-mode-estimation/).
4 changes: 3 additions & 1 deletion tutorials/docs-12-using-turing-guide/index.qmd
@@ -1,5 +1,5 @@
---
title: Guide
title: "Core Functionality"
engine: julia
---

@@ -10,6 +10,8 @@ using Pkg;
Pkg.instantiate();
```

This article provides an overview of the core functionality in Turing.jl, which is likely to be used across a wide range of models.

## Basics

### Introduction
74 changes: 0 additions & 74 deletions tutorials/docs-14-using-turing-quick-start/index.qmd

This file was deleted.
