
Commit
small update to the VI tutorial
torfjelde committed Sep 25, 2020
1 parent 3379a7f commit 6d38bcd
Showing 1 changed file with 9 additions and 10 deletions.
19 changes: 9 additions & 10 deletions tutorials/variational-inference/01_variational-inference.jmd
@@ -31,8 +31,8 @@ Random.seed!(42);
The Normal-(Inverse)Gamma conjugate model is defined by the following generative process

\begin{align}
-s &\sim \mathrm{InverseGamma}(2, 3) \\\\
-m &\sim \mathcal{N}(0, s) \\\\
+s &\sim \mathrm{InverseGamma}(2, 3) \\
+m &\sim \mathcal{N}(0, s) \\
x_i &\overset{\text{i.i.d.}}{\sim} \mathcal{N}(m, s), \quad i = 1, \dots, n
\end{align}
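
The generative process above can be sketched as a Turing.jl model (a minimal illustration, not part of the diff; note the assumption that `s` is the variance, so `Normal` — which takes a standard deviation — is given `sqrt(s)`):

```julia
using Turing

@model function normal_inverse_gamma(x)
    # variance with an InverseGamma prior
    s ~ InverseGamma(2, 3)
    # mean, conditionally Gaussian given the variance
    m ~ Normal(0, sqrt(s))
    # i.i.d. observations
    for i in eachindex(x)
        x[i] ~ Normal(m, sqrt(s))
    end
end
```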

@@ -154,19 +154,18 @@ samples = rand(q, 10000);
```julia
# setup for plotting
using Plots, LaTeXStrings, StatsPlots
-pyplot()
```

```julia
p1 = histogram(samples[1, :], bins=100, normed=true, alpha=0.2, color = :blue, label = "")
density!(samples[1, :], label = "s (ADVI)", color = :blue, linewidth = 2)
-density!(collect(skipmissing(samples_nuts[:s].data)), label = "s (NUTS)", color = :green, linewidth = 2)
+density!(samples_nuts, :s; label = "s (NUTS)", color = :green, linewidth = 2)
vline!([var(x)], label = "s (data)", color = :black)
vline!([mean(samples[1, :])], color = :blue, label ="")

p2 = histogram(samples[2, :], bins=100, normed=true, alpha=0.2, color = :blue, label = "")
density!(samples[2, :], label = "m (ADVI)", color = :blue, linewidth = 2)
-density!(collect(skipmissing(samples_nuts[:m].data)), label = "m (NUTS)", color = :green, linewidth = 2)
+density!(samples_nuts, :m; label = "m (NUTS)", color = :green, linewidth = 2)
vline!([mean(x)], color = :black, label = "m (data)")
vline!([mean(samples[2, :])], color = :blue, label="")

@@ -219,7 +218,7 @@ p_μ_pdf = z -> exp(logpdf(p_μ, (z - μₙ) * exp(- 0.5 * log(βₙ) + 0.5 * lo
p1 = plot();
histogram!(samples[1, :], bins=100, normed=true, alpha=0.2, color = :blue, label = "")
density!(samples[1, :], label = "s (ADVI)", color = :blue)
-density!(vec(samples_nuts[:s].data), label = "s (NUTS)", color = :green)
+density!(samples_nuts, :s; label = "s (NUTS)", color = :green)
vline!([mean(samples[1, :])], linewidth = 1.5, color = :blue, label ="")

# normalize using Riemann approx. because of (almost certainly) numerical issues
@@ -233,7 +232,7 @@ xlims!(0.75, 1.35);
p2 = plot();
histogram!(samples[2, :], bins=100, normed=true, alpha=0.2, color = :blue, label = "")
density!(samples[2, :], label = "m (ADVI)", color = :blue)
-density!(vec(samples_nuts[:m].data), label = "m (NUTS)", color = :green)
+density!(samples_nuts, :m; label = "m (NUTS)", color = :green)
vline!([mean(samples[2, :])], linewidth = 1.5, color = :blue, label="")


@@ -252,13 +251,13 @@ p = plot(p1, p2; layout=(2, 1), size=(900, 500))

# Bayesian linear regression example using `ADVI`

-This is simply a duplication of the tutorial [5. Linear regression](../../tutorials/5-linearregression) but now with the addition of an approximate posterior obtained using `ADVI`.
+This is simply a duplication of the tutorial [5. Linear regression](../regression/02_linear-regression) but now with the addition of an approximate posterior obtained using `ADVI`.

As we'll see, there is really no additional work required to apply variational inference to a more complex `Model`.
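
The "no additional work" claim can be illustrated with a short sketch (the identifier `model` is a stand-in for any instantiated Turing model, e.g. the linear-regression model copied below):

```julia
using Turing

# ADVI(10, 1000): 10 Monte Carlo samples per gradient estimate,
# 1000 gradient-descent iterations
advi = ADVI(10, 1000)

# `vi` returns a mean-field Gaussian approximation to the posterior;
# the same call works for any `model`, regardless of its complexity
q = vi(model, advi)

# draw samples from the approximate posterior
samples = rand(q, 10_000)
```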

-## Copy-paste from [5. Linear regression](../../tutorials/5-linearregression)
+## Copy-paste from [5. Linear regression](../regression/02_linear-regression)

-This section is basically copy-pasting the code from the [linear regression tutorial](../../tutorials/5-linearregression).
+This section is basically copy-pasting the code from the [linear regression tutorial](../regression/02_linear-regression).


```julia
