diff --git a/markdown/00-introduction/00_introduction.md b/markdown/00-introduction/00_introduction.md
index bf5fa78e3..1555c365f 100644
--- a/markdown/00-introduction/00_introduction.md
+++ b/markdown/00-introduction/00_introduction.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/0-introduction/"
 title: "Introduction to Turing"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/01-gaussian-mixture-model/01_gaussian-mixture-model.md b/markdown/01-gaussian-mixture-model/01_gaussian-mixture-model.md
index c4ca9759d..70575dcbc 100644
--- a/markdown/01-gaussian-mixture-model/01_gaussian-mixture-model.md
+++ b/markdown/01-gaussian-mixture-model/01_gaussian-mixture-model.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/1-gaussianmixturemodel/"
 title: "Unsupervised Learning using Bayesian Mixture Models"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/02-logistic-regression/02_logistic-regression.md b/markdown/02-logistic-regression/02_logistic-regression.md
index 319b9ae05..1dffbf3ae 100644
--- a/markdown/02-logistic-regression/02_logistic-regression.md
+++ b/markdown/02-logistic-regression/02_logistic-regression.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/2-logisticregression/"
 title: "Bayesian Logistic Regression"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/03-bayesian-neural-network/03_bayesian-neural-network.md b/markdown/03-bayesian-neural-network/03_bayesian-neural-network.md
index 02f2dd9d2..d1a7a1248 100644
--- a/markdown/03-bayesian-neural-network/03_bayesian-neural-network.md
+++ b/markdown/03-bayesian-neural-network/03_bayesian-neural-network.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/3-bayesnn/"
 title: "Bayesian Neural Networks"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/04-hidden-markov-model/04_hidden-markov-model.md b/markdown/04-hidden-markov-model/04_hidden-markov-model.md
index 3b611166d..5ddba8f52 100644
--- a/markdown/04-hidden-markov-model/04_hidden-markov-model.md
+++ b/markdown/04-hidden-markov-model/04_hidden-markov-model.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/4-bayeshmm/"
 title: "Bayesian Hidden Markov Models"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/05-linear-regression/05_linear-regression.md b/markdown/05-linear-regression/05_linear-regression.md
index e241c8dc5..86c65ac78 100644
--- a/markdown/05-linear-regression/05_linear-regression.md
+++ b/markdown/05-linear-regression/05_linear-regression.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/5-linearregression/"
 title: "Linear Regression"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/06-infinite-mixture-model/06_infinite-mixture-model.md b/markdown/06-infinite-mixture-model/06_infinite-mixture-model.md
index 327f1e2a4..8c21051f4 100644
--- a/markdown/06-infinite-mixture-model/06_infinite-mixture-model.md
+++ b/markdown/06-infinite-mixture-model/06_infinite-mixture-model.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/6-infinitemixturemodel/"
 title: "Probabilistic Modelling using the Infinite Mixture Model"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/07-poisson-regression/07_poisson-regression.md b/markdown/07-poisson-regression/07_poisson-regression.md
index 62ff64041..db9a20bc9 100644
--- a/markdown/07-poisson-regression/07_poisson-regression.md
+++ b/markdown/07-poisson-regression/07_poisson-regression.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/7-poissonregression/"
 title: "Bayesian Poisson Regression"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/08-multionomial-regression/08_multinomial-logistic-regression.md b/markdown/08-multionomial-regression/08_multinomial-logistic-regression.md
index b9be8fe85..820d75bb5 100644
--- a/markdown/08-multionomial-regression/08_multinomial-logistic-regression.md
+++ b/markdown/08-multionomial-regression/08_multinomial-logistic-regression.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/8-multinomiallogisticregression/"
 title: "Bayesian Multinomial Logistic Regression"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/09-variational-inference/09_variational-inference.md b/markdown/09-variational-inference/09_variational-inference.md
index e96710688..2d1fdc24c 100644
--- a/markdown/09-variational-inference/09_variational-inference.md
+++ b/markdown/09-variational-inference/09_variational-inference.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/9-variationalinference/"
 title: "Variational inference (VI) in Turing.jl"
 permalink: "/:collection/:name/"
 ---
diff --git a/markdown/10-bayesian-differential-equations/10_bayesian-differential-equations.md b/markdown/10-bayesian-differential-equations/10_bayesian-differential-equations.md
index 08343976c..3259e9f0e 100644
--- a/markdown/10-bayesian-differential-equations/10_bayesian-differential-equations.md
+++ b/markdown/10-bayesian-differential-equations/10_bayesian-differential-equations.md
@@ -1,4 +1,5 @@
 ---
+redirect_from: "tutorials/10-bayesiandiffeq/"
 title: "Bayesian Estimation of Differential Equations"
 permalink: "/:collection/:name/"
 ---
diff --git a/tutorials/00-introduction/00_introduction.jmd b/tutorials/00-introduction/00_introduction.jmd
index d2e5598e7..4a5d2a5b5 100644
--- a/tutorials/00-introduction/00_introduction.jmd
+++ b/tutorials/00-introduction/00_introduction.jmd
@@ -1,6 +1,7 @@
 ---
 title: Introduction to Turing
 permalink: /:collection/:name/
+redirect_from: tutorials/0-introduction/
 ---
 
 ## Introduction
diff --git a/tutorials/01-gaussian-mixture-model/01_gaussian-mixture-model.jmd b/tutorials/01-gaussian-mixture-model/01_gaussian-mixture-model.jmd
index 2f75d531c..2fb834bfd 100644
--- a/tutorials/01-gaussian-mixture-model/01_gaussian-mixture-model.jmd
+++ b/tutorials/01-gaussian-mixture-model/01_gaussian-mixture-model.jmd
@@ -1,6 +1,7 @@
 ---
 title: Unsupervised Learning using Bayesian Mixture Models
 permalink: /:collection/:name/
+redirect_from: tutorials/1-gaussianmixturemodel/
 ---
 
 The following tutorial illustrates the use *Turing* for clustering data using a Bayesian mixture model. The aim of this task is to infer a latent grouping (hidden structure) from unlabelled data.
diff --git a/tutorials/02-logistic-regression/02_logistic-regression.jmd b/tutorials/02-logistic-regression/02_logistic-regression.jmd
index 226aa3b52..b309bf064 100644
--- a/tutorials/02-logistic-regression/02_logistic-regression.jmd
+++ b/tutorials/02-logistic-regression/02_logistic-regression.jmd
@@ -1,6 +1,7 @@
 ---
 title: Bayesian Logistic Regression
 permalink: /:collection/:name/
+redirect_from: tutorials/2-logisticregression/
 ---
 
 [Bayesian logistic regression](https://en.wikipedia.org/wiki/Logistic_regression#Bayesian) is the Bayesian counterpart to a common tool in machine learning, logistic regression. The goal of logistic regression is to predict a one or a zero for a given training item. An example might be predicting whether someone is sick or ill given their symptoms and personal information.
diff --git a/tutorials/03-bayesian-neural-network/03_bayesian-neural-network.jmd b/tutorials/03-bayesian-neural-network/03_bayesian-neural-network.jmd
index 1f051bb96..9bca3cdc8 100644
--- a/tutorials/03-bayesian-neural-network/03_bayesian-neural-network.jmd
+++ b/tutorials/03-bayesian-neural-network/03_bayesian-neural-network.jmd
@@ -1,6 +1,7 @@
 ---
 title: Bayesian Neural Networks
 permalink: /:collection/:name/
+redirect_from: tutorials/3-bayesnn/
 ---
 
 In this tutorial, we demonstrate how one can implement a Bayesian Neural Network using a combination of Turing and [Flux](https://github.com/FluxML/Flux.jl), a suite of tools machine learning. We will use Flux to specify the neural network's layers and Turing to implement the probabalistic inference, with the goal of implementing a classification algorithm.
diff --git a/tutorials/04-hidden-markov-model/04_hidden-markov-model.jmd b/tutorials/04-hidden-markov-model/04_hidden-markov-model.jmd
index 0c308bd46..f4717164b 100644
--- a/tutorials/04-hidden-markov-model/04_hidden-markov-model.jmd
+++ b/tutorials/04-hidden-markov-model/04_hidden-markov-model.jmd
@@ -1,6 +1,7 @@
 ---
 title: Bayesian Hidden Markov Models
 permalink: /:collection/:name/
+redirect_from: tutorials/4-bayeshmm/
 ---
 
 This tutorial illustrates training Bayesian [Hidden Markov Models](https://en.wikipedia.org/wiki/Hidden_Markov_model) (HMM) using Turing. The main goals are learning the transition matrix, emission parameter, and hidden states. For a more rigorous academic overview on Hidden Markov Models, see [An introduction to Hidden Markov Models and Bayesian Networks](http://mlg.eng.cam.ac.uk/zoubin/papers/ijprai.pdf) (Ghahramani, 2001).
diff --git a/tutorials/05-linear-regression/05_linear-regression.jmd b/tutorials/05-linear-regression/05_linear-regression.jmd
index 73b2a7ad7..d3c4b3da4 100644
--- a/tutorials/05-linear-regression/05_linear-regression.jmd
+++ b/tutorials/05-linear-regression/05_linear-regression.jmd
@@ -1,6 +1,7 @@
 ---
 title: Linear Regression
 permalink: /:collection/:name/
+redirect_from: tutorials/5-linearregression/
 ---
 
 Turing is powerful when applied to complex hierarchical models, but it can also be put to task at common statistical procedures, like [linear regression](https://en.wikipedia.org/wiki/Linear_regression). This tutorial covers how to implement a linear regression model in Turing.
diff --git a/tutorials/06-infinite-mixture-model/06_infinite-mixture-model.jmd b/tutorials/06-infinite-mixture-model/06_infinite-mixture-model.jmd
index bf0bc0113..5447542d3 100644
--- a/tutorials/06-infinite-mixture-model/06_infinite-mixture-model.jmd
+++ b/tutorials/06-infinite-mixture-model/06_infinite-mixture-model.jmd
@@ -1,6 +1,7 @@
 ---
 title: Probabilistic Modelling using the Infinite Mixture Model
 permalink: /:collection/:name/
+redirect_from: tutorials/6-infinitemixturemodel/
 ---
 
 In many applications it is desirable to allow the model to adjust its complexity to the amount the data. Consider for example the task of assigning objects into clusters or groups. This task often involves the specification of the number of groups. However, often times it is not known beforehand how many groups exist. Moreover, in some applictions, e.g. modelling topics in text documents or grouping species, the number of examples per group is heavy tailed. This makes it impossible to predefine the number of groups and requiring the model to form new groups when data points from previously unseen groups are observed.
diff --git a/tutorials/07-poisson-regression/07_poisson-regression.jmd b/tutorials/07-poisson-regression/07_poisson-regression.jmd
index c11307ff8..7a373d973 100644
--- a/tutorials/07-poisson-regression/07_poisson-regression.jmd
+++ b/tutorials/07-poisson-regression/07_poisson-regression.jmd
@@ -1,6 +1,7 @@
 ---
 title: Bayesian Poisson Regression
 permalink: /:collection/:name/
+redirect_from: tutorials/7-poissonregression/
 ---
 
 This notebook is ported from the [example notebook](https://docs.pymc.io/notebooks/GLM-poisson-regression.html) of PyMC3 on Poisson Regression.
diff --git a/tutorials/08-multionomial-regression/08_multinomial-logistic-regression.jmd b/tutorials/08-multionomial-regression/08_multinomial-logistic-regression.jmd
index 9c7713d33..fe52647a5 100644
--- a/tutorials/08-multionomial-regression/08_multinomial-logistic-regression.jmd
+++ b/tutorials/08-multionomial-regression/08_multinomial-logistic-regression.jmd
@@ -1,6 +1,7 @@
 ---
 title: Bayesian Multinomial Logistic Regression
 permalink: /:collection/:name/
+redirect_from: tutorials/8-multinomiallogisticregression/
 ---
 
 [Multinomial logistic regression](https://en.wikipedia.org/wiki/Multinomial_logistic_regression) is an extension of logistic regression. Logistic regression is used to model problems in which there are exactly two possible discrete outcomes. Multinomial logistic regression is used to model problems in which there are two or more possible discrete outcomes.
diff --git a/tutorials/09-variational-inference/09_variational-inference.jmd b/tutorials/09-variational-inference/09_variational-inference.jmd
index 624adba51..7da86300d 100644
--- a/tutorials/09-variational-inference/09_variational-inference.jmd
+++ b/tutorials/09-variational-inference/09_variational-inference.jmd
@@ -1,6 +1,7 @@
 ---
 title: Variational inference (VI) in Turing.jl
 permalink: /:collection/:name/
+redirect_from: tutorials/9-variationalinference/
 ---
 
 In this post we'll have a look at what's know as **variational inference (VI)**, a family of _approximate_ Bayesian inference methods, and how to use it in Turing.jl as an alternative to other approaches such as MCMC. In particular, we will focus on one of the more standard VI methods called **Automatic Differentation Variational Inference (ADVI)**.
diff --git a/tutorials/10-bayesian-differential-equations/10_bayesian-differential-equations.jmd b/tutorials/10-bayesian-differential-equations/10_bayesian-differential-equations.jmd
index e716046e2..fef24c161 100644
--- a/tutorials/10-bayesian-differential-equations/10_bayesian-differential-equations.jmd
+++ b/tutorials/10-bayesian-differential-equations/10_bayesian-differential-equations.jmd
@@ -1,6 +1,7 @@
 ---
 title: Bayesian Estimation of Differential Equations
 permalink: /:collection/:name/
+redirect_from: tutorials/10-bayesiandiffeq/
 ---
 
 Most of the scientific community deals with the basic problem of trying to mathematically model the reality around them and this often involves dynamical systems. The general trend to model these complex dynamical systems is through the use of differential equations. Differential equation models often have non-measurable parameters. The popular “forward-problem” of simulation consists of solving the differential equations for a given set of parameters, the “inverse problem” to simulation, known as parameter estimation, is the process of utilizing data to determine these model parameters. Bayesian inference provides a robust approach to parameter estimation with quantified uncertainty.
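Each hunk above adds a single `redirect_from` entry to a tutorial's YAML front matter so that the old tutorial URLs keep resolving to the renamed pages. As a minimal illustrative sketch (not part of the patch), the front matter of `markdown/00-introduction/00_introduction.md` after the first hunk would read as below; the key only takes effect if the site loads the `jekyll-redirect-from` plugin, which this diff does not show.

```yaml
---
# Resulting front matter after the first hunk; the redirect_from value is the
# tutorial's previous URL path, which the redirect plugin serves as a redirect
# to the page generated from the permalink pattern.
redirect_from: "tutorials/0-introduction/"
title: "Introduction to Turing"
permalink: "/:collection/:name/"
---
```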