Here we show how to implement various linear regression techniques in TensorFlow. The first two sections cover solving linear regression with standard matrix methods, and the remaining six sections show how to implement several variants of regression using computational graphs in TensorFlow.
- Solving a 2D linear regression with a matrix inverse in TensorFlow.
- Solving a 2D linear regression with Cholesky decomposition.
- Linear regression iterating through a computational graph with L2 Loss.
- L2 vs L1 loss in linear regression. We talk about the benefits and limitations of both.
- Deming (total) regression implemented in TensorFlow by changing the loss function.
- Lasso and Ridge regression regularize the model coefficients. We implement both in TensorFlow by changing the loss function.
- Elastic net is a regularization technique that combines the L2 and L1 loss for coefficients. We show how to implement this in TensorFlow.
- We implement logistic regression by adding a sigmoid activation function to our computational graph.
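The matrix-inverse method above solves the normal equations w = (AᵀA)⁻¹Aᵀy directly. As a framework-agnostic preview of that math (the TensorFlow version in the section itself uses TensorFlow's matrix-multiply and matrix-inverse ops), here is a minimal NumPy sketch; the synthetic data and variable names are illustrative only:

```python
import numpy as np

# Hypothetical 1-D data: y ≈ 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 50)

# Design matrix with a bias column of ones
A = np.column_stack([x, np.ones_like(x)])

# Normal equations: w = (A^T A)^{-1} A^T y
w = np.linalg.inv(A.T @ A) @ A.T @ y
slope, intercept = w
```

Inverting AᵀA explicitly is fine for a 2×2 system like this, but it is numerically the least stable of the approaches listed above.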
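The Cholesky approach factors the symmetric positive-definite matrix AᵀA as LLᵀ and replaces one matrix inversion with two cheap triangular solves. A NumPy sketch of the same idea (illustrative data; the section implements it with TensorFlow's Cholesky and solve ops):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 50)
y = 3.0 * x - 2.0 + rng.normal(0, 0.1, 50)
A = np.column_stack([x, np.ones_like(x)])

# Cholesky factorization of the normal matrix: A^T A = L L^T
L = np.linalg.cholesky(A.T @ A)

# Two triangular solves instead of an explicit inverse:
z = np.linalg.solve(L, A.T @ y)    # forward solve: L z = A^T y
w = np.linalg.solve(L.T, z)        # back solve:    L^T w = z
slope, intercept = w
```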
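The iterative approach minimizes the L2 (mean squared error) loss by gradient descent, which is what a TensorFlow computational graph with an optimizer does under the hood. A hand-rolled NumPy sketch of that loop, with hypothetical data and hand-derived gradients:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0, 1, 200)
y = 1.5 * x + 0.5 + rng.normal(0, 0.1, 200)

slope, intercept = 0.0, 0.0
lr = 0.05
for _ in range(500):
    err = slope * x + intercept - y
    # Gradients of the mean squared error w.r.t. slope and intercept
    slope -= lr * 2 * np.mean(err * x)
    intercept -= lr * 2 * np.mean(err)
```

In TensorFlow these gradients are computed automatically from the loss node rather than written out by hand.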
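The L2-vs-L1 trade-off comes down to how each loss treats large residuals: squaring magnifies outliers, while the absolute value grows only linearly, making L1 more robust (at the cost of a non-smooth gradient at zero). A tiny illustration with made-up numbers:

```python
import numpy as np

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 8.0])   # last point is an outlier

residual = y_pred - y_true
l2 = np.mean(residual ** 2)      # the squared term magnifies the outlier
l1 = np.mean(np.abs(residual))   # linear in the error, so more robust
```

Here the single outlier dominates the L2 loss but contributes proportionally to the L1 loss.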
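Deming (total) regression changes the loss from vertical distance to perpendicular distance from each point to the line, |y − (ax + b)| / √(1 + a²). When both variables have equal error variance this is equivalent to total least squares, which has a closed form via the principal components of the centered data. A NumPy sketch of that equivalent solution (the section itself instead changes the TensorFlow loss function directly; data here is synthetic):

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.linspace(0, 5, 100) + rng.normal(0, 0.05, 100)
y = 2.0 * x + 1.0 + rng.normal(0, 0.05, 100)

# Total least squares: the fitted line's normal direction is the
# eigenvector of the smallest eigenvalue of the data covariance
xc, yc = x - x.mean(), y - y.mean()
cov = np.cov(np.vstack([xc, yc]))
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
nx, ny = eigvecs[:, 0]                   # normal to the line
slope = -nx / ny
intercept = y.mean() - slope * x.mean()
```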
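Ridge adds an L2 penalty λ‖w‖² to the loss and still has a closed form; Lasso adds an L1 penalty λ‖w‖₁, which has no closed form and is commonly solved iteratively. A NumPy sketch of both on hypothetical data — the Lasso part uses proximal gradient descent (soft-thresholding), one standard choice rather than the only one:

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 50)
A = np.column_stack([x, np.ones_like(x)])
lam = 1.0

# Ridge has a closed form: w = (A^T A + lam*I)^{-1} A^T y
w_ridge = np.linalg.solve(A.T @ A + lam * np.eye(2), A.T @ y)

# Lasso via proximal gradient descent with soft-thresholding
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

w = np.zeros(2)
lr = 1.0 / np.linalg.norm(A.T @ A, 2)  # step size from the largest singular value
for _ in range(1000):
    grad = A.T @ (A @ w - y)           # gradient of the squared-error term
    w = soft_threshold(w - lr * grad, lr * lam)
w_lasso = w
```

In the TensorFlow recipes, both penalties are simply added as extra terms to the loss node.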
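Elastic net simply adds both penalty terms to the loss: MSE + λ₁‖w‖₁ + λ₂‖w‖². A NumPy gradient-descent sketch on synthetic data, using the sign function as the L1 subgradient (penalizing the slope only, a simplifying choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(0, 1, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 200)

l1_pen, l2_pen = 0.01, 0.01
slope, intercept = 0.0, 0.0
lr = 0.05
for _ in range(1000):
    err = slope * x + intercept - y
    # Data-term gradient plus L1 subgradient plus L2 gradient on the slope
    g_slope = 2 * np.mean(err * x) + l1_pen * np.sign(slope) + 2 * l2_pen * slope
    g_int = 2 * np.mean(err)
    slope -= lr * g_slope
    intercept -= lr * g_int
```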
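Logistic regression passes the linear model's output through a sigmoid so predictions become probabilities, then minimizes the cross-entropy loss. A NumPy sketch of that loop on hypothetical binary data (labels are 1 exactly when x > 0, so the learnable boundary is at zero):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(0, 2, 300)
labels = (x > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b = 0.0, 0.0
lr = 0.5
for _ in range(1000):
    p = sigmoid(w * x + b)
    # Gradient of the mean cross-entropy loss w.r.t. w and b
    w -= lr * np.mean((p - labels) * x)
    b -= lr * np.mean(p - labels)

accuracy = np.mean((sigmoid(w * x + b) > 0.5) == (labels == 1))
```

In TensorFlow the same model is a matrix multiply feeding a sigmoid cross-entropy loss, with the gradients handled by the graph.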