Code for the ICML 2020 paper "Handling the Positive-Definite Constraint in the Bayesian Learning Rule"
We propose an efficient Riemannian/natural-gradient variational inference method.
- Updates:
- Added slides and a YouTube link for the ICML talk
- Added a Python implementation of the implicit reparameterization gradient for the inverse Gaussian distribution (See Appendix H.1)
- Added a Matlab implementation of the Gaussian approximation (See Appendix E)
- Added a Matlab implementation of the MoG approximation (See Appendix J); examples: star, double-banana, 2-D Laplace, BNN, mixture of Student's t distributions
- Added a Matlab implementation of the Gamma approximation (See Appendix F): Gamma factor model
- To add a Python implementation of the Adam-like update using a factorized/diagonal Gaussian approximation (See Appendix E.3)
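The implicit reparameterization gradient mentioned in the list above can be sketched as follows for the inverse Gaussian distribution IG(mu, lam): a sample z with fixed base noise u satisfies F(z; mu, lam) = u, so differentiating the identity gives dz/dmu = -(dF/dmu) / q(z; mu, lam), using the closed-form CDF of the inverse Gaussian. This is a minimal illustrative sketch, not the repository's actual code; the function names are ours, and it only differentiates with respect to the mean mu.

```python
import numpy as np
from scipy.stats import norm, invgauss

def ig_pdf(x, mu, lam):
    """Density of the inverse Gaussian IG(mu, lam)."""
    return np.sqrt(lam / (2.0 * np.pi * x**3)) * np.exp(
        -lam * (x - mu)**2 / (2.0 * mu**2 * x))

def ig_cdf(x, mu, lam):
    """Closed-form CDF of IG(mu, lam)."""
    s = np.sqrt(lam / x)
    return (norm.cdf(s * (x / mu - 1.0))
            + np.exp(2.0 * lam / mu) * norm.cdf(-s * (x / mu + 1.0)))

def implicit_grad_mu(z, mu, lam):
    """Implicit reparameterization gradient dz/dmu = -(dF/dmu) / q(z).

    dF/dmu follows from differentiating ig_cdf in mu by hand.
    """
    a = np.sqrt(lam / z) * (z / mu - 1.0)
    b = -np.sqrt(lam / z) * (z / mu + 1.0)
    s = np.sqrt(lam * z)          # sqrt(lam/z) * z
    e = np.exp(2.0 * lam / mu)
    dF_dmu = (-norm.pdf(a) * s / mu**2
              - (2.0 * lam / mu**2) * e * norm.cdf(b)
              + e * norm.pdf(b) * s / mu**2)
    return -dF_dmu / ig_pdf(z, mu, lam)

# Sanity check against a finite difference of the quantile function
# (scipy's invgauss uses mu_scipy = mu / lam and scale = lam).
mu, lam, u, h = 1.5, 2.0, 0.3, 1e-5
z = invgauss.ppf(u, mu / lam, scale=lam)
num = (invgauss.ppf(u, (mu + h) / lam, scale=lam)
       - invgauss.ppf(u, (mu - h) / lam, scale=lam)) / (2.0 * h)
print(implicit_grad_mu(z, mu, lam), num)  # the two values should agree closely
```

The same recipe extends to the shape parameter lam by differentiating the CDF in lam instead; no explicit inverse CDF is ever needed, which is the point of the implicit approach.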