
vbmfa: Variational Bayesian Mixture of Factor Analysers

Variational Bayesian Mixture of Factor Analysers for dimensionality reduction and clustering.

Factor analysis (FA) is a method for dimensionality reduction, similar to principal component analysis (PCA), singular value decomposition (SVD), and independent component analysis (ICA). Applications include visualization, image compression, and feature learning. A mixture of factor analysers consists of several factor analysers and allows both dimensionality reduction and clustering. Variational Bayesian learning of the model parameters prevents overfitting compared with maximum likelihood methods such as expectation maximization (EM), and makes it possible to learn the dimensionality of the lower-dimensional subspace by automatic relevance determination (ARD). A detailed explanation of the model can be found in the references below.
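As an illustration of the model itself (not the vbmfa API; all names below are for exposition only), a single factor analyser assumes each observation x is generated from a low-dimensional latent factor z via a loading matrix, plus diagonal Gaussian noise:

```python
import numpy as np

rng = np.random.default_rng(0)

p, q, n = 10, 2, 500               # observed dim, latent dim, number of samples
Lambda = rng.normal(size=(p, q))   # factor loading matrix
mu = rng.normal(size=p)            # data mean
psi = 0.1 * np.ones(p)             # diagonal noise variances

z = rng.normal(size=(n, q))                      # latent factors ~ N(0, I)
eps = rng.normal(size=(n, p)) * np.sqrt(psi)     # independent sensor noise
x = z @ Lambda.T + mu + eps                      # observed data, shape (n, p)

# Marginally, x ~ N(mu, Lambda Lambda^T + diag(psi)):
cov_model = Lambda @ Lambda.T + np.diag(psi)
```

Fitting FA means recovering Lambda, mu, and psi from x alone; the variational Bayesian treatment additionally places priors on the loadings so that unneeded latent dimensions are pruned away (ARD).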

Note

The current version is still under development and needs to be optimized for large-scale data sets. I am open to any suggestions and happy to receive bug reports!

Installation

The easiest way to install vbmfa is to use PyPI:

pip install vbmfa

Alternatively, you can check out the repository from GitHub:

git clone https://github.com/cangermueller/vbmfa.git

Examples

The folder examples/ contains example IPython notebooks:

  • VbFa, a single Variational Bayesian Factor Analyser
  • VbMfa, a mixture of Variational Bayesian Factor Analysers
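To make the mixture idea concrete, here is a minimal sketch (again, an illustration of the generative process, not the package's API) of sampling from a hypothetical two-component mixture of factor analysers, where each observation is drawn from one of two local linear-Gaussian models:

```python
import numpy as np

rng = np.random.default_rng(1)

p, q, n = 5, 1, 300
pi = np.array([0.4, 0.6])                 # mixing proportions
mus = rng.normal(scale=3.0, size=(2, p))  # per-component means
Lambdas = rng.normal(size=(2, p, q))      # per-component loading matrices
psi = 0.05 * np.ones(p)                   # shared diagonal noise variances

s = rng.choice(2, size=n, p=pi)           # latent component assignments
z = rng.normal(size=(n, q))               # latent factors ~ N(0, I)

# Each point uses the loadings and mean of its assigned component:
x = (np.einsum('ipq,iq->ip', Lambdas[s], z)
     + mus[s]
     + rng.normal(size=(n, p)) * np.sqrt(psi))
```

Inference then reverses this process: recover the component assignments (clustering) and the per-component subspaces (dimensionality reduction) from x alone, which is what the VbMfa notebook demonstrates.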

References

[1] Ghahramani, Zoubin, and Matthew J. Beal. "Variational Inference for Bayesian Mixtures of Factor Analysers." NIPS, 1999.
[2] Bishop, Christopher M. "Variational Principal Components." 1999.
[3] Beal, Matthew J. "Variational Algorithms for Approximate Bayesian Inference." PhD thesis, University College London, 2003.

Contact

Christof Angermueller

https://github.com/cangermueller
