K-FAC

An implementation of K-FAC, based on the original paper: https://arxiv.org/pdf/1503.05671.pdf

K-FAC is short for "Kronecker-factored Approximate Curvature". It optimizes neural networks using a block-diagonal approximation to the Fisher information matrix required by the natural gradient algorithm, and it converges in far fewer iterations than first-order optimizers such as SGD and Adam. The code for the optimizer is in the file K_FAC.
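Concretely, for a fully connected layer with input activations a and pre-activation gradients g, the paper approximates that layer's Fisher block as a Kronecker product of two small matrices, which makes the natural-gradient update cheap to apply (the form below follows the paper, up to damping and the choice of vec convention):

$$
F \approx A \otimes G, \qquad A = \mathbb{E}\!\left[a\,a^{\top}\right], \qquad G = \mathbb{E}\!\left[g\,g^{\top}\right], \qquad \Delta W = -\eta\, G^{-1} \left(\nabla_{W} L\right) A^{-1}.
$$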

So far, only feedforward neural networks are supported. If you are training an MLP, please give it a try!

This implementation is simple to use: you don't need to construct the neural network yourself. By specifying the activation type and size of each layer, the optimizer constructs the network internally (see Example).
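A minimal usage sketch is shown below. The names `K_FAC`, `layer_sizes`, `activations`, and `train` are illustrative assumptions, not the repository's documented API; please consult the Example file for the actual interface.

```python
import numpy as np

# Hypothetical sketch: K_FAC, layer_sizes, activations and train are assumed
# names for illustration only; see the Example file for the real interface.
from K_FAC import K_FAC

# Dummy classification data: 1000 samples, 784 features, 10 one-hot classes.
X_train = np.random.rand(1000, 784)
Y_train = np.eye(10)[np.random.randint(0, 10, size=1000)]

# Describe the MLP by the size and activation of each layer;
# the optimizer constructs the network internally.
optimizer = K_FAC(layer_sizes=[784, 200, 100, 10],
                  activations=['relu', 'relu', 'softmax'])

optimizer.train(X_train, Y_train, epochs=20)
```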

The file optimize_with_adam serves as a performance baseline for K-FAC. It performs exactly the same classification task with an Adam optimizer and shares the same backpropagation code as K-FAC, so the two differ only in the update applied after backpropagation.
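To make that contrast concrete, here is a minimal NumPy sketch (not the repository's code) of the kind of per-layer preconditioning K-FAC applies to a backpropagated weight gradient; Adam would instead rescale the same gradient with running first- and second-moment estimates.

```python
import numpy as np

def kfac_layer_step(grad_W, a_prev, g, damping=1e-2, lr=1e-2):
    """Illustrative K-FAC update for one fully connected layer.

    grad_W : (out, in)   backpropagated gradient of the loss w.r.t. the weights
    a_prev : (batch, in)  activations entering the layer
    g      : (batch, out) gradients of the loss w.r.t. the pre-activations
    """
    batch = a_prev.shape[0]
    # Kronecker factors of the layer's Fisher block: F ≈ A ⊗ G
    A = a_prev.T @ a_prev / batch                  # (in, in)
    G = g.T @ g / batch                            # (out, out)
    # Tikhonov damping keeps both factors invertible
    A += damping * np.eye(A.shape[0])
    G += damping * np.eye(G.shape[0])
    # (A ⊗ G)^{-1} vec(grad_W) corresponds to G^{-1} grad_W A^{-1}
    precond = np.linalg.solve(G, grad_W) @ np.linalg.inv(A)
    return -lr * precond
```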

I'm continuing to improve the implementation. If you have any suggestions, please feel free to contact me at [email protected]
