Code for Minimal Gated Unit (MGU)

This repository contains code and experiments for the Minimal Gated Unit (MGU) described in the paper "Minimal gated unit for recurrent neural networks" [1]. For any problem concerning the code, please feel free to contact Mr. Chen-Lin Zhang ([email protected]).

These packages are free for academic use; you run them at your own risk. For other purposes, please contact Prof. Jianxin Wu ([email protected]).

Operating System:

Ubuntu Linux 14.04

Requirements:

Python 2.7 (Anaconda is preferred)

Theano

Lasagne

GPU with CUDA Support (Optional)

Hints: The code was originally developed with Theano v0.7 and Lasagne v0.1, and it has also been tested with Theano v0.9 and Lasagne v0.2dev1; both setups work.

The MGU code is in MGULayer.py; the rest of the files contain utility code for performing the experiments.
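
The MGU uses a single (forget) gate that controls both the candidate hidden state and the interpolation with the previous state. Below is a minimal NumPy sketch of one MGU time step, following the update equations in [1]. It is for illustration only, not the Theano/Lasagne code in MGULayer.py, and the parameter names (W_f, U_f, b_f, W_h, U_h, b_h) are chosen here just for clarity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mgu_step(x_t, h_prev, W_f, U_f, b_f, W_h, U_h, b_h):
    # One MGU step: a single forget gate f_t gates both the candidate
    # state and the mix between the previous and candidate hidden states.
    f_t = sigmoid(np.dot(W_f, x_t) + np.dot(U_f, h_prev) + b_f)            # forget gate
    h_tilde = np.tanh(np.dot(W_h, x_t) + np.dot(U_h, f_t * h_prev) + b_h)  # candidate state
    h_t = (1.0 - f_t) * h_prev + f_t * h_tilde                             # new hidden state
    return h_t
```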

Perform the experiments in the paper

First, install Lasagne (http://lasagne.readthedocs.io/en/latest/user/installation.html) on your computer.
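
For example, one common way is through pip (this is only a suggestion; the exact commands depend on your environment, and the versions below simply match those mentioned in the hints above):

pip install Theano==0.9.0

pip install Lasagne==0.1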

This repository contains three experiments: the adding problem, MNIST, and IMDB. For each problem, enter the corresponding folder and run

python IRNN_gru_2014.py

to perform the experiment with the GRU layer.

and you can run

python IRNN_gru_2015.py

to perform the experiment with our MGU layer.

[1] G.-B. Zhou, J.-X. Wu, C.-L. Zhang, Z.-H. Zhou. Minimal gated unit for recurrent neural networks. International Journal of Automation and Computing, 2016, 13(3): 226-234.