luamlp

Luamlp is a module written in Lua that implements a multilayer perceptron with the back-propagation algorithm in a generic way. You can set the number of hidden layers to 1 or 2 (more than two hidden layers is not supported), and you can set the number of neurons in each layer. The input and output data (for training) must be formatted as two-dimensional arrays:

Data_input = {{x1}, ..., {xn}} or {{x1, x2}, ..., {xn-1, xn}}

Data_output = {{y1}, ..., {yn}} or {{y1, y2}, ..., {yn-1, yn}}
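
For example, the XOR training set used later in this README fits this format (each inner table is one sample):

-- XOR training data: two inputs per sample, one output per sample
data_input = {{0,0}, {0,1}, {1,0}, {1,1}}
data_output = {{0}, {1}, {1}, {0}}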

DEPENDENCIES

Lua installed on the system.
Tested with Lua 5.1 and 5.2
Linux/Ubuntu: sudo apt-get install lua5.2
or
sudo apt-get install lua5.1

FUNCTIONS | RETURN

Narray(nl, nc, value) -> table
Luamlp:New(ni, nh1, nh2, nout) -> number
Luamlp:Config(lr, it, bias, ert, mm, mt, fx, dfx) -> table
Luamlp:LoadConfig(name(optional)) -> None
Luamlp:Training(print_error) -> None
Luamlp:Propagation(x) -> table
Luamlp:Backpropagation(y) -> number
Luamlp:Test(dinput) -> None
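
Judging by its signature alone, Narray looks like a helper that builds an nl-by-nc table pre-filled with value. A minimal sketch of such a helper, assuming that behavior (the real code is in luamlp.lua):

-- assumed behavior: build an nl x nc table filled with `value`
function Narray(nl, nc, value)
  local t = {}
  for i = 1, nl do
    t[i] = {}
    for j = 1, nc do
      t[i][j] = value
    end
  end
  return t
end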

Files:

luamlp.lua: source code
test_luamlp.lua: test file for luamlp.lua
config.luamlp: optional file with the configuration parameters for luamlp.lua

Example of use:

The XOR Problem: http://home.agh.edu.pl/~vlsi/AI/xor_t/en/main.htm

Run test_luamlp.lua:

In a terminal, enter the directory where Luamlp is located and type:

$ lua5.2 test_luamlp.lua

Solution using Luamlp

Details of the XOR example:

| input | output |
|  0,0  |   0    |
|  0,1  |   1    |
|  1,0  |   1    |
|  1,1  |   0    |

-- load the module into a variable
luamlp = require 'luamlp'

-- create a neural network with 2 neurons in the input layer,
-- 2 neurons in hidden layer 1, 0 neurons in hidden layer 2
-- and 1 neuron in the output layer
mlp = luamlp:New(2, 2, 0, 1)

-- insert the input and output into the neural network
mlp.input = {{0,0}, {0,1}, {1,0}, {1,1}}
mlp.output = {{0}, {1}, {1}, {0}}

-- configure the parameters of the mlp:
-- 0.3 = learning rate, 10000 = max. iterations,
-- 1 = value for the bias, 0.01 = target error in training,
-- 0.003 = rate of the momentum term
mlp:Config(0.3, 10000, 1, 0.01, 0.003)

-- execute training; the parameter true displays the error at each iteration
mlp:Training(true)

-- test the learning
mlp:Test(mlp.input)
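
After training you can presumably also call Propagation directly on a single sample; a minimal sketch, assuming it takes one input vector and returns the output table listed in FUNCTIONS above:

-- assumption: Propagation takes one input vector and returns a table of outputs
out = mlp:Propagation({0, 1})
print(out[1])  -- expected to be close to 1 after successful training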


Using a configuration file, config.luamlp

You can use a special file that contains a table whose keys are the parameters of the neural network.

In the config.luamlp file you can define activation functions and other options to modify the behavior of the neural network without modifying the original source. The filename 'config.luamlp' is the default; if it suits your application, simply call the function:

mlp:LoadConfig()

after creating a neural network. If you use another file, such as 'load.txt',
just pass a string with the file name to LoadConfig, for example:

mlp:LoadConfig('load.txt')
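
A minimal sketch of what config.luamlp might look like, assuming the file returns a table keyed by the parameter names of Config above; the exact key names are an assumption, check luamlp.lua for the ones it actually reads:

-- config.luamlp (hypothetical sketch; key names assumed from the Config signature)
return {
  lr   = 0.3,   -- learning rate
  it   = 10000, -- max. iterations
  bias = 1,     -- value for the bias
  ert  = 0.01,  -- target error in training
  mm   = 0.003, -- rate of the momentum term
  -- custom activation function and its derivative (sigmoid shown as an example)
  fx   = function(x) return 1 / (1 + math.exp(-x)) end,
  dfx  = function(x) local s = 1 / (1 + math.exp(-x)) return s * (1 - s) end,
}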
