v0.4.1
The major change in this version is support for RNN layers.
New RNN layers
- Support options: `return_sequence`, `stateful` and `go_backwards` (see the Keras sketch below).
- Support RNN cells: Simple RNN, LSTM and GRU.
- GRU and LSTM run on 16-bit internally.
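As a hedged illustration (the layer sizes and input shape below are assumptions for demonstration, not taken from this release), the options above mirror their Keras arguments, so a convertible model can be sketched like this and then passed through NNoM's `generate_model()` script:

```python
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, GRU, Dense

# Hypothetical input shape: 128 timesteps x 9 channels (UCI-HAR-like data).
model = Sequential([
    LSTM(32, return_sequences=True, go_backwards=False,  # the newly supported options
         input_shape=(128, 9)),                          # (stateful=True would need a fixed batch size)
    GRU(16, return_sequences=False),                     # GRU/LSTM run on 16-bit internally in NNoM
    Dense(6, activation='softmax'),
])

# Usual NNoM conversion step: quantise and emit weights.h for the C runtime.
# x_test should be representative calibration data; random data here is for illustration only.
from nnom import generate_model
x_test = np.random.rand(100, 128, 9).astype('float32')
generate_model(model, x_test, name='weights.h')
```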
New Activations
- AdvanceReLU -> equivalent to Keras ReLU with any argument(s), i.e. `slope`, `threshold` and `max`. Also covers the predefined ReLU variants such as ReLU6 (see the Keras sketch below).
- Hard TanH and Hard Sigmoid -> currently only in backend.
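For reference, these correspond to the arguments of the standard Keras `ReLU` layer; the argument names below are Keras's (`negative_slope`, `threshold`, `max_value`), and their mapping to `slope`/`threshold`/`max` is an assumption based on the description above:

```python
from tensorflow.keras.layers import ReLU

# Keras ReLU with all three arguments set; AdvanceReLU is described as
# covering any combination of these.
adv = ReLU(max_value=6.0, negative_slope=0.1, threshold=0.5)

# ReLU6 is simply the predefined special case max_value=6.
relu6 = ReLU(max_value=6.0)
```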
New Examples
- New `uci-har-rnn` example demonstrates the usage of the new RNN layers.
- Added an RNN model to the `kws` example.
Minor:
- Completed the Q format calculation for input/output tensors. Added a new method `model_io_format()` to print each layer's I/O info.
Bugs:
- Fixed concatenate on 2D input.
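As a minimal sketch of the case this fix concerns (the release does not state the exact failing configuration, so the shapes here are assumptions), a model concatenating two 2-D tensors looks like this:

```python
from tensorflow.keras.layers import Input, Dense, Concatenate
from tensorflow.keras.models import Model

# Two branches producing 2-D tensors (batch, features), then concatenated.
x = Input(shape=(32,))
a = Dense(16, activation='relu')(x)
b = Dense(16, activation='relu')(x)
y = Concatenate()([a, b])   # concatenation on 2-D input, the case addressed by this fix
model = Model(x, y)
model.summary()
```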