Releases: denosaurs/netsaur
0.2.11
0.2.10
0.2.9
0.2.8
What's Changed
- feat: convtranspose2d & binary crossentropy by @CarrotzRule123 in #30
- fix: batches by @CarrotzRule123 in #31
- feat: batchnorm1d layer by @CarrotzRule123 in #32
- feat: optimizers by @CarrotzRule123 in #33
- chore: bump version by @load1n9 in #34
Full Changelog: 0.2.7...0.2.8
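The binary cross-entropy loss added in #30 can be sketched in plain TypeScript. This is illustrative math only, not netsaur's API; the function name `binaryCrossEntropy` and the epsilon clamp are assumptions for the example:

```typescript
// Illustrative binary cross-entropy, NOT netsaur's internal code.
// BCE(y, p) = -(1/N) * sum_i [ y_i * ln(p_i) + (1 - y_i) * ln(1 - p_i) ]
function binaryCrossEntropy(yTrue: number[], yPred: number[]): number {
  const eps = 1e-12; // clamp predictions to avoid ln(0)
  let sum = 0;
  for (let i = 0; i < yTrue.length; i++) {
    const p = Math.min(Math.max(yPred[i], eps), 1 - eps);
    sum += yTrue[i] * Math.log(p) + (1 - yTrue[i]) * Math.log(1 - p);
  }
  return -sum / yTrue.length;
}
```

For a confident correct prediction such as `binaryCrossEntropy([1, 0], [0.9, 0.1])`, both terms contribute `ln(0.9)`, giving a small loss near 0.105.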
0.2.7
What's Changed
- feat: dropout layer by @load1n9 in #28
- feat: batch normalization layer by @CarrotzRule123 in #29
Full Changelog: 0.2.6...0.2.7
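The dropout layer from #28 follows a standard pattern that can be sketched as inverted dropout. This is a generic illustration, not netsaur's implementation; the `dropout` function signature here is hypothetical:

```typescript
// Illustrative inverted dropout, NOT netsaur's implementation.
// During training each unit is zeroed with probability `rate`; survivors
// are scaled by 1/(1 - rate) so the expected activation stays unchanged,
// and inference needs no rescaling.
function dropout(input: number[], rate: number, training = true): number[] {
  if (!training || rate === 0) return input.slice();
  const keep = 1 - rate;
  return input.map((x) => (Math.random() < keep ? x / keep : 0));
}
```

At inference time (`training = false`) the input passes through untouched, which is why the scaling is done during training rather than at prediction.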
0.2.6
Note: This is a patch release for 0.2.5.
What's Changed
- fix: wasm compile async by @CarrotzRule123 in #26
- chore: bump version by @CarrotzRule123 in #27
Full Changelog: 0.2.5...0.2.6
0.2.5
What's Changed
- feat: convolution backpropagation + initial model implementation by @load1n9 in #21
- feat: flatten layer + pooling backpropagation by @load1n9 in #24
- fix: web compat by @CarrotzRule123 in #25
Full Changelog: 0.2.4...0.2.5
0.2.4
What's Changed
- feat(rust): new layer structures by @load1n9 in #16
- feat: pool & conv feedforward by @CarrotzRule123 in #18
Full Changelog: 0.2.3...0.2.4
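The pooling feedforward from #18 can be illustrated with a minimal 2D max-pool over a non-overlapping window. This is a sketch of the general technique, not netsaur's code; `maxPool2d` and its signature are assumptions for the example:

```typescript
// Illustrative 2D max-pooling forward pass, NOT netsaur's implementation.
// Slides a non-overlapping size×size window over the input and keeps the
// maximum value in each window, shrinking both dimensions by `size`.
function maxPool2d(input: number[][], size: number): number[][] {
  const rows = Math.floor(input.length / size);
  const cols = Math.floor(input[0].length / size);
  const out: number[][] = [];
  for (let r = 0; r < rows; r++) {
    const row: number[] = [];
    for (let c = 0; c < cols; c++) {
      let max = -Infinity;
      for (let i = 0; i < size; i++) {
        for (let j = 0; j < size; j++) {
          max = Math.max(max, input[r * size + i][c * size + j]);
        }
      }
      row.push(max);
    }
    out.push(row);
  }
  return out;
}
```

For example, pooling `[[1, 2], [3, 4]]` with a 2×2 window yields `[[4]]`; the backward pass added in #24 routes the gradient only to the position that held each maximum.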
0.2.3
What's Changed
- feat(rust): activation layers by @CarrotzRule123 in #15
Full Changelog: 0.2.2...0.2.3
0.2.2
What's Changed
- feat(rust): init strategies by @CarrotzRule123 in #13
- feat: docs by @load1n9 in #14
Full Changelog: 0.2.1...0.2.2