Commit
refactor: Various stylistic and refactoring changes to bring the project up to par.
vxern committed Apr 20, 2022
1 parent 86fd83f commit ce5e5dc
Showing 16 changed files with 392 additions and 304 deletions.
7 changes: 7 additions & 0 deletions CHANGELOG.md
@@ -1,3 +1,10 @@
# 0.4.3

- Bumped version of `sprint` from `1.0.2+3` to `1.0.3`.
- Updated repository, homepage and issue tracker links.
- Refactored and made formatting and style changes to bring the project up to
par.

# 0.4.2+1

- Updated package description.
43 changes: 29 additions & 14 deletions README.md
@@ -1,6 +1,8 @@
# synadart

The `synadart` library can be used to create neural networks of any complexity, as well as learn from the source code by studying its extremely clean implementation.
The `synadart` library can be used to create neural networks of any complexity,
as well as learn from the source code by studying its extremely clean
implementation.

## Launching our first network

@@ -10,25 +12,31 @@ To begin using the `synadart`, we must first import it into our project:
import 'package:synadart/synadart.dart';
```

Next, we must create a network of our chosen type. Let's create a sequential network, in which every layer has one input and one output tensor. This should be pretty easy:
Next, we must create a network of our chosen type. Let's create a sequential
network, in which every layer has one input and one output tensor. This should
be pretty easy:

```dart
final network = Sequential(learningRate: 0.3);
```

Our network is currently empty; it contains no layers and therefore no neurons. Let's add three layers; the input layer, one hidden layer and the output layer:
Our network is currently empty; it contains no layers and therefore no neurons.
Let's add three layers; the input layer, one hidden layer and the output layer:

```dart
network.addLayers([
Dense(15, activationAlgorithm: ActivationAlgorithm.sigmoid),
Dense(5, activationAlgorithm: ActivationAlgorithm.sigmoid),
Dense(1, activationAlgorithm: ActivationAlgorithm.sigmoid),
Dense(15, activationAlgorithm: ActivationAlgorithm.sigmoid),
Dense(5, activationAlgorithm: ActivationAlgorithm.sigmoid),
Dense(1, activationAlgorithm: ActivationAlgorithm.sigmoid),
]);
```
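
As an aside, the refactored `example/example.dart` later in this commit passes
the layers straight to the `Sequential` constructor through its `layers`
parameter, using a `Dense(size: ..., activation: ...)` constructor shape. A
minimal sketch of that alternative, assuming the same API as that example:

```dart
import 'package:synadart/src/layers/core/dense.dart';
import 'package:synadart/synadart.dart';

// Equivalent setup with the layers supplied up front instead of being added
// afterwards via `addLayers` (constructor shape taken from example/example.dart).
final network = Sequential(learningRate: 0.3, layers: [
  Dense(size: 15, activation: ActivationAlgorithm.sigmoid),
  Dense(size: 5, activation: ActivationAlgorithm.sigmoid),
  Dense(size: 1, activation: ActivationAlgorithm.sigmoid),
]);
```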

Now that our network has some structure to it, we can begin using it... No, not quite yet. Our network is still not trained, and has no clue what it is doing. Time to train it.
Now that our network has some structure to it, we can begin using it... No, not
quite yet. Our network is still not trained, and has no clue what it is doing.
Time to train it.

Firstly, we will create a list of *expected* values, i.e. *values we are expecting the network to output*. Here, we are expecting to get the number '5'.
Firstly, we will create a list of _expected_ values, i.e. _values we are
expecting the network to output_. Here, we are expecting to get the number '5'.

```dart
final expected = [
@@ -45,9 +53,12 @@ final expected = [
];
```
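
The body of the list is collapsed in this diff. Conceptually there is one
single-element output per digit pattern, kept low for everything except the
entry for '5'; the sketch below is a hypothetical illustration only, since only
the `[0.01]` entries are visible here and the high value is assumed:

```dart
// Hypothetical illustration of the expected outputs: one value per digit
// pattern (0-9), low everywhere except for '5'. The 0.99 is an assumption;
// the actual list is collapsed in this diff.
final expected = [
  [0.01], // 0
  [0.01], // 1
  [0.01], // 2
  [0.01], // 3
  [0.01], // 4
  [0.99], // 5 - the digit we want the network to recognise
  [0.01], // 6
  [0.01], // 7
  [0.01], // 8
  [0.01], // 9
];
```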

Fantastic, we are now expecting our infantile network to magically output a number 5, not having taught it a thing. Oh, right - that's where the training data part comes in!
Fantastic, we are now expecting our infantile network to magically output a
number 5, not having taught it a thing. Oh, right - that's where the training
data part comes in!

We must now tell the network what each of our expected output values is associated with. Let's teach it some numbers:
We must now tell the network what each of our expected output values is
associated with. Let's teach it some numbers:

```dart
final trainingData = [
@@ -64,15 +75,19 @@ final trainingData = [
];
```
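
The patterns themselves are collapsed here, but the refactored
`example/example.dart` further down shows their shape: each digit is a
15-character string of 0s and 1s (presumably a 3x5 pixel grid) converted into a
`List<double>`. A small sketch of that conversion, reusing the '5' pattern that
appears verbatim in the example:

```dart
// Each digit is a 3x5 pixel grid flattened into a 15-character string of
// 0s and 1s, then parsed into a List<double>, as in example/example.dart.
final five = '111100111001111'.split('').map(double.parse).toList();
// five == [1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0]
```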

Now that we granted our network a grand total of 10 numbers to learn, we can begin training the network using the values we've set up:
Now that we granted our network a grand total of 10 numbers to learn, we can
begin training the network using the values we've set up:

```dart
network.train(inputs: trainingData, expected: expected, iterations: 5000);
```

Wonderful! We've trained our network using the pixel representation of number images, and our network is now able to recognise the number '5' with relative confidence. The last step is to test our network's capabilities ourselves.
Wonderful! We've trained our network using the pixel representation of number
images, and our network is now able to recognise the number '5' with relative
confidence. The last step is to test our network's capabilities ourselves.

Let's give our network a couple of pixel representations of distorted images of the number '5':
Let's give our network a couple of pixel representations of distorted images of
the number '5':

```dart
final testData = [
@@ -91,4 +106,4 @@ To check the confidence of the network in recognising distorted '5's:
for (final test in testData) {
print('Confidence in recognising a distorted 5: ${network.process(test)}');
}
```
```
43 changes: 21 additions & 22 deletions example/example.dart
@@ -2,24 +2,22 @@ import 'package:synadart/src/layers/core/dense.dart';
import 'package:synadart/synadart.dart';

void main() {
final network = Sequential(learningRate: 0.2);

network.addLayer(Dense(
size: 15,
activation: ActivationAlgorithm.sigmoid,
));
final network = Sequential(learningRate: 0.2, layers: [
Dense(
size: 15,
activation: ActivationAlgorithm.sigmoid,
),
Dense(
size: 5,
activation: ActivationAlgorithm.sigmoid,
),
Dense(
size: 1,
activation: ActivationAlgorithm.sigmoid,
)
]);

network.addLayer(Dense(
size: 5,
activation: ActivationAlgorithm.sigmoid,
));

network.addLayer(Dense(
size: 1,
activation: ActivationAlgorithm.sigmoid,
));

// We are expecting to get the number '5'
// We are expecting to get the number '5'.
final expected = [
[0.01],
[0.01],
@@ -33,21 +31,22 @@ void main() {
[0.01],
];

// Training data contains different number patterns
// Training data contains different number patterns.
final trainingData = [
'111101101101111'.split('').map(double.parse).toList(),
'001001001001001'.split('').map(double.parse).toList(),
'111001111100111'.split('').map(double.parse).toList(),
'111001111001111'.split('').map(double.parse).toList(),
'101101111001001'.split('').map(double.parse).toList(),
'111100111001111'.split('').map(double.parse).toList(), // This is the number 5
// This is the number 5
'111100111001111'.split('').map(double.parse).toList(),
'111100111101111'.split('').map(double.parse).toList(),
'111001001001001'.split('').map(double.parse).toList(),
'111101111101111'.split('').map(double.parse).toList(),
'111101111001111'.split('').map(double.parse).toList(),
];

// Test data which contains distorted patterns of the number 5
// Test data which contains distorted patterns of the number 5.
final testData = [
'111100111000111'.split('').map(double.parse).toList(),
'111100010001111'.split('').map(double.parse).toList(),
@@ -57,10 +56,10 @@ void main() {
'111100101001111'.split('').map(double.parse).toList(),
];

// The number 5 itself
// The number 5 itself.
final numberFive = trainingData[5];

// Train the network using the training and expected data
// Train the network using the training and expected data.
network.train(inputs: trainingData, expected: expected, iterations: 5000);

print('Confidence in recognising a 5: ${network.process(numberFive)}');
