Dann.js - Making a Neural Network solve XOR problems!

As you may have read in the previous tutorial, Dann.js is an npm module for Node.js which allows you to build neural networks easily. You can read the previous tutorial here.

In this tutorial, we will build a new neural network which will solve XOR problems.

XOR is a logical operation that is true if and only if its arguments differ (one is true, the other is false).

So basically, it is an OR gate with the condition that it outputs true only when exactly one of the two inputs is true. You can read more about XOR here.

XOR has this truth table (a table which summarizes which inputs produce which output):

1st Bit         2nd Bit         Output
0               0               0
0               1               1
1               0               1
1               1               0
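
You can sanity-check this truth table in plain JavaScript with the bitwise XOR operator (^):

// Verify the XOR truth table using JavaScript's bitwise XOR operator
for (const [a, b] of [[0, 0], [0, 1], [1, 0], [1, 1]]) {
    console.log(`${a} XOR ${b} = ${a ^ b}`);
}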

What you will need

  • Node.js
  • A computer with more than 2GB of RAM and a good CPU

Getting started

Setup

Install Dann.js into your environment if you haven't already by running:

npm i dannjs

As you have learnt in the previous tutorial, we start using Dann.js in our JavaScript program by typing:

const Dannjs = require('dannjs');
const Dann = Dannjs.dann;

To train the network to perform XOR operations, we import the xor dataset from Dann.js:

const xor = Dannjs.xor;

Now let's initialize the network by constructing a Dann with the arguments 2,1. Why are we using 2 input neurons and 1 output neuron? Because an XOR operation requires 2 input bits and outputs a single bit, so we assign only the number of neurons needed, which is (2,1).

const xorDann = new Dann(2,1);

Setting up the calculations

Setting up the hidden layers

As you may have read in the previous tutorial, a hidden layer is essentially a neuron layer that performs calculations. The name 'hidden' comes from the fact that you don't need to see the values of its neurons, in contrast to the input/output layers. You can learn more about hidden layers & the basics surrounding them here.

Here we are setting up a hidden layer using the .addHiddenLayer method, which takes the number of neurons to assign to the layer as an argument; we are taking the neuron count to be 12. You can change the number as you like, but I have found this to be the most stable.

xorDann.addHiddenLayer(12);

Now, to initialize the weights between the layers, we call

xorDann.makeWeights();
Testing the network

Testing is essential in anything, isn't it? Here we will test our network to inspect its stats and catch any errors.

Since we have not trained it yet, we will simply log its details.

xorDann.log();

In my case, it outputs:

Dann NeuralNetwork:
  Layers:
    Input Layer:   2
    hidden Layer: 12  (sigmoid)
    output Layer: 1  (sigmoid)
  Other Values:
    Learning rate: 0.001
    Loss Function: mse
    Current Epoch: 0
    Latest Loss: 0

If it outputs something similar, go on.
Comment out the line by prepending // to it.

Training the network

Our network doesn't know anything yet. Throw it a value and it will give back a random result.
But how do we train it?
We don't have a dataset!

You remember the Dannjs.xor we imported? It is an XOR dataset that holds the training values needed to teach the network XOR.

So we set up a for loop to train, i.e., backpropagate the data through the network.

// one pass over the XOR dataset
for (const data of xor) {
    xorDann.backpropagate(data.input, data.output);
}

Running the network now and feeding it 0,0 should output 0, shouldn't it?
Let's feed it:

xorDann.feedForward([0,0],{log:true});

Let us run the network and see what happens:

Dann NeuralNetwork:
  Layers:
    Input Layer:   2
    hidden Layer: 12  (sigmoid)
    output Layer: 1  (sigmoid)
  Other Values:
    Learning rate: 0.001
    Loss Function: mse
    Current Epoch: 0
    Latest Loss: 0
Prediction:
[0.416897070979890]

The output may be different on yours. You might say, 0.4 is not even close to 0! Yes, you are right. We have trained this network only once, and like a newborn child it will make mistakes. So why not train it, say, 100000 times?
Let's train it:

// 100000 passes over the XOR dataset
for (let i = 0; i < 100000; i++) {
    for (const data of xor) {
        xorDann.backpropagate(data.input, data.output);
    }
}

Now let's run the network:

xorDann.feedForward([0,0],{log:true});

And in my case, the new output is:

Dann NeuralNetwork:
  Layers:
    Input Layer:   2
    hidden Layer: 12  (sigmoid)
    output Layer: 1  (sigmoid)
  Other Values:
    Learning rate: 0.001
    Loss Function: mse
    Current Epoch: 0
    Latest Loss: 0
Prediction:
[0.0224234234324]

After running it about 10 times, the output became:

Dann NeuralNetwork:
  Layers:
    Input Layer:   2
    hidden Layer: 12  (sigmoid)
    output Layer: 1  (sigmoid)
  Other Values:
    Learning rate: 0.001
    Loss Function: mse
    Current Epoch: 0
    Latest Loss: 0
Prediction:
[0.0044234234324]

Pretty close, right?
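
Since the output comes from a sigmoid activation, it is a float between 0 and 1 rather than a clean bit. To get a binary answer, you can round the prediction. A minimal sketch, assuming feedForward also returns the prediction array when called without {log:true}:

// Round the sigmoid output to a clean 0/1 bit.
// Assumes feedForward returns the prediction array.
const prediction = xorDann.feedForward([0, 0]);
console.log(Math.round(prediction[0])); // expected: 0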

Finishing up

You can experiment with your own gate; for your reference, the xor dataset we imported is actually:

[
    { input: [ 1, 0 ], output: [1] },
    { input: [ 0, 1 ], output: [1] },
    { input: [ 1, 1 ], output: [0] },
    { input: [ 0, 0 ], output: [0] }
]

You can modify this dataset to make your own gate!
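
For example, here is a sketch of a hand-written dataset for an AND gate, using the same { input, output } shape (the and and andDann names are just for illustration); you would train on it exactly as we did with xor:

// A hypothetical AND gate dataset in the same format as Dannjs.xor
const and = [
    { input: [0, 0], output: [0] },
    { input: [0, 1], output: [0] },
    { input: [1, 0], output: [0] },
    { input: [1, 1], output: [1] }
];

// Train a fresh network on it the same way
const andDann = new Dann(2,1);
andDann.addHiddenLayer(12);
andDann.makeWeights();
for (let i = 0; i < 100000; i++) {
    for (const data of and) {
        andDann.backpropagate(data.input, data.output);
    }
}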

The whole code used in this tutorial is:

const Dannjs = require('dannjs');
const Dann = Dannjs.dann;
const xor = Dannjs.xor; // dataset; you can change it
const xorDann = new Dann(2,1);

xorDann.addHiddenLayer(12);
xorDann.makeWeights();

// training the network
for (let i = 0; i < 100000; i++) {
    for (const data of xor) {
        xorDann.backpropagate(data.input, data.output);
    }
}

// running it
xorDann.feedForward([0,0],{log:true});

You can experiment with different values and see what you get!
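
For instance, a quick sketch to check all four input combinations at once:

// Feed every XOR input combination and log the predictions
for (const input of [[0, 0], [0, 1], [1, 0], [1, 1]]) {
    xorDann.feedForward(input, { log: true });
}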
