Welcome back to our tutorial series!
Remember how in the previous tutorial we taught a network to perform XOR operations? In this tutorial, we will teach it to predict the house prices of San Francisco.
To predict the house prices of San Francisco, we need these parameters:
- longitude
- latitude
- housing_median_age
- total_rooms
- total_bedrooms
- population
- households
- median_income
- ocean_proximity
We will be providing these to the network as a JavaScript object, so the network can easily understand it. A sample data point for a time period would look like:
{"inputs":[0.1769999999999996,0.5663157894736844,0.7843137254901961,0.08932295640673484,0.06621146285950755,0.05960555695694012,0.082223318533136,0.5396689655172414,0.75],"target":[0.9022663824066705]}
Here we give the network the nine parameters and provide a sample target, which is the median house price for that time period.
You can access the dataset we'll use here.
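Notice that every value in the sample is scaled into the 0 to 1 range. The exact preprocessing used to build the dataset isn't shown in this tutorial, but as a hedged illustration, min-max normalization produces values like these:
// Hypothetical min-max normalization, assuming the min/max of each feature is known
function normalize(value, min, max) {
    return (value - min) / (max - min);
}
// Example: a raw median_income of 5.0 with an observed range of [0.5, 15.0]
console.log(normalize(5.0, 0.5, 15.0)); // ~0.31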
What you will need
- Node.js
- A good computer with more than 2 GB of RAM and a good CPU
Getting started!
Setting up the basic components
Environment
First, we need to set up our environment.
We need to install Dann.js so we can use it in our programs. Run this command in your terminal after switching to the project folder:
npm install dannjs
Main file
As you learned in the previous tutorial, we start using Dann.js in our JavaScript program by typing:
const Dannjs = require('dannjs');
const Dann = Dannjs.dann;
Now let's initialize the network by constructing a new Dann instance with the arguments (9, 1). Why are we using 9 input neurons and 1 output neuron? Because we are feeding the network 9 parameters for the housing status, and the output is only one value, so we require only 1 output neuron. So we assign only the number of neurons needed, which is (9, 1):
const nn = new Dann(9,1);
Setting up the dataset
Download the dataset from GitHub here. Save it in the project directory as dataset.js. We will be using it in our main file.
Import the dataset into the main file. We name the variable houses, since that is how the training code below refers to it:
const houses = require("./dataset").houses;
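For reference, here is a minimal sketch of the shape dataset.js is assumed to export. This is inferred from how the data is used below; the real file contains many more samples:
// dataset.js — hedged sketch of the expected shape
module.exports = {
    houses: {
        data: [
            {
                inputs: [0.177, 0.566, 0.784, 0.089, 0.066, 0.060, 0.082, 0.540, 0.75],
                target: [0.902]
            }
            // ...many more samples
        ]
    }
};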
Setting up the hidden layers
I have found that 3 hidden layers work well. You can experiment with other values:
nn.addHiddenLayer(8,'leakyReLU');
nn.addHiddenLayer(8,'leakyReLU');
nn.addHiddenLayer(6,'tanH');
We also set the output activation to sigmoid, since our normalized targets fall between 0 and 1, and we set the loss function to mae. The MAE (mean absolute error) loss function is defined as the mean of the absolute differences between our target and predicted values. You can read more about it [here](https://heartbeat.fritz.ai/5-regression-loss-functions-all-machine-learners-should-know-4fb140e9d4b0).
nn.outputActivation('sigmoid');
nn.setLossFunction('mae');
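For intuition, here is a minimal sketch of MAE in plain JavaScript. This is not part of the tutorial code; Dann.js computes it internally when you call backpropagate:
// Mean absolute error: the average of |target - prediction|
function mae(targets, predictions) {
    let sum = 0;
    for (let i = 0; i < targets.length; i++) {
        sum += Math.abs(targets[i] - predictions[i]);
    }
    return sum / targets.length;
}
console.log(mae([0.9], [0.7])); // 0.2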
Now we finally initialize the weights:
nn.makeWeights();
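If you want to double-check the architecture before training, Dann.js models provide a log method that prints a summary of the layers, activations, and loss function:
// Print a summary of the model's structure and settings
nn.log();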
Training the model
We train the model on the dataset the traditional way, i.e. with backpropagation. In this method, we train for n epochs by feeding the data manually to the network using the .backpropagate method.
// Stats we track across epochs
let losses = [];
let accuracies = [];
nn.epoch = 0; // custom epoch counter stored on the model
function train(epoch = 1, logs = true) {
    // Save dataset length
    let len = houses.data.length;
    // Train multiple epochs
    for (let e = 0; e < epoch; e++) {
        // Log epochs
        if (logs == true) {
            console.log("Epoch " + nn.epoch);
        }
        // Train 1 epoch
        let sum = 0;
        for (let i = 0; i < len; i++) {
            nn.backpropagate(
                houses.data[i].inputs,
                houses.data[i].target
            );
            sum += nn.loss;
        }
        // Save average epoch loss
        losses.push(sum / len);
        // Save epoch's accuracy with the testing dataset
        // (test() is defined below)
        let result = test(logs);
        accuracies.push(result);
        // Increment the model's epoch counter
        nn.epoch++;
    }
}
train(1000);
Here we move away from the out-of-the-box training methods and wrap the training in a callable function, then call it.
We also save the per-epoch loss and accuracy, since we need them to view how the model evolves over long training runs.
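The training loop above calls a test() helper that isn't shown in the original code. Here is a minimal, hedged sketch of what it could look like, assuming a held-out houses.test array shaped like houses.data; both the test property and the accuracy measure are assumptions for illustration:
// Hypothetical test() helper: average absolute error on held-out data
function test(logs = false) {
    let len = houses.test.length;
    let sum = 0;
    for (let i = 0; i < len; i++) {
        // feedForward returns an array of output values
        let out = nn.feedForward(houses.test[i].inputs);
        sum += Math.abs(houses.test[i].target[0] - out[0]);
    }
    let avgError = sum / len;
    if (logs) {
        console.log("Average test error: " + avgError);
    }
    // Treat 1 minus the average error as a rough accuracy score
    return 1 - avgError;
}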
Finishing up
Now you can test the model using the .feedForward method.
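For example, here is a quick sanity check on the first training sample. The { log: true } option just prints the prediction and is optional:
// Feed one sample through the trained network
const sample = houses.data[0];
const prediction = nn.feedForward(sample.inputs, { log: true });
console.log("Predicted:", prediction[0], "Actual:", sample.target[0]);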
Happy neural-networking!
Top comments (3)
Thank you for this breakdown.
I'd like to ask what this line does:
nn.outputActivation('sigmoid')
It sets the activation function of the output layer to sigmoid. By default, activation functions are set to sigmoid, so this doesn't change anything other than confirming we have an output activation set to sigmoid.
Thank you.