Creating a recommender system in 10 lines of JavaScript

I've been dabbling in machine learning algorithms for JavaScript, coming from a background in front-end development.

AI and machine learning can get complex very quickly, but I've found that some basic algorithms, such as a decision tree or nearest neighbour, are easy to grasp and can still be surprisingly powerful.

In this super short tutorial we'll use K-Nearest Neighbours (KNN) to find people who have similar interests to you.

KNN is basically the Pythagorean theorem. It compares a point in space to several other points and tells you which ones are closest.

Check this CodePen to see an example.

The "K" in KNN stands for the amount of points you want to check. If we use a K of 1 we just get the point closest to us.

Using the X,Y axes as data

The trick of using KNN is that we can use the X and Y axes to represent any kind of data. If we want to plot cats and dogs, we could draw their body weight on the X axis, and their ear length on the Y axis.

If we draw an unknown animal in this same plot, it should be easy to see whether it's a cat or a dog, just by checking which animals are closest to it.
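
Here's a rough sketch of that idea in plain JavaScript, with made-up numbers for the weights and ear lengths:

// made-up measurements: body weight on one axis, ear length on the other
const animals = [
    { weight: 4, earLength: 7, label: 'cat' },
    { weight: 5, earLength: 8, label: 'cat' },
    { weight: 25, earLength: 12, label: 'dog' },
    { weight: 30, earLength: 14, label: 'dog' },
]

const distance = (a, b) =>
    Math.sqrt((a.weight - b.weight) ** 2 + (a.earLength - b.earLength) ** 2)

// classify an unknown animal by copying the label of its nearest neighbour (K = 1)
const unknown = { weight: 6, earLength: 9 }
const nearest = animals.reduce((best, animal) =>
    distance(animal, unknown) < distance(best, unknown) ? animal : best
)
console.log(nearest.label) // 'cat'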

Multidimensional madness

From the examples above you might get the impression that good old Pythagoras was just a two-dimensional character. But fascinatingly, his formula doesn't care how many dimensions there are.

We can add a third dimension and still draw it in a graph. But we could add many more, and the math still works.
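
As a sketch, the same formula generalises to any number of features: square the difference per feature, add them up, and take the square root.

// Euclidean distance for any number of dimensions
const distance = (a, b) =>
    Math.sqrt(a.reduce((sum, value, i) => sum + (value - b[i]) ** 2, 0))

console.log(distance([0, 0], [3, 4]))       // 5 (two dimensions)
console.log(distance([1, 2, 3], [4, 6, 3])) // 5 (three dimensions)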

In the example of pets, we could add many more pet features to find out what kind of pet we're dealing with.

In pseudo code it could look like this:

// every feature is one dimension of the point
let pet = { ears: 2, weight: 92, height: 14, speed: 93 }
let prediction = knn.predict(pet)
// returns: "Jaguar"

KNN gives us the ability to classify data by treating every feature as a coordinate in space, and then simply comparing distances.

How about those 10 lines of JavaScript?

Let's use this algorithm to create a recommendation system. We'll find the person whose interests are most similar to our own.

We will use this 9-year-old GitHub repo, which still seems to work! Pythagoras himself is a lot older, so I don't see any problem here.

npm install knear

We are going to teach the algorithm that every point is a unique person, and that all the features of that point are that person's interests. Then, by finding the closest person, you can find your match.

import knn from 'knear'

// every person is a point; their interests are the coordinates
const data = [
    { cooking: 1, painting: 10, name: 'erik' },
    { cooking: 0, painting: 1, name: 'bob' },
    { cooking: 10, painting: 1, name: 'ellen' },
    { cooking: 4, painting: 6, name: 'jill' },
    { cooking: 3, painting: 8, name: 'ramon' },
]

// K = 1: we only want the single closest match
const machine = new knn.kNear(1)
for (let d of data) {
    machine.learn([d.cooking, d.painting], d.name)
}

You can find your match:

// pass your own scores in the same order: [cooking, painting]
const prediction = machine.classify([1, 9])
console.log(`Your closest match is ${prediction}`)

In this example we only scored cooking and painting, but you can add as many features as you want, as long as you keep them in the same order when learning and classifying.
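
As an illustration, a version with a few extra, completely made-up interests could look like this. The feature order (here: cooking, painting, gaming, hiking) just has to be the same for learning and classifying.

import knn from 'knear'

// hypothetical feature order: [cooking, painting, gaming, hiking]
const people = [
    { name: 'erik', features: [1, 10, 3, 7] },
    { name: 'bob', features: [0, 1, 9, 2] },
    { name: 'ellen', features: [10, 1, 2, 8] },
]

const machine = new knn.kNear(1)
for (let person of people) {
    machine.learn(person.features, person.name)
}

// your own scores, in the same feature order
console.log(machine.classify([2, 9, 4, 6])) // 'erik' is the closest match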

Thanks for reading!

P.S. Doesn't it boggle your mind that they already had a machine learning algorithm in 500 BC???
