Victor Alando


K-Nearest Neighbor (K-NN) Algorithm for Machine Learning

Introduction

In this tutorial, you will learn how the K-Nearest Neighbors (K-NN) algorithm is applied in Machine Learning models, particularly for classification.

What is K-Nearest Neighbor (K-NN)?

  • K-Nearest Neighbor is one of the simplest Machine Learning algorithms, based on the supervised learning technique.

  • K-NN assumes similarity between the new case and the available cases, and puts the new case into the category that is most similar to the available categories.

  • The K-NN algorithm stores all the available data and classifies a new data point based on similarity. This means that when a new data point appears, it can be easily classified into a well-suited category using the K-NN algorithm.

  • The K-NN algorithm can be used for regression as well as for classification problems.

  • K-NN is a non-parametric algorithm: it makes no assumptions about the underlying data. It is also a lazy learner, because it does not learn from the training set immediately; instead it stores the dataset and, at the time of classification, performs its computation on that dataset.

  • At the training phase, the K-NN algorithm simply stores the dataset; when it receives new data, it classifies that data into the category it is most similar to (a minimal code sketch follows this list).

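For concreteness, here is a minimal sketch of K-NN classification. It assumes scikit-learn is installed; the feature values, labels, and the choice of k = 3 are invented purely for illustration.

```python
# A minimal sketch of K-NN classification, assuming scikit-learn is available.
# The feature values, labels, and k=3 below are illustrative only.
from sklearn.neighbors import KNeighborsClassifier

# Stored examples: each row is a data point, each label its category.
X_train = [[1.0, 1.1], [1.2, 0.9], [3.0, 3.2], [3.1, 2.9]]
y_train = ["A", "A", "B", "B"]

# "Training" only stores the dataset; no parameters are learned up front.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)

# A new point is classified by the majority label of its 3 nearest neighbors.
new_point = [[2.8, 3.0]]
print(knn.predict(new_point))  # -> ['B']
```
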
Example
Suppose we have an image of a creature that looks similar to both a Cheetah and a Leopard, but we want to know whether it is a Cheetah or a Leopard. For this identification, we can use the K-NN algorithm, as it works on a similarity measure.

Our K-NN model will find the features of the new image that are similar to those of the Cheetah and Leopard images, and based on the most similar features it will put the new image into either the Cheetah or the Leopard category.
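To make this example concrete, here is a hedged sketch in which each image is assumed to have already been reduced to a small feature vector (say, spot size and spot density); the feature values, labels, and the new creature's features are invented purely for illustration, and the neighbor search is written out by hand to show the similarity measure at work.

```python
# Illustrative Cheetah-vs-Leopard sketch with K-NN written from scratch.
# Each image is assumed to be reduced to a feature vector; the two features
# (spot size, spot density) and all numeric values are invented.
from collections import Counter
import math

# Stored examples: (features, label). In practice these would come from real images.
training_data = [
    ([0.20, 0.90], "Cheetah"),   # small solid spots, high density
    ([0.25, 0.85], "Cheetah"),
    ([0.70, 0.50], "Leopard"),   # larger rosette markings, lower density
    ([0.65, 0.55], "Leopard"),
]

def euclidean(a, b):
    """Similarity measure: a smaller distance means a more similar example."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(new_point, data, k=3):
    """Classify new_point by the majority label among its k nearest neighbors."""
    neighbors = sorted(data, key=lambda item: euclidean(new_point, item[0]))[:k]
    labels = [label for _, label in neighbors]
    return Counter(labels).most_common(1)[0][0]

# The new creature's features sit closer to the stored Cheetah examples.
print(knn_predict([0.30, 0.80], training_data))  # -> Cheetah
```
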

Why do we need a K-NN Algorithm?
