K-Nearest Neighbors Algorithm Solved Example in Machine Learning

The K-Nearest Neighbors algorithm is an instance-based supervised machine learning algorithm. It is also known as a lazy learner because it delays the learning process until a new example arrives.

In this tutorial, we will understand how to apply the k-nearest neighbors algorithm to classify a new example.

Problem Definition:

“Restaurant A” sells burgers with optional flavors: Pepper, Ginger, and Chilly.

Every day this week you have tried one burger (A to E) and kept a record of whether you liked it.

Using the Hamming distance, show how the 3NN classifier with majority voting would classify the new example {Pepper: False, Ginger: True, Chilly: True}.


Training Examples:

Example | Pepper | Ginger | Chilly | Liked
A       | True   | True   | True   | False
B       | True   | False  | False  | True
C       | False  | True   | True   | False
D       | False  | True   | False  | True
E       | True   | False  | False  | True

Solution:

The training examples contain three attributes: Pepper, Ginger, and Chilly. Each attribute takes either True or False as its value. Liked is the target, which also takes either True or False.

In the k-nearest neighbors algorithm, we first calculate the distance between the new example and each training example. Using these distances, we then find the k nearest neighbors among the training examples.

Common distance measures such as the Euclidean distance require the attribute values to be real numbers. In our case, however, the dataset contains categorical values. Hence we use the Hamming distance to measure the distance between the new example and the training examples.

Let x1 and x2 be the values of one attribute in two instances.

Under the Hamming distance, if the categorical values match (x1 is the same as x2), the distance is 0; otherwise it is 1. The total distance between two instances is the sum of these per-attribute distances.

For example,

If the value of x1 is blue and x2 is also blue then the distance between x1 and x2 is 0.

If the value of x1 is blue and x2 is red then the distance between x1 and x2 is 1.
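As a quick sketch of this rule in Python (the dictionary representation below is just one convenient choice for illustration, not part of the original problem), the per-attribute comparisons can be summed in a few lines:

```python
def hamming_distance(x1, x2):
    """Count the attributes on which two examples disagree."""
    return sum(1 for attribute in x1 if x1[attribute] != x2[attribute])

# Matching values contribute 0, mismatching values contribute 1.
print(hamming_distance({"colour": "blue"}, {"colour": "blue"}))  # 0
print(hamming_distance({"colour": "blue"}, {"colour": "red"}))   # 1
```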

The following table shows the distance between the new example and each training example, calculated using the Hamming distance. Each term in the sum compares one attribute (Pepper, Ginger, Chilly); the Liked column is the target and is not used in the distance.

Example | Pepper | Ginger | Chilly | Liked | Distance
A       | True   | True   | True   | False | 1 + 0 + 0 = 1
B       | True   | False  | False  | True  | 1 + 1 + 1 = 3
C       | False  | True   | True   | False | 0 + 0 + 0 = 0
D       | False  | True   | False  | True  | 0 + 0 + 1 = 1
E       | True   | False  | False  | True  | 1 + 1 + 1 = 3
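A minimal sketch of this step in Python (the training set is encoded as dictionaries purely for illustration; the hamming_distance function is repeated so the snippet runs on its own):

```python
def hamming_distance(x1, x2):
    """Count the attributes on which two examples disagree."""
    return sum(1 for attribute in x1 if x1[attribute] != x2[attribute])

# Training examples A-E: feature values plus the Liked target.
training = {
    "A": ({"Pepper": True,  "Ginger": True,  "Chilly": True},  False),
    "B": ({"Pepper": True,  "Ginger": False, "Chilly": False}, True),
    "C": ({"Pepper": False, "Ginger": True,  "Chilly": True},  False),
    "D": ({"Pepper": False, "Ginger": True,  "Chilly": False}, True),
    "E": ({"Pepper": True,  "Ginger": False, "Chilly": False}, True),
}

new_example = {"Pepper": False, "Ginger": True, "Chilly": True}

# Hamming distance from the new example to every training example.
distances = {name: hamming_distance(features, new_example)
             for name, (features, liked) in training.items()}
print(distances)  # {'A': 1, 'B': 3, 'C': 0, 'D': 1, 'E': 3}
```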

Next, based on the distances, we find the 3 nearest neighbors (3NN). Their ranks are marked in the last column: C (distance 0) is the nearest, and A and D (distance 1 each) are tied for second.

Example | Pepper | Ginger | Chilly | Liked | Distance | 3NN Rank
A       | True   | True   | True   | False | 1        | 2
B       | True   | False  | False  | True  | 3        | -
C       | False  | True   | True   | False | 0        | 1
D       | False  | True   | False  | True  | 1        | 2
E       | True   | False  | False  | True  | 3        | -

Finally, majority voting is used to assign the classification label to the new example. Among the three nearest neighbors, two (A and C) have Liked = False and one (D) has Liked = True. Hence the new example is classified as False: the burger is predicted as not liked.
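Putting both steps together, a minimal 3NN classifier with majority voting might look like the following sketch (it continues with the training and distances dictionaries from the previous snippet):

```python
from collections import Counter

k = 3

# Sort the training examples by distance and keep the k nearest.
nearest = sorted(training, key=lambda name: distances[name])[:k]
print(nearest)  # ['C', 'A', 'D'] (A and D are tied, so their order may vary)

# Majority vote over the Liked labels of the k nearest neighbors.
votes = Counter(training[name][1] for name in nearest)
prediction = votes.most_common(1)[0][0]
print(prediction)  # False -> the new burger is predicted as not liked
```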


Summary

This tutorial worked through a solved example of the K-Nearest Neighbors algorithm in machine learning, using the Hamming distance for categorical attributes and majority voting to classify a new example.
