
AND GATE Perceptron Training Rule – Artificial Neural Network in Machine Learning – 17CS73


The truth table of the AND logic gate is:

A | B | A AND B
0 | 0 |    0
0 | 1 |    0
1 | 0 |    0
1 | 1 |    1

We are given the initial weights w1 = 1.2 and w2 = 0.6, with threshold = 1 and learning rate η = 0.5.
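The perceptron's decision on each instance can be sketched in Python (a minimal illustration of the rule used in this tutorial; the function name is my own):

```python
# A threshold perceptron fires (outputs 1) only when the weighted sum
# of its inputs strictly exceeds the threshold.
def perceptron_output(x1, x2, w1, w2, threshold):
    weighted_sum = w1 * x1 + w2 * x2
    return 1 if weighted_sum > threshold else 0

# Instance 1 (A=0, B=0) with the initial weights:
print(perceptron_output(0, 0, 1.2, 0.6, 1))  # 0
```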

For Training Instance 1: A=0, B=0 and Target = 0

wi.xi = 0*1.2 + 0*0.6 = 0

This is not greater than the threshold of 1, so the output = 0. The target matches the calculated output, so no weight update is needed.

For Training Instance 2: A=0, B=1 and Target = 0

wi.xi = 0*1.2 + 1*0.6 = 0.6

This is not greater than the threshold of 1, so the output = 0. The target matches the calculated output, so no weight update is needed.

For Training Instance 3: A=1, B=0 and Target = 0

wi.xi = 1*1.2 + 0*0.6 = 1.2

This is greater than the threshold of 1, so the output = 1. The target does not match the calculated output.

Hence we need to update the weights using the perceptron training rule:

wi = wi + η(t − o)xi

where t is the target, o is the calculated output, and xi is the corresponding input. For this instance t = 0, o = 1, x1 = 1, and x2 = 0:

w1 = 1.2 + 0.5 * (0 − 1) * 1 = 1.2 − 0.5 = 0.7
w2 = 0.6 + 0.5 * (0 − 1) * 0 = 0.6

After updating, the weights are w1 = 0.7 and w2 = 0.6, with threshold = 1 and learning rate η = 0.5.
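The same update can be computed in Python (a small sketch; the helper name is my own):

```python
# Perceptron training rule: w_i <- w_i + eta * (target - output) * x_i
def update_weights(weights, inputs, target, output, eta):
    return [w + eta * (target - output) * x
            for w, x in zip(weights, inputs)]

# Misclassified instance 3 (A=1, B=0): target = 0 but output = 1
new_weights = update_weights([1.2, 0.6], [1, 0], target=0, output=1, eta=0.5)
print(new_weights)  # [0.7, 0.6]
```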


With the updated weights w1 = 0.7 and w2 = 0.6 (threshold = 1, learning rate η = 0.5), we repeat the pass over all four training instances:

For Training Instance 1: A=0, B=0 and Target = 0

wi.xi = 0*0.7 + 0*0.6 = 0

This is not greater than the threshold of 1, so the output = 0. The target matches the calculated output.

For Training Instance 2: A=0, B=1 and Target = 0

wi.xi = 0*0.7 + 1*0.6 = 0.6

This is not greater than the threshold of 1, so the output = 0. The target matches the calculated output.

For Training Instance 3: A=1, B=0 and Target = 0

wi.xi = 1*0.7 + 0*0.6 = 0.7

This is not greater than the threshold of 1, so the output = 0. The target matches the calculated output.

For Training Instance 4: A=1, B=1 and Target = 1

wi.xi = 1*0.7 + 1*0.6 = 1.3

This is greater than the threshold of 1, so the output = 1. The target matches the calculated output.

All four training instances are now classified correctly, so training stops. Hence the final weights are w1 = 0.7 and w2 = 0.6, with threshold = 1 and learning rate η = 0.5.
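The whole procedure above can be written as a short training loop (an illustrative sketch; the function name and epoch limit are my own choices):

```python
# Train a threshold perceptron on the AND gate with the perceptron
# training rule, starting from the tutorial's initial weights.
def train_and_gate(weights, threshold=1.0, eta=0.5, max_epochs=100):
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    for _ in range(max_epochs):
        errors = 0
        for (x1, x2), target in data:
            output = 1 if weights[0] * x1 + weights[1] * x2 > threshold else 0
            if output != target:
                errors += 1
                weights[0] += eta * (target - output) * x1
                weights[1] += eta * (target - output) * x2
        if errors == 0:  # converged: every instance classified correctly
            break
    return weights

print(train_and_gate([1.2, 0.6]))  # [0.7, 0.6]
```

Starting from w1 = 1.2, w2 = 0.6, only instance 3 triggers an update in the first pass, and the second pass classifies everything correctly, matching the trace worked out above.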


Summary

This tutorial discussed the AND gate perceptron training rule in machine learning. If you found it helpful, share it with your friends, and follow the Facebook page for regular updates and the YouTube channel for video tutorials.
