Iris Plant Classification: Back-Propagation Algorithm

[Figure: scatter plot of the Iris data set]

% ===================
% Filename: Iris_bp.m
% ===================

echo off

disp(' =====================================================')
disp(' Iris plant classification: back-propagation algorithm')
disp(' =====================================================')

disp(' ============================================================================')
disp(' Reference: Negnevitsky, M., "Artificial Intelligence: A Guide to Intelligent')
disp('            Systems", 3rd edn, Addison Wesley, Harlow, England, 2011,')
disp('            p. 332, Classification neural network with back-propagation')
disp(' Source: GitHub - S. Mathieu. Edited by Srinath Krishnamoorthy')
disp(' ============================================================================')

disp(' ===================================================================================')
disp(' Problem: The Iris plant data set contains 3 classes, and each class is represented')
disp('          by 50 plants. A plant is characterised by its sepal length, sepal width,')
disp('          petal length and petal width. A three-layer back-propagation network is')
disp('          required to classify Iris plants.')
disp(' ===================================================================================')

Iris_data= [5.1 3.5 1.4 0.2 % Iris-setosa
4.9 3.0 1.4 0.2 % Iris-setosa
4.7 3.2 1.3 0.2 % Iris-setosa
4.6 3.1 1.5 0.2 % Iris-setosa
5.0 3.6 1.4 0.2 % Iris-setosa
5.4 3.9 1.7 0.4 % Iris-setosa
4.6 3.4 1.4 0.3 % Iris-setosa
5.0 3.4 1.5 0.2 % Iris-setosa
4.4 2.9 1.4 0.2 % Iris-setosa
4.9 3.1 1.5 0.1 % Iris-setosa
5.4 3.7 1.5 0.2 % Iris-setosa
4.8 3.4 1.6 0.2 % Iris-setosa
4.8 3.0 1.4 0.1 % Iris-setosa
4.3 3.0 1.1 0.1 % Iris-setosa
5.8 4.0 1.2 0.2 % Iris-setosa
5.7 4.4 1.5 0.4 % Iris-setosa
5.4 3.9 1.3 0.4 % Iris-setosa
5.1 3.5 1.4 0.3 % Iris-setosa
5.7 3.8 1.7 0.3 % Iris-setosa
5.1 3.8 1.5 0.3 % Iris-setosa
5.4 3.4 1.7 0.2 % Iris-setosa
5.1 3.7 1.5 0.4 % Iris-setosa
4.6 3.6 1.0 0.2 % Iris-setosa
5.1 3.3 1.7 0.5 % Iris-setosa
4.8 3.4 1.9 0.2 % Iris-setosa
5.0 3.0 1.6 0.2 % Iris-setosa
5.0 3.4 1.6 0.4 % Iris-setosa
5.2 3.5 1.5 0.2 % Iris-setosa
5.2 3.4 1.4 0.2 % Iris-setosa
4.7 3.2 1.6 0.2 % Iris-setosa
4.8 3.1 1.6 0.2 % Iris-setosa
5.4 3.4 1.5 0.4 % Iris-setosa
5.2 4.1 1.5 0.1 % Iris-setosa
5.5 4.2 1.4 0.2 % Iris-setosa
4.9 3.1 1.5 0.1 % Iris-setosa
5.0 3.2 1.2 0.2 % Iris-setosa
5.5 3.5 1.3 0.2 % Iris-setosa
4.9 3.1 1.5 0.1 % Iris-setosa
4.4 3.0 1.3 0.2 % Iris-setosa
5.1 3.4 1.5 0.2 % Iris-setosa
5.0 3.5 1.3 0.3 % Iris-setosa
4.5 2.3 1.3 0.3 % Iris-setosa
4.4 3.2 1.3 0.2 % Iris-setosa
5.0 3.5 1.6 0.6 % Iris-setosa
5.1 3.8 1.9 0.4 % Iris-setosa
4.8 3.0 1.4 0.3 % Iris-setosa
5.1 3.8 1.6 0.2 % Iris-setosa
4.6 3.2 1.4 0.2 % Iris-setosa
5.3 3.7 1.5 0.2 % Iris-setosa
5.0 3.3 1.4 0.2 % Iris-setosa
7.0 3.2 4.7 1.4 % Iris-versicolor
6.4 3.2 4.5 1.5 % Iris-versicolor
6.9 3.1 4.9 1.5 % Iris-versicolor
5.5 2.3 4.0 1.3 % Iris-versicolor
6.5 2.8 4.6 1.5 % Iris-versicolor
5.7 2.8 4.5 1.3 % Iris-versicolor
6.3 3.3 4.7 1.6 % Iris-versicolor
4.9 2.4 3.3 1.0 % Iris-versicolor
6.6 2.9 4.6 1.3 % Iris-versicolor
5.2 2.7 3.9 1.4 % Iris-versicolor
5.0 2.0 3.5 1.0 % Iris-versicolor
5.9 3.0 4.2 1.5 % Iris-versicolor
6.0 2.2 4.0 1.0 % Iris-versicolor
6.1 2.9 4.7 1.4 % Iris-versicolor
5.6 2.9 3.6 1.3 % Iris-versicolor
6.7 3.1 4.4 1.4 % Iris-versicolor
5.6 3.0 4.5 1.5 % Iris-versicolor
5.8 2.7 4.1 1.0 % Iris-versicolor
6.2 2.2 4.5 1.5 % Iris-versicolor
5.6 2.5 3.9 1.1 % Iris-versicolor
5.9 3.2 4.8 1.8 % Iris-versicolor
6.1 2.8 4.0 1.3 % Iris-versicolor
6.3 2.5 4.9 1.5 % Iris-versicolor
6.1 2.8 4.7 1.2 % Iris-versicolor
6.4 2.9 4.3 1.3 % Iris-versicolor
6.6 3.0 4.4 1.4 % Iris-versicolor
6.8 2.8 4.8 1.4 % Iris-versicolor
6.7 3.0 5.0 1.7 % Iris-versicolor
6.0 2.9 4.5 1.5 % Iris-versicolor
5.7 2.6 3.5 1.0 % Iris-versicolor
5.5 2.4 3.8 1.1 % Iris-versicolor
5.5 2.4 3.7 1.0 % Iris-versicolor
5.8 2.7 3.9 1.2 % Iris-versicolor
6.0 2.7 5.1 1.6 % Iris-versicolor
5.4 3.0 4.5 1.5 % Iris-versicolor
6.0 3.4 4.5 1.6 % Iris-versicolor
6.7 3.1 4.7 1.5 % Iris-versicolor
6.3 2.3 4.4 1.3 % Iris-versicolor
5.6 3.0 4.1 1.3 % Iris-versicolor
5.5 2.5 4.0 1.3 % Iris-versicolor
5.5 2.6 4.4 1.2 % Iris-versicolor
6.1 3.0 4.6 1.4 % Iris-versicolor
5.8 2.6 4.0 1.2 % Iris-versicolor
5.0 2.3 3.3 1.0 % Iris-versicolor
5.6 2.7 4.2 1.3 % Iris-versicolor
5.7 3.0 4.2 1.2 % Iris-versicolor
5.7 2.9 4.2 1.3 % Iris-versicolor
6.2 2.9 4.3 1.3 % Iris-versicolor
5.1 2.5 3.0 1.1 % Iris-versicolor
5.7 2.8 4.1 1.3 % Iris-versicolor
6.3 3.3 6.0 2.5 % Iris-virginica
5.8 2.7 5.1 1.9 % Iris-virginica
7.1 3.0 5.9 2.1 % Iris-virginica
6.3 2.9 5.6 1.8 % Iris-virginica
6.5 3.0 5.8 2.2 % Iris-virginica
7.6 3.0 6.6 2.1 % Iris-virginica
4.9 2.5 4.5 1.7 % Iris-virginica
7.3 2.9 6.3 1.8 % Iris-virginica
6.7 2.5 5.8 1.8 % Iris-virginica
7.2 3.6 6.1 2.5 % Iris-virginica
6.5 3.2 5.1 2.0 % Iris-virginica
6.4 2.7 5.3 1.9 % Iris-virginica
6.8 3.0 5.5 2.1 % Iris-virginica
5.7 2.5 5.0 2.0 % Iris-virginica
5.8 2.8 5.1 2.4 % Iris-virginica
6.4 3.2 5.3 2.3 % Iris-virginica
6.5 3.0 5.5 1.8 % Iris-virginica
7.7 3.8 6.7 2.2 % Iris-virginica
7.7 2.6 6.9 2.3 % Iris-virginica
6.0 2.2 5.0 1.5 % Iris-virginica
6.9 3.2 5.7 2.3 % Iris-virginica
5.6 2.8 4.9 2.0 % Iris-virginica
7.7 2.8 6.7 2.0 % Iris-virginica
6.3 2.7 4.9 1.8 % Iris-virginica
6.7 3.3 5.7 2.1 % Iris-virginica
7.2 3.2 6.0 1.8 % Iris-virginica
6.2 2.8 4.8 1.8 % Iris-virginica
6.1 3.0 4.9 1.8 % Iris-virginica
6.4 2.8 5.6 2.1 % Iris-virginica
7.2 3.0 5.8 1.6 % Iris-virginica
7.4 2.8 6.1 1.9 % Iris-virginica
7.9 3.8 6.4 2.0 % Iris-virginica
6.4 2.8 5.6 2.2 % Iris-virginica
6.3 2.8 5.1 1.5 % Iris-virginica
6.1 2.6 5.6 1.4 % Iris-virginica
7.7 3.0 6.1 2.3 % Iris-virginica
6.3 3.4 5.6 2.4 % Iris-virginica
6.4 3.1 5.5 1.8 % Iris-virginica
6.0 3.0 4.8 1.8 % Iris-virginica
6.9 3.1 5.4 2.1 % Iris-virginica
6.7 3.1 5.6 2.4 % Iris-virginica
6.9 3.1 5.1 2.3 % Iris-virginica
5.8 2.7 5.1 1.9 % Iris-virginica
6.8 3.2 5.9 2.3 % Iris-virginica
6.7 3.3 5.7 2.5 % Iris-virginica
6.7 3.0 5.2 2.3 % Iris-virginica
6.3 2.5 5.0 1.9 % Iris-virginica
6.5 3.0 5.2 2.0 % Iris-virginica
6.2 3.4 5.4 2.3 % Iris-virginica
5.9 3.0 5.1 1.8]; % Iris-virginica

 

iris_data = Iris_data(:,1:4)';

% Massage the input values: scale each attribute to the range [0, 1]

for n=1:4
iris_inputs(n,:)=(iris_data(n,:)-min(iris_data(n,:)))/...
(max(iris_data(n,:))-min(iris_data(n,:)));
end
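
% The same scaling can be done without the loop; this vectorised form
% (a sketch using bsxfun, available in all modern MATLAB releases)
% recomputes iris_inputs and gives the identical result:
row_min = min(iris_data,[],2);
row_rng = max(iris_data,[],2)-row_min;
iris_inputs = bsxfun(@rdivide,bsxfun(@minus,iris_data,row_min),row_rng);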

iris_target1 = [1 0 0]'; setosa=find(iris_target1);
iris_target2 = [0 1 0]'; versicolor=find(iris_target2);
iris_target3 = [0 0 1]'; virginica=find(iris_target3);

for n=1:(50-1)
iris_target1=[iris_target1 iris_target1(:,1)];
iris_target2=[iris_target2 iris_target2(:,1)];
iris_target3=[iris_target3 iris_target3(:,1)];
end

iris_targets = [iris_target1 iris_target2 iris_target3];
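
% Equivalently, the one-hot target matrix can be built in a single step;
% this overwrites iris_targets with the identical 3-by-150 matrix:
iris_targets = [repmat([1;0;0],1,50) repmat([0;1;0],1,50) repmat([0;0;1],1,50)];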

disp('Hit any key to randomly select input vectors to be used in training.')
disp(' ')
pause

p=[]; t=[]; test_p=[]; test_t=[];

for n=1:150
if rand(1)>1/3
p=[p iris_inputs(:,n)];
t=[t iris_targets(:,n)];
else
test_p=[test_p iris_inputs(:,n)];
test_t=[test_t iris_targets(:,n)];
end
end

[m n]=size(test_p);

disp(' ')
fprintf(1,' The training data set contains %.0f elements.\n',(150-n));
fprintf(1,' The test data set contains %.0f elements.\n',n);
disp(' ')

echo on

% Hit any key to define the network architecture.
pause

s1=5; % Five neurons in the hidden layer
s2=3; % Three neurons in the output layer

% Hit any key to create the network, initialise its weights and biases,
% and set up training parameters.
pause

rand('seed',1243);

net = newff(minmax(p),[s1 s2],{'logsig' 'purelin'},'traingdx');
% minmax(p) supplies the ranges of the normalised inputs; the raw-data ranges
% [4.3 7.9; 2.0 4.4; 1.0 6.9; 0.1 2.5] no longer apply after the scaling above

net.trainParam.show=20; % Number of epochs between showing the progress
net.trainParam.epochs=1000; % Maximum number of epochs
net.trainParam.goal=0.001; % Performance goal
net.trainParam.lr=0.01; % Learning rate
net.trainParam.lr_inc=1.05; % Learning rate increase multiplier
net.trainParam.lr_dec=0.7; % Learning rate decrease multiplier
net.trainParam.mc=0.9; % Momentum constant

% Hit any key to train the back-propagation network.
pause

net=train(net,p,t);
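
% Note: newff was removed from later Neural Network Toolbox releases. On a
% newer MATLAB, an equivalent experiment could be run with patternnet (this
% is an assumption about the current API, not part of the original demo):
%   net2 = patternnet(5,'traingdx');
%   net2.trainParam.epochs = 1000; net2.trainParam.goal = 0.001;
%   net2 = train(net2,p,t);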

echo off

disp(' ')
fprintf(1,' Iris-setosa is represented by output: %.0f \n',setosa);
fprintf(1,' Iris-versicolor is represented by output: %.0f \n',versicolor);
fprintf(1,' Iris-virginica is represented by output: %.0f \n',virginica);

disp(' ')
disp(' Hit any key to test the network using the test data set.')
disp(' ')
pause

n_setosa=0; n_versicolor=0; n_virginica=0;
error_setosa=0; error_versicolor=0; error_virginica=0; error_total=0;

fprintf(' Sepal length Sepal width Petal length Petal width Desired output Actual output Error\n');

for i=1:n
fprintf(' %.1f %.1f %.1f %.1f',test_p(1,i),test_p(2,i),test_p(3,i),test_p(4,i));
a=compet(sim(net,test_p(:,i))); a=find(a);
b=compet(test_t(:,i)); b=find(b);
if b==1
n_setosa=n_setosa+1;
fprintf(' Iris-setosa ');
if abs(a-b)>0
error_setosa=error_setosa+1;
fprintf('%.0f Yes\n',a);
else
fprintf('%.0f No\n',a);
end
elseif b==2
n_versicolor=n_versicolor+1;
fprintf(' Iris-versicolor ');
if abs(a-b)>0
error_versicolor=error_versicolor+1;
fprintf('%.0f Yes\n',a);
else
fprintf('%.0f No\n',a);
end
else
n_virginica=n_virginica+1;
fprintf(' Iris-virginica ');
if abs(a-b)>0
error_virginica=error_virginica+1;
fprintf('%.0f Yes\n',a);
else
fprintf('%.0f No\n',a);
end
end
end

error_total=(error_setosa+error_versicolor+error_virginica)/n*100;

error_setosa=error_setosa/n_setosa*100;
error_versicolor=error_versicolor/n_versicolor*100;
error_virginica=error_virginica/n_virginica*100;

fprintf(1,' \n')
fprintf(1,' Iris-setosa recognition error: %.2f \n',error_setosa);
fprintf(1,' Iris-versicolor recognition error: %.2f \n',error_versicolor);
fprintf(1,' Iris-virginica recognition error: %.2f \n',error_virginica);
fprintf(1,' \n')
fprintf(1,' Total Iris plant recognition error: %.2f \n',error_total);
fprintf(1,' \n')

disp('End of Iris_bp.m');

% =============================
% End of the program Iris_bp.m
% =============================

 

If you have any queries, you can e-mail me at srinath.krishnamoorthy@villacollege.edu.mv.

About me :

I'm Srinath Krishnamoorthy. I hold an MTech in Computer Science and Engineering from MA College of Engineering, Kothamangalam, and a BTech in Information Technology from Government Engineering College, Palakkad. I teach Artificial Intelligence for the University of the West of England (BSc Computer Science) in Malé, Maldives. My areas of interest are AI, Data Analytics, Computational Intelligence and Theory of Computation.

Multilayer Neural Network: Implementing the Back-Propagation Algorithm (Scilab Program)

//Multilayer neural network BP algorithm (learns the XOR function)
//Author : Srinath Krishnamoorthy
//Date : 10-08-2017
//(c) Copyright Srinath Krishnamoorthy, 2017

clear
clc

//Initialise the input values at Layer 1
x=[0 0
0 1
1 0
1 1];

yd=[0;1;1;0];//Desired Output at Y
ya=rand(4,1);//Actual Output

//Initialise the weights from i/p to hidden layer

w_ih=rand(2,2);
w_initial=w_ih;

//Initialise the weights from hidden to output layer

w_h1y=rand(1); //Hidden Neuron 1 to o/p neuron Y
w_h2y=rand(1); //Hidden Neuron 2 to o/p neuron Y

w_h1y_initial=w_h1y;
w_h2y_initial=w_h2y;

//Set the bias of the neurons-h1,h2 and y

bh1=-1;
bh2=-1;
by=-1;

//Set the thresholds for each neuron

th1=rand(1);
th2=rand(1);
ty=rand(1);

//Error at Y

err_y=0.00;

//Error gradient at h1,h2 and Y
err_grad_h1=0.00;
err_grad_h2=0.00;
err_grad_y =0.00;

lr=0.5;//Learning rate

flag=0;
net_h1=0;//Net output at h1
net_h2=0;//Net output at h2
net_y =0;//Net output at Y

//Actual output of h1,h2 and Y will be the sigmoid of their net outputs.

actual_h1=0.00;
actual_h2=0.00;
actual_y=0.00;

epoch=0;//Counts the number of cycles

delta_wh1y=0.00;
delta_wh2y=0.00;
delta_ty=0.00;

//Sum of squared errors; training repeats until this falls below a threshold

sum_sqr_err=0.00;
errors=zeros(4,1);

while flag==0 do

for i=1:4
//calculate the net output of hidden neuron h1
for j=1:2
net_h1=net_h1+x(i,j)*w_ih(j,1);
end;

//calculate the net output of hidden neuron h2
for j=1:2
net_h2=net_h2+x(i,j)*w_ih(j,2);
end;
//Applying Bias and Threshold, net values at h1 and h2 will be
net_h1=net_h1+(bh1*th1);
net_h2=net_h2+(bh2*th2);

//Actual Output is the Sigmoid of net output at h1 and h2

actual_h1=1/(1+%e^(-net_h1));
actual_h2=1/(1+%e^(-net_h2));

//Now we need to calculate the net output at Y
net_y=(actual_h1*w_h1y)+(actual_h2*w_h2y)+(by*ty);
//Thus actual output at Y is sigmoid of net_y
actual_y=1/(1+%e^(-net_y));

//Calculate the error at Y
err_y=yd(i,1)-actual_y;
ya(i,1)=actual_y;
errors(i,1)=err_y;
//Calculate the error gradient at Y
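//(for a sigmoid output, dy/dnet = y*(1-y), hence the y*(1-y)*error factor)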
err_grad_y=actual_y*(1-actual_y)*err_y;
//Now we go for weight correction
delta_wh1y=lr*actual_h1*err_grad_y;
delta_wh2y=lr*actual_h2*err_grad_y;
delta_ty=lr*by*err_grad_y;

// Now we calculate the err gradient of hidden neurons
err_grad_h1=actual_h1*(1-actual_h1)*err_grad_y*w_h1y;
err_grad_h2=actual_h2*(1-actual_h2)*err_grad_y*w_h2y;

//Weight corrections for hidden neuron h1:

for j=1:2
w_ih(j,1)=w_ih(j,1)+lr*x(i,j)*err_grad_h1;
end;
//Adjust the threshold of the hidden neuron h1
th1=th1+lr*bh1*err_grad_h1;

//Weight corrections for hidden neuron h2:

for j=1:2
w_ih(j,2)=w_ih(j,2)+lr*x(i,j)*err_grad_h2;
end;
//Adjust the threshold of the hidden neuron h2
th2=th2+lr*bh2*err_grad_h2;

//Now we adjust all weights and threshold levels from hidden layer to output layer

w_h1y=w_h1y+delta_wh1y;
w_h2y=w_h2y+delta_wh2y;
ty=ty+delta_ty;

//We reset the output values prior to next iteration
net_h1=0.00;
net_h2=0.00;
net_y=0.00;
actual_h1=0.00;
actual_h2=0.00;
actual_y=0.00;
err_y=0.00;
err_grad_y=0.00;
err_grad_h1=0.00;
err_grad_h2=0.00;
delta_wh1y=0.00;
delta_wh2y=0.00;
delta_ty=0.00;
end //End of main for() loop
epoch=epoch+1;

for k=1:4
sum_sqr_err=sum_sqr_err + errors(k,1)^2;
end;
//Sum of squared errors (SSE) is a useful indicator of the network's performance.
//Negnevitsky (3rd edn, p. 183) suggests a stopping criterion of SSE <= 0.001.

if sum_sqr_err > 0.0010 then
flag=0;
else
flag=1;
end;
disp(sum_sqr_err,'Sum of Squared Errors = ');
disp(errors,'The errors after this epoch are: ');
sum_sqr_err=0.00;
errors=zeros(4,1);
disp(ya,'Actual Output = ');
disp(yd,'Desired Output');
disp(epoch,'End of epoch');
disp('********************************************************************************');
end
//End of the while loop
disp(epoch,'The number of epochs required is:',lr,'For the learning rate')

disp(w_initial,'Initial weights between input layer and hidden layer: ');

disp(w_ih,'Final weights between input layer and hidden layer: ');

disp(w_h1y_initial,'Initial weight between hidden neuron h1 and output neuron Y: ');

disp(w_h2y_initial,'Initial weight between hidden neuron h2 and output neuron Y: ');

disp(w_h1y,'Final weight between hidden neuron h1 and output neuron Y: ');

disp(w_h2y,'Final weight between hidden neuron h2 and output neuron Y: ');

plot(yd,ya);
//Plot of desired vs. actual outputs; after convergence the four points should lie close to the line y = x


Scilab Program for Training a Perceptron

//Author : Srinath Krishnamoorthy
//Date : 10-07-2017

clear
clc
//initialise the inputs
x=[1 0 0
1 0 1
1 1 0
1 1 1];
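//Note: the first column of x is a constant bias input of 1, so its weight
//w(1,1) learns the threshold; this is why thresh can stay at 0 below.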
disp('Input is:');
disp(' B x1 x2');
disp(x);
yd=[0;0;0;1]; //This is for the AND gate; change it to any linearly separable function of your choice
disp('Target Output Yd is:');
disp(yd);

ya=rand(4,1);

//Initialise the weights
w=rand(1,3);
w1=w; //keep a copy of the initial weights

disp('Initial Random Weights:');
disp(' W1 W2 W3');
disp(w);

lr=0.5;
disp('Learning Coefficient =');
disp(lr);

flag=0;
net=0;
err=0;
epoch=0;
thresh=0;

while flag==0 do
for i=1:4
for j=1:3
net=net+w(1,j)*x(i,j);
end;
if net >= thresh then
ya(i,1)=1;
else
ya(i,1)=0;
end;

err=yd(i,1)-ya(i,1);
for j=1:3
w(1,j)=w(1,j)+ (lr*x(i,j)*err);
end;
net=0.00; //Reset net for next iteration
end
disp(ya,'Actual Output');
disp(yd,'Desired Output');

epoch=epoch+1;
disp('End of Epoch No:');
disp(epoch);
disp('************************************************************');
if epoch > 1000 then
disp('Learning attempt failed!')
break
end;

if yd(1,1) == ya(1,1)& yd(2,1) == ya(2,1) & yd(3,1) == ya(3,1) & yd(4,1) == ya(4,1) then
flag=1;
else
flag=0;
end
end
disp('Initial Random Weights -');
disp(w1);
disp('Final Adjusted Weights -');
disp(w);
disp(lr,'Learning rate is - ')
disp('***********************************')
plot(yd,ya);

 

Fuzzy Logic Controller for an Automatic Braking System

This is a step-by-step tutorial on how to create a fuzzy logic controller for an automatic braking system using MATLAB (R2015a). We shall discuss how a typical automotive braking system, shown below, can be automated based on a set of input parameters.

[Image: a typical automotive braking system. Source: www.carparts.com]

The problem at hand has been adapted from Artificial Intelligence Illuminated by Ben Coppin (1st Edition, Jones & Bartlett), pp. 516-522.


Before we look at the problem, we need to understand the four key steps in creating a Fuzzy Logic System (FLS):

  1. Define inputs and outputs
  2. Create a membership function
  3. Create rules
  4. Simulate the fuzzy logic system / Fuzzy Inference System (FIS)

To get clarity on the above steps, we have to analyze the problem at hand. The problem given in the textbook may look trivial, but a real-world scenario can be quite complicated and may comprise thousands of rules.

[Image: the rule base for the braking problem, from Coppin, pp. 516-522]

 

As per the above problem, there are three inputs:

  1. Pressure on the brake pedal (brake pressure)
  2. Car speed
  3. Wheel speed

And one output:

  1. Brake (which has two decision parameters: Apply Brake or Release Brake)

To apply these rules using Mamdani inference, the first step is to fuzzify the crisp input values. To do this, we first need to define the fuzzy sets for the various linguistic variables we are using. For this simple example, we will assume that brake pressure is measured from 0 (no pressure) to 100 (brake fully applied).

We will define brake pressure as having three linguistic values: high (H), medium (M), and low (L), which we will define as follows:

                                             H = {(50, 0), (100, 1)}
                                            M = {(30, 0), (50, 1), (70, 0)}
                                            L = {(0, 1), (50, 0)}

Even though the author treats the membership function for the input parameter "brake_pressure" as triangular, I would opt for a Gaussian membership function, which suits the input requirement here.

You can create an FIS using MATLAB or Scilab; both have built-in fuzzy logic toolkits for this. I'm using MATLAB.

Step 1: Open MATLAB and type fuzzy on the command line to invoke the Fuzzy Logic Toolbox designer as shown:

fuzzy

This will open the Fuzzy Logic Designer console as follows:

[Screenshot: the Fuzzy Logic Designer console]

Step 2: Click on the "inputs" section (in yellow) to add inputs. Note that one input is already defined here; we need to add two more for our problem. For this, go to Edit > Add Variable > Input.

[Screenshot: adding input variables]

Step 3: Double-click on "BrakePressure" and click on the red curve. Rename it from "mf1" to "low" and change the Type from "trimf" to "gaussmf" as shown below. Do the same for medium and high by clicking on their curves:

[Screenshot: BrakePressure membership functions]

Step 4: Now let's fuzzify the car speed input. This can use trapezoidal and triangular membership functions, since we have a speedometer that can objectively report the speed (e.g. 55.5 mph). Double-click on the rectangle that holds the CarSpeed variable. We will define the car speed as also having three linguistic values: slow, medium, and fast. We will define the membership functions for these values over a universe of discourse from 0 to 100:
                                                          S = {(0, 1), (60, 0)}
                                                         M = {(20, 0), (50, 1), (80, 0)}
                                                         F = {(40, 0), (100, 1)}
If the car speed is in fact 55, then this gives us membership values as follows:
                                                           M[slow(55)] = 0.083
                                                           M[medium(55)]= 0.833
                                                           M[fast(55)] = 0.25

**M[] stands for "membership of".
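
These membership values can be checked numerically. A minimal MATLAB sketch using base interp1 on the breakpoints given above (the Fuzzy Logic Toolbox equivalents would be trapmf and trimf):

% linear interpolation on the (speed, membership) breakpoints, at speed 55
m_slow   = interp1([0 60],[1 0],55)        % 0.0833
m_medium = interp1([20 50 80],[0 1 0],55)  % 0.8333
m_fast   = interp1([40 100],[0 1],55)      % 0.2500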

[Screenshot: CarSpeed membership functions]

Step 5: Set the wheel speed with the same parameters as the car speed, based on three linguistic values: slow, medium, and fast.

[Screenshots: WheelSpeed membership functions for slow, medium, and fast]

Step 6: Now that we have defined the inputs, let's define the output, Brake (with the two values Apply_Brake and Release_Brake). Both use trapezoidal membership functions (trapmf):

[Image: Brake output membership functions. Source: AI Illuminated, Ben Coppin, p. 520]

 


By default, the system uses the "Mamdani" model. Here the defuzzification method is the centroid method:

[Screenshot: defuzzification method set to centroid]

The formula for the centroid method (centre of gravity, COG) is:

COG = ∫ μA(x)·x dx / ∫ μA(x) dx, with both integrals taken over the universe of discourse of the output variable.
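
MATLAB can evaluate this integral numerically with defuzz. A minimal sketch, assuming the Fuzzy Logic Toolbox is installed; the aggregate membership curve here is an arbitrary illustrative triangle, not the one from the book:

x  = linspace(0,100,1001);                      % universe of discourse
mf = interp1([20 50 80],[0 1 0],x,'linear',0);  % triangular aggregate set, zero outside
cog = defuzz(x,mf,'centroid')                   % centre of gravity (50 here, by symmetry)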

How the fuzzy values of the rule antecedents are applied to the consequents is shown below:

[Image: clipping of the consequent fuzzy sets. Source: AI Illuminated, Ben Coppin, p. 521]

Step 7: Remember the rules? Now that we have defined the fuzzy variables and their membership functions, we need to create the rule base by feeding it into the system. For this, go to Edit > Rules to open the Rule Editor. You can add, edit, and delete rules as shown below. Once done, click "Close" to return to the Fuzzy Logic Designer.

[Screenshot: the Rule Editor]

Step 8: You can save the FIS to disk by exporting it as shown below:

[Screenshot: exporting the FIS to a file]

You can also export it to your workspace so that its instance is available throughout the session and you can invoke it whenever you want. To do this, go to File > Export > To Workspace:

Click "OK".

[Screenshots: the export dialog and the exported workspace variable]

Step 9: Now your FIS (Fuzzy Inference System) for the automatic braking system is ready.

In the Fuzzy Logic Designer window, go to View > Rules to see all the rule sets as shown below:

[Screenshot: the Rule Viewer]

You can drag the red line (|) to the left or right and see how the crisp output varies for various inputs, as below:

[Screenshot: the Rule Viewer with varied inputs]

Step 10: Plotting the surface. If you are using Scilab, you need to use a function called plotsurf() to plot the surface created by the FIS you have just built, but MATLAB makes things easier: just go to View > Surface to view it as shown below:

[Screenshot: the FIS output surface]

You can rotate it by dragging the surface:

[Screenshot: the rotated surface]

You can also view the output in terms of just one parameter by varying the selection criteria:

[Screenshot: the output against a single input parameter]

The above plot shows how the output (Brake) varies with respect to Car_Speed. You can try it out for the other input parameters, Wheel_Speed and Brake_Pressure.
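
The GUI steps above can also be scripted. Below is a minimal sketch using the classic pre-R2017 Fuzzy Logic Toolbox API (newfis, addvar, addmf, addrule, evalfis). The membership parameters are illustrative stand-ins for the shapes built above, and the two rules are placeholders just to make the sketch runnable; substitute the actual rule base from Coppin, pp. 516-522:

fis = newfis('braking');                                 % Mamdani model, centroid defuzzification
fis = addvar(fis,'input','BrakePressure',[0 100]);
fis = addmf(fis,'input',1,'low','gaussmf',[15 0]);
fis = addmf(fis,'input',1,'medium','gaussmf',[10 50]);
fis = addmf(fis,'input',1,'high','gaussmf',[15 100]);
fis = addvar(fis,'input','CarSpeed',[0 100]);
fis = addmf(fis,'input',2,'slow','trapmf',[-1 0 0 60]);
fis = addmf(fis,'input',2,'medium','trimf',[20 50 80]);
fis = addmf(fis,'input',2,'fast','trapmf',[40 100 101 102]);
fis = addvar(fis,'input','WheelSpeed',[0 100]);
fis = addmf(fis,'input',3,'slow','trapmf',[-1 0 0 60]);
fis = addmf(fis,'input',3,'medium','trimf',[20 50 80]);
fis = addmf(fis,'input',3,'fast','trapmf',[40 100 101 102]);
fis = addvar(fis,'output','Brake',[0 100]);
fis = addmf(fis,'output',1,'ReleaseBrake','trapmf',[-1 0 30 50]);
fis = addmf(fis,'output',1,'ApplyBrake','trapmf',[50 70 100 101]);
% placeholder rules; row format: [BrakePressure CarSpeed WheelSpeed Brake weight AND/OR]
rules = [3 0 0 2 1 1;   % IF BrakePressure is high THEN Brake is ApplyBrake
         1 3 3 1 1 1];  % IF BrakePressure is low AND CarSpeed is fast AND WheelSpeed is fast THEN ReleaseBrake
fis = addrule(fis,rules);
out = evalfis([60 55 55],fis)   % crisp brake output for the given crisp inputs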

References:

  1. Coppin, B., Artificial Intelligence Illuminated, 1st edn, Jones and Bartlett.
  2. Negnevitsky, M., Artificial Intelligence: A Guide to Intelligent Systems, 3rd edn, Addison Wesley.
  3. http://www.mathworks.com


Perceptron Learning Algorithm in C

[Image: Rosenblatt's perceptron]

[Screenshots: the C source code for the perceptron learning algorithm]

You can vary the learning coefficient (0 < lr < 1), the desired output, the initial weights, and so on, and then see the difference in the number of epochs it takes for the perceptron to learn a particular pattern.

For example, for the AND gate Yd = [0 0 0 1], and for the OR gate Yd = [0 1 1 1].

Here we display the output for the AND operation:

[Screenshots: program output for the AND operation]

 

Thank you.

 

The author holds an MTech in Computer Science and Engineering from M.A. College of Engineering, Kothamangalam. He currently teaches Computer Science (UWE) at Villa College, Malé, Maldives.

For any queries, you can reach the author at srinathtk86@gmail.com.

The Lesson

 


A big lesson for youngsters who are thinking that revolution is just around the corner:

1. Study well and get a good job.

2. Family first. Because those who are giving heart-thumping speeches, writing lengthy blogs and making revolutionary movies have cleverly insulated themselves and their families from the very thing they stand for. Look around. They are getting hefty salaries and pensions. Their children are millionaires riding fancy cars and bikes. Most of them are studying abroad and enjoying a luxurious life. Don't waste your time on political parties, politicians or political ideologies. They are never gonna make your life better. In fact, they have never made anybody's life better. They are used to using you. When was the last time a student body affiliated to any political party in India staged a protest against unemployment and lack of infrastructure in educational institutions? (Hard nut to crack...)

There is absolutely no point in wasting your time idolising politicians or movie stars or anything or anybody for any reason.

They have their lives and most of them don’t care beyond that. Plain and simple.

3. Never let any ideology or politician or religious entity brainwash you for any fu***’in reason under the sky ‘coz that baggage will be nothing but a burden in the latter stages of your life. You gotta live with it forever or flush it down the closet at some point in time.

4. The loss is yours and yours alone.

5. The people you are willing to die for will drag your parents on the streets once you are gone. That's just how important each one of you (us) is to them, and maybe to society as a whole.

6. Movies will come, rake in profits and fade into oblivion. Do not let the things that you watch in theatres get into your brain. Try to critically evaluate things.

7. At the end of the day, your life is your responsibility… ‘coz no one gives a shit.

8. And last but not least, people who revolt on #Facebook and #twitter will remain there forever. They will never swarm the streets and stand up for you. Open your eyes and get a life before it is too late.

– Srinath Krishnamoorthy

#Justice4Jishnu

Foreword by Suhail Mathur

[Image: cover of Hope We Never Meet Again]

The first thing that comes to my mind after reading Srinath Krishnamoorthy's spellbinding novel is that the savage in man is never quite eradicated. I'm a person who loves travelling and has many a time come across random phone numbers scribbled on nameless walls. But I never gave them a second thought, and never expected in my life to read a mesmerising thriller that spins out a plot from a mobile number scribbled on the walls of a railway toilet.

Hope We Never Meet Again is a novel that explores the journey of a man and his evolution from existence to death. The book is a deep dive into our own minds and our own sense of truth and fallacy. After reading the novel, I am sure most readers will contemplate a lot on the fundamental questions of existence. About death and its aftermath. About crime and the ultimate punishment at the hands of destiny.

Are all deaths natural? Or are they murders in disguise? Will the murderer ever get punished in this shallow world? Srinath Krishnamoorthy has been successful in creating a plot that keeps the reader hooked, and he does answer it all. The novel is unique in its narrative and its craft. Yes, it's a new way of storytelling.

When I read the first chapter, I was hooked and did not put it down till I had finished the last page.  Srinath immediately caught my attention by breaking the silence and taboos of sexuality and relationships by the way he described the thoughts of desire that naturally run through everyone’s mind.  He has carried that “Indian spiciness” throughout the book especially in the last three chapters… Wow!   He gives the auto-rickshaw a whole new meaning!  No wonder this book is flying off shelves and into the hands of readers who turn the pages like the wind!  Sexuality is a normal part of life but some refuse to acknowledge that.  Pretending it doesn’t exist is unrealistic and unfortunate. Bravo for speaking up about it all, I was hooked and obviously, I wasn’t the only one!

What I feel is unique about the novel is the fact that the author has masterfully blended the idea of dreams, realities, myths and the bitter hard truths of life into an intricate, yet beautiful novel.

I was literally stunned by the way the author had bound together different storylines through flashbacks and suspenseful twists and turns at the end of each chapter. It's all tied up with one astounding string called "words". There are several lessons in life integrated within the book as well, like how it's important to say what you need to say to someone before it's too late. And sometimes taking the wrong path in life is okay, because it leads you to the right one eventually. Most of all, one small shift or decision, such as scribbling down a phone number from a dirty railway toilet wall, can change the very course of a life, because ultimately it's our actions and decisions that shape us.

I would like to give two big thumbs up to Srinath Krishnamoorthy and "Hope We Never Meet Again". It is one of the best novels I have read in ages.

Suhail Mathur