Sunday, October 12, 2008

A20 – Neural Networks

This activity, like the last two, aims to identify which class an object belongs to. This time we use a neural network, which is composed of three parts: the input layer, the hidden layer, and the output layer. The input layer receives the features that distinguish the objects, and the output layer tells which class an object belongs to. I used the same classes as in Activity 19: quail eggs and squid balls. A neural network must first be trained on objects of known classification before it can classify objects efficiently on its own. The code, originally written by Jeric and modified for my data set, is shown below.

--------------------------------------------------------------------------------------------------

// Simple feedforward NN that learns to separate the two classes

// ensure the same starting point each time
rand('seed',0);

// network def.
// - neurons per layer, including input
// 3 neurons in the input layer, 8 in the hidden layer and 1 in the output layer
N = [3,8,1];

// inputs
train = fscanfMat("train.txt")'; // training set, one feature vector per column
x = train;

// targets, 0 if squidballs, 1 if quail eggs
t = [0 0 0 0 1 1 1 1 ];
// learning rate is 4 and 0 is the threshold for the error tolerated by the network
lp = [4,0];

W = ann_FF_init(N);

// 1000 training cycles
T = 1000;
W = ann_FF_Std_online(x,t,N,W,lp,T);
// x is the training set, t holds the target outputs, W the initialized weights,
// N is the NN architecture, lp the learning parameters and T the number of training cycles

// full run
j=ann_FF_run(x,N,W) //the network N was tested using x as the test set, and W as the weights of the connections

--------------------------------------------------------------------------------------------------
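The ANN toolbox calls above hide the training mechanics. As a rough cross-check, the same 3-8-1 sigmoid network trained by backpropagation can be sketched in plain NumPy; the feature vectors below are made-up stand-ins, since the contents of train.txt are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up stand-in for train.txt: 8 samples x 3 features,
# first four are class 0 (squid balls), last four class 1 (quail eggs)
X = np.vstack([rng.normal(0.2, 0.05, (4, 3)),
               rng.normal(0.8, 0.05, (4, 3))])
t = np.array([0, 0, 0, 0, 1, 1, 1, 1], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 3-8-1 architecture, matching N = [3, 8, 1] in the Scilab code
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 4.0                    # same learning rate as lp = [4, 0]
for epoch in range(1000):   # 1000 training cycles, as T = 1000
    h = sigmoid(X @ W1 + b1)          # hidden layer activations
    y = sigmoid(h @ W2 + b2).ravel()  # output layer activations
    err = y - t
    # backpropagate the squared-error gradient
    d2 = (err * y * (1 - y))[:, None]
    d1 = (d2 @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d2 / len(X); b2 -= lr * d2.mean(0)
    W1 -= lr * X.T @ d1 / len(X); b1 -= lr * d1.mean(0)

# round the outputs to get the predicted class labels
pred = (y > 0.5).astype(int)
print(pred)
```

On cleanly separable features like these, the thresholded outputs should reproduce the target labels after training.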

The results were
[5.34E-03 7.41E-02 1.44E-02 8.34E-03 0.99132471 0.952957436 0.97008475 0.981131133]

Rounding off the outputs gives 100% accuracy. The weakness of this method is that it must first be trained over many cycles, and it needs a number of samples before it can distinguish between classes. Our brain is far more advanced, since we can automatically tell which is which even after seeing an object only once.
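Concretely, rounding is just a threshold at 0.5; a quick Python check on the outputs above confirms the count:

```python
# Network outputs reported above, against the target labels
outputs = [5.34e-03, 7.41e-02, 1.44e-02, 8.34e-03,
           0.99132471, 0.952957436, 0.97008475, 0.981131133]
targets = [0, 0, 0, 0, 1, 1, 1, 1]

# Rounding off is a threshold at 0.5
predicted = [1 if o >= 0.5 else 0 for o in outputs]
accuracy = sum(p == t for p, t in zip(predicted, targets)) / len(targets)
print(predicted, accuracy)  # all eight match, accuracy 1.0
```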

I give myself a grade of 10 in this activity since I was able to understand and employ neural networks in classifying objects. Thanks to Jeric for the code.

Saturday, October 4, 2008

A19 – Probabilistic Classification

In this activity we classify objects into classes using Linear Discriminant Analysis (LDA). A detailed discussion is given by Dr. Sheila Marcos in the PDF file provided by Dr. Maricor Soriano.
We used two classes from the previous activity: the quail eggs and the squid balls. I had already obtained their features in the previous activity. The features I chose were the means of the normalized r and g values, together with the product of their standard deviations. The features for the two classes are shown below.
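As an illustration of how those three features are computed, here is a short Python sketch; the 4x4 RGB patch is made-up data standing in for an actual cropped image.

```python
import numpy as np

# Made-up 4x4 RGB patch standing in for one cropped object image
rng = np.random.default_rng(3)
img = rng.uniform(0.1, 1.0, (4, 4, 3))

# Normalized chromaticity coordinates: r = R/(R+G+B), g = G/(R+G+B)
total = img.sum(axis=2)
r = img[:, :, 0] / total
g = img[:, :, 1] / total

# The three features used per image: mean r, mean g, and the
# product of the standard deviations of r and g
features = [r.mean(), g.mean(), r.std() * g.std()]
print(features)
```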



Using this data set, I calculated the covariance matrix in Scilab following the process described in the PDF file. The calculated covariance matrix is shown below.
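The covariance step can be sketched in Python as follows; the two feature matrices are made-up stand-ins for the measured quail egg (k) and squid ball (s) features.

```python
import numpy as np

# Made-up feature matrices (8 samples x 3 features per class)
rng = np.random.default_rng(1)
k = rng.normal(0.45, 0.02, (8, 3))   # stand-in for quail egg features
s = rng.normal(0.30, 0.02, (8, 3))   # stand-in for squid ball features

# Global mean over all 16 samples, then mean-center both classes
u = (k.sum(axis=0) + s.sum(axis=0)) / 16
xk, xs = k - u, s - u

# Per-class covariance estimates and their pooled average
ck = xk.T @ xk / 8
cs = xs.T @ xs / 8
C = 0.5 * (ck + cs)
print(C)
```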



Then, using this covariance matrix, we calculate the discriminant values in Scilab. The results are shown below.
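A minimal sketch of the discriminant computation, again on made-up stand-in features and with equal priors of 0.5: each sample is assigned to the class whose discriminant value is larger.

```python
import numpy as np

# Made-up class feature matrices (8 samples x 3 features each)
rng = np.random.default_rng(2)
k = rng.normal(0.45, 0.02, (8, 3))   # stand-in for quail egg features
s = rng.normal(0.30, 0.02, (8, 3))   # stand-in for squid ball features

uk, us = k.mean(axis=0), s.mean(axis=0)
u = (uk + us) / 2                    # global mean (equal class sizes)
ck = (k - u).T @ (k - u) / 8
cs = (s - u).T @ (s - u) / 8
Ci = np.linalg.inv(0.5 * (ck + cs))  # inverse pooled covariance

def f(ui, x, prior=0.5):
    # Linear discriminant: f_i(x) = u_i Ci x' - 0.5 u_i Ci u_i' + ln P_i
    return x @ Ci @ ui - 0.5 * ui @ Ci @ ui + np.log(prior)

# A sample belongs to the class with the larger discriminant value
egg_wins_on_k = f(uk, k) > f(us, k)   # should be True for all egg samples
egg_wins_on_s = f(uk, s) > f(us, s)   # should be False for all squid ball samples
print(egg_wins_on_k, egg_wins_on_s)
```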



We see that the classification was 100% accurate, which means we were successful in employing LDA on our data.

I give myself a grade of 10 for this activity since I achieved a good result using LDA. No one helped me in this activity. I think this is one of the easiest activities.

chdir('D:\Mer\Documents\Majors\AP186\A19\');

// feature extraction for the eight quail egg images
for j=1:8
img = imread("k"+string(j)+".jpg");

// normalized chromaticity coordinates
i = (img(:, :, 1) + img(:, :, 2) + img(:, :, 3));
r = img(:, :, 1)./i;
g = img(:, :, 2)./i;

mnr = mean(r); // mean of normalized r
mng = mean(g); // mean of normalized g
strg = stdev(r)*stdev(g); // product of the standard deviations

k(j,:) = [mnr, mng, strg]; // feature vector of the jth egg image
end

uk=[mean(k(:,1)),mean(k(:,2)),mean(k(:,3))]; // mean feature vector of the quail egg class

// feature extraction for the eight squid ball images
for j=1:8
img = imread("sq"+string(j)+".jpg");

// normalized chromaticity coordinates
i = (img(:, :, 1) + img(:, :, 2) + img(:, :, 3));
r = img(:, :, 1)./i;
g = img(:, :, 2)./i;

mnr = mean(r); // mean of normalized r
mng = mean(g); // mean of normalized g
strg = stdev(r)*stdev(g); // product of the standard deviations

s(j,:) = [mnr, mng, strg]; // feature vector of the jth squid ball image
end

us=[mean(s(:,1)),mean(s(:,2)),mean(s(:,3))]; // mean feature vector of the squid ball class
u=(uk*8+us*8)/16; // global mean feature vector
xk=[];
xs=[];

// mean-centered data
for j=1:8
xk(j,:)=k(j,:)-u;
xs(j,:)=s(j,:)-u;
end

ck=(xk'*xk)/8; // covariance estimate of the quail egg class
cs=(xs'*xs)/8; // covariance estimate of the squid ball class
C=.5*(ck+cs); // pooled covariance (equal class sizes)
Ci=inv(C);

// discriminant values f_i(x) = u_i*Ci*x' - 0.5*u_i*Ci*u_i' + log(P_i), equal priors of 0.5;
// each sample belongs to the class with the larger discriminant
kk=uk*Ci*k'-0.5*uk*Ci*uk'+log(0.5); // egg discriminant on egg samples
ks=us*Ci*k'-0.5*us*Ci*us'+log(0.5); // squid ball discriminant on egg samples

sk=uk*Ci*s'-0.5*uk*Ci*uk'+log(0.5); // egg discriminant on squid ball samples
ss=us*Ci*s'-0.5*us*Ci*us'+log(0.5); // squid ball discriminant on squid ball samples