
LDA matlab code

2014-02-19 13:39
% prophet Mohammed said [ALLAH will help any one helped his/her brother/sister] PBUH
% This code applies LDA (Linear Discriminant Analysis)
% for any information please send to engalaatharwat@hotmail.com
%Egypt - HICIT - +20106091638

% This example deals with 2 classes
c1=[1 2;2 3;3 3;4 5;5 5]  % the first class 5 observations
c2=[1 0;2 1;3 1;3 2;5 3;6 5] % the second class 6 observations
scatter(c1(:,1),c1(:,2),6,'r'),hold on;
scatter(c2(:,1),c2(:,2),6,'b');

% Number of observations of each class
n1=size(c1,1)
n2=size(c2,1)

%Mean of each class
mu1=mean(c1)
mu2=mean(c2)

% Overall mean, taken here as the unweighted average of the two class means
mu=(mu1+mu2)/2

% Center the data (data-mean)
d1=c1-repmat(mu1,size(c1,1),1)
d2=c2-repmat(mu2,size(c2,1),1)

% Calculate the within-class scatter matrix (SW)
s1=d1'*d1
s2=d2'*d2
sw=s1+s2
invsw=inv(sw)

% In the case of only two classes it is enough to use the single direction
% below (see the check after the script):
% v=invsw*(mu1-mu2)'

% For more than 2 classes, calculate the between-class scatter matrix (SB)
sb1=n1*(mu1-mu)'*(mu1-mu)
sb2=n2*(mu2-mu)'*(mu2-mu)
SB=sb1+sb2
v=invsw*SB   % the eigenvectors of inv(SW)*SB give the projection directions

% Find the eigenvalues and eigenvectors of v
[evec,eval]=eig(v)

% Sort the eigenvectors by eigenvalue in descending order and discard the
% eigenvectors that belong to the small eigenvalues,
% i.e. keep v = the eigenvector with the largest eigenvalue,
% or simply use all the eigenvectors.
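% A minimal sketch of the sorting step described above (eigvals, order,
% evec_sorted and w are illustrative names; the semicolons keep the
% displayed output unchanged):
[eigvals,order]=sort(diag(eval),'descend');   % eigenvalues, largest first
evec_sorted=evec(:,order);                    % reorder eigenvectors to match
w=evec_sorted(:,1);                           % keep only the dominant direction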

% project the data of the first and second class respectively
y2=c2*v
y1=c1*v
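
For the two-class case, the commented-out shortcut v = invsw*(mu1-mu2)' in the script points along the same direction (up to sign and scale) as the eigenvector of inv(SW)*SB that belongs to the non-zero eigenvalue, so either form gives the same projection. A minimal check, assuming the script above has just been run (v2c is an illustrative name, not in the original code):

v2c = invsw*(mu1-mu2)';          % two-class LDA direction
v2c = v2c/norm(v2c);             % normalize so it can be compared with evec(:,2)
disp([v2c evec(:,2)])            % the two columns agree up to sign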

Output:

c1 =

     1     2
     2     3
     3     3
     4     5
     5     5

c2 =

     1     0
     2     1
     3     1
     3     2
     5     3
     6     5

n1 =

     5

n2 =

     6

mu1 =

    3.0000    3.6000

mu2 =

    3.3333    2.0000

mu =

    3.1667    2.8000

d1 =

   -2.0000   -1.6000
   -1.0000   -0.6000
         0   -0.6000
    1.0000    1.4000
    2.0000    1.4000

d2 =

   -2.3333   -2.0000
   -1.3333   -1.0000
   -0.3333   -1.0000
   -0.3333         0
    1.6667    1.0000
    2.6667    3.0000

s1 =

   10.0000    8.0000
    8.0000    7.2000

s2 =

   17.3333   16.0000
   16.0000   16.0000

sw =

   27.3333   24.0000
   24.0000   23.2000

invsw =

    0.3991   -0.4128
   -0.4128    0.4702

sb1 =

    0.1389   -0.6667
   -0.6667    3.2000

sb2 =

    0.1667   -0.8000
   -0.8000    3.8400

SB =

    0.3056   -1.4667
   -1.4667    7.0400

v =

    0.7274   -3.4917
   -0.8157    3.9156

evec =

   -0.9790    0.6656
   -0.2040   -0.7463

eval =

         0         0
         0    4.6430

y2 =

    0.7274   -3.4917
    0.6391   -3.0679
    1.3666   -6.5596
    0.5508   -2.6440
    1.1900   -5.7119
    0.2859   -1.3725

y1 =

   -0.9041    4.3394
   -0.9924    4.7633
   -0.2649    1.2716
   -1.1690    5.6110
   -0.4415    2.1193

My understanding:

Sorting the eigenvalues in descending order, the corresponding eigenvectors are:

w1: evec(:,2)   (the larger eigenvalue, 4.6430)

w2: evec(:,1)   (eigenvalue 0)

Projecting both classes onto all the eigenvectors:

z1=c1*evec

z2=c2*evec

z1 =

   -1.3869   -0.8271
   -2.5698   -0.9079
   -3.5488   -0.2424
   -4.9357   -1.0695
   -5.9147   -0.4040

z2 =

   -0.9790    0.6656
   -2.1619    0.5848
   -3.1409    1.2503
   -3.3448    0.5040
   -5.5068    1.0887
   -6.8937    0.2616

Along the dominant eigenvector w1 = evec(:,2), the class-1 projections z1(:,2) are all negative, while the class-2 projections z2(:,2) are all positive, so the two classes are cleanly separated by this single direction.
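
To turn this separation into a classifier, a new sample can be projected onto the dominant eigenvector and assigned to the class whose projected mean is closer. A minimal sketch, assuming the variables from the script are still in the workspace (x_new and the nearest-projected-mean rule are illustrative additions, not part of the original post):

w1 = evec(:,2);                  % eigenvector with the largest eigenvalue
m1 = mu1*w1;                     % projected mean of class 1
m2 = mu2*w1;                     % projected mean of class 2
x_new = [4 4];                   % a hypothetical new observation
p = x_new*w1;                    % its projection onto the LDA direction
if abs(p-m1) < abs(p-m2)
    disp('x_new assigned to class 1')
else
    disp('x_new assigned to class 2')
end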