Principal component analysis
2010-03-29 22:39
function [finalData] = pca2(t, n)
%PCA2 Project the data set t onto its first n principal components.
%   t         - original data matrix (observations in rows, features in columns)
%   n         - number of principal components to keep
%   finalData - n-by-(number of observations) matrix of projected data
temp = t;
result = cov(temp);          % covariance matrix of the columns of temp
[vect, value] = eig(result); % eigenvectors (columns) and eigenvalues (diagonal)
eigenvals = diag(value)';    % eigenvalues as a row vector, matching vect's columns
e_length = length(eigenvals);% number of eigenvalues
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% The following loops record, in e_sub, the column indices of the
% eigenvectors in descending order of their eigenvalues (selection sort).
e_sub = zeros(1, e_length);
for j = 1:e_length
    val = -inf;
    for i = 1:e_length
        if val < eigenvals(i)
            val = eigenvals(i);
            e_sub(j) = i;
        end
    end
    eigenvals(e_sub(j)) = -inf; % exclude this eigenvalue from later passes
end
% Keep the eigenvectors belonging to the n largest eigenvalues.
featureVector = vect(:, e_sub(1:n));
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Subtract each column's mean so the data is centered before projection.
[rows, cols] = size(temp);
meanVector = temp;
for i = 1:cols
    colMean = sum(meanVector(:, i)) / rows; % do not shadow the builtin mean
    for j = 1:rows
        meanVector(j, i) = meanVector(j, i) - colMean;
    end
end
% Project the centered data onto the selected components.
finalData = featureVector' * meanVector';
end
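The function above can be cross-checked with a short NumPy translation of the same steps (a sketch added for illustration, not part of the original post). `np.linalg.eigh` plays the role of MATLAB's `eig` on the symmetric covariance matrix, and `np.argsort` replaces the hand-written selection sort of eigenvalues:

```python
import numpy as np

def pca2(t, n):
    """Project data t (observations x features) onto its n leading
    principal components, mirroring the MATLAB function above."""
    cov = np.cov(t, rowvar=False)           # covariance of the columns
    eigvals, eigvecs = np.linalg.eigh(cov)  # ascending order for symmetric input
    order = np.argsort(eigvals)[::-1]       # indices in descending eigenvalue order
    feature = eigvecs[:, order[:n]]         # top-n eigenvectors as columns
    centered = t - t.mean(axis=0)           # subtract column means
    return feature.T @ centered.T           # n x (number of observations)

rng = np.random.default_rng(0)
data = rng.normal(size=(42, 5))             # 42 observations, 5 features
proj = pca2(data, 2)
print(proj.shape)                           # (2, 42)
```

Note that `np.cov` (like MATLAB's `cov`) centers the data internally, so the explicit mean subtraction is only needed for the final projection step.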