
MATLAB: PCA dimensionality reduction

Reference blog:

https://www.cnblogs.com/kailugaji/p/11594507.html

Thanks to the blogger 凱魯嘎吉.

For the theory, see this Bilibili video: https://www.bilibili.com/video/BV1j7411t7NT?from=search&seid=3237724714365185604

Thanks to the uploader 厚道仁心.

Function: pcafeatures.m

function [featuresTrainpca,featuresTestpca]=pcafeatures(featuresTrain,featuresTest,yuzhi)
% PCAFEATURES Reduce the dimensionality of train/test features with PCA,
% keeping the smallest number of components whose cumulative contribution
% (explained variance, in percent) exceeds the threshold yuzhi (e.g. 95).
% load featuresTrain;
% load featuresTest;
numTrain=size(featuresTrain,1); % rows = number of training samples
numTest=size(featuresTest,1);   % rows = number of test samples
features=[featuresTrain;featuresTest]; % stack so both sets share one projection

[coeff,score,latent,tsquare] = pca(features); % latent: variance of each component, descending

% yuzhi=95;
leijigongxiandu=0; % cumulative contribution, in percent
i=0;
while (leijigongxiandu<=yuzhi)
    i=i+1; % increment first so i ends at the number of components kept
    leijigongxiandu=leijigongxiandu+(latent(i)/sum(latent))*100;
end
disp(i) % number of principal components retained
% Project onto the first i principal axes. Note pca() centers the data
% internally, so this differs from score(:,1:i) by a constant row offset,
% which is harmless for train/test classification.
featurespca=features*coeff(:,1:i);
featuresTrainpca=featurespca(1:numTrain,:);
featuresTestpca=featurespca(numTrain+1:end,:);
% save featuresTrainpca featuresTrainpca;
% save featuresTestpca featuresTestpca;
end
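The same component-selection logic (keep the smallest k whose cumulative explained variance reaches the threshold) can be sketched in Python with NumPy. This is an illustrative re-implementation, not part of the original post; the function name `pca_features` and the centering step are my assumptions, and unlike the MATLAB version it projects the centered data.

```python
import numpy as np

def pca_features(features_train, features_test, threshold=95.0):
    """Project train/test features onto the smallest number of principal
    components whose cumulative contribution reaches `threshold` percent.
    Mirrors the selection logic of pcafeatures.m (illustrative sketch)."""
    X = np.vstack([features_train, features_test])
    Xc = X - X.mean(axis=0)                  # center, as MATLAB's pca() does
    # Eigendecomposition of the covariance matrix; eigenvalues = latent
    latent, coeff = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(latent)[::-1]         # sort by variance, descending
    latent, coeff = latent[order], coeff[:, order]
    # Smallest k whose cumulative contribution (in percent) reaches threshold
    contrib = 100.0 * np.cumsum(latent) / latent.sum()
    k = int(np.searchsorted(contrib, threshold) + 1)
    scores = Xc @ coeff[:, :k]               # project onto the first k axes
    n_train = features_train.shape[0]
    return scores[:n_train], scores[n_train:], k
```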
           

Calling statement:

[featuresTrainpca,featuresTestpca]=pcafeatures(featuresTrain,featuresTest,95);% 95 is the cumulative contribution threshold
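One caveat worth noting: pcafeatures fits PCA on the train and test sets combined, so test-set statistics influence the projection. A common alternative fits the projection on the training set only and reuses it for the test set. A minimal NumPy sketch of that variant (the names `fit_pca`, `mu`, `W`, and the random placeholder data are my own, not from the original post):

```python
import numpy as np

def fit_pca(X, threshold=95.0):
    """Fit a PCA projection on X, keeping the smallest number of
    components whose cumulative contribution reaches `threshold` percent."""
    mu = X.mean(axis=0)
    latent, coeff = np.linalg.eigh(np.cov(X - mu, rowvar=False))
    order = np.argsort(latent)[::-1]                 # descending variance
    latent, coeff = latent[order], coeff[:, order]
    contrib = 100.0 * np.cumsum(latent) / latent.sum()
    k = int(np.searchsorted(contrib, threshold) + 1)
    return mu, coeff[:, :k]                          # mean and projection

rng = np.random.default_rng(1)
Xtr = rng.normal(size=(100, 20))  # placeholder training features
Xte = rng.normal(size=(30, 20))   # placeholder test features

mu, W = fit_pca(Xtr)              # fit on the training set only
Xtr_pca = (Xtr - mu) @ W
Xte_pca = (Xte - mu) @ W          # test set never influences mu or W
```

Whether this matters depends on the application: fitting on the combined data gives both sets a common basis, but leaks test statistics into the model selection.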
           
