function [model,Z]=greedykpca(X,options)
% GREEDYKPCA Greedy Kernel Principal Component Analysis.
%
% Synopsis:
%  [model,Z] = greedykpca(X)
%  [model,Z] = greedykpca(X,options)
%
% Description:
%  This function implements a greedy version of the kernel PCA
%  algorithm [Franc03b]. The input data X are first approximated by
%  greedyappx.m and then ordinary PCA is applied to the approximated
%  data. This algorithm has the same objective as ordinary Kernel PCA,
%  with an extra condition imposed on the maximal number of data points
%  in the resulting kernel projection (expansion). Greedy KPCA is
%  useful when a sparse kernel projection is desired and when the
%  input data set is large.
%
% Input:
%  X [dim x num_data] Input data.
%
%  options [struct] Control parameters:
%   .ker [string] Kernel identifier. See 'help kernel' for more info.
%   .arg [1 x narg] Kernel argument.
%   .new_dim [1x1] Output dimension of the kernel projection.
%   .m [1x1] Maximal number of base vectors (default m=0.25*num_data).
%   .p [1x1] Depth of search for the best basis vector (default p=m).
%   .mserr [1x1] Desired mean squared reconstruction error.
%   .maxerr [1x1] Desired maximal reconstruction error.
%    See 'help greedyappx' for more info about the stopping conditions.
%   .verb [1x1] If 1 then some info is displayed (default 0).
%
% Output:
%  model [struct] Kernel projection:
%   .Alpha [nsv x new_dim] Multipliers defining the kernel projection.
%   .b [new_dim x 1] Bias of the kernel projection.
%   .sv.X [dim x nsv] Selected subset of the training vectors.
%   .nsv [1x1] Number of basis vectors.
%   .kercnt [1x1] Number of kernel evaluations.
%   .options [struct] Copy of the used options.
%   .MaxErr [1 x nsv] Maximal reconstruction error for the corresponding
%    number of base vectors.
%   .MsErr [1 x nsv] Mean squared reconstruction error for the
%    corresponding number of base vectors.
%
%  Z [m x num_data] Training data projected by the found kernel projection.
%
% Example:
%  X = gencircledata([1;1],5,250,1);
%  model = greedykpca(X,struct('ker','rbf','arg',4,'new_dim',2));
%  XR = kpcarec(X,model);
%  figure;
%  ppatterns(X);
%  ppatterns(XR,'+r');
%  ppatterns(model.sv.X,'ob',12);
%
% See also
%  KERNELPROJ, KPCA, GREEDYAPPX.
%

% About: Statistical Pattern Recognition Toolbox
% (C) 1999-2003, Written by Vojtech Franc and Vaclav Hlavac
% <a href="http://www.cvut.cz">Czech Technical University Prague</a>
% <a href="http://www.feld.cvut.cz">Faculty of Electrical Engineering</a>
% <a href="http://cmp.felk.cvut.cz">Center for Machine Perception</a>

% Modifications:
% 10-jun-2004, VF
% 05-may-2004, VF
% 14-mar-2004, VF

start_time = cputime;

[dim,num_data] = size(X);

% process input arguments
%------------------------------------
if nargin < 2, options = []; else options = c2s(options); end
if ~isfield(options,'ker'), options.ker = 'linear'; end
if ~isfield(options,'arg'), options.arg = 1; end
if ~isfield(options,'m'), options.m = fix(0.25*num_data); end
if ~isfield(options,'p'), options.p = options.m; end
if ~isfield(options,'maxerr'), options.maxerr = 1e-6; end
if ~isfield(options,'mserr'), options.mserr = 1e-6; end
% default added here: the original code read options.new_dim below
% without ever setting a default, which errors when the caller omits it
if ~isfield(options,'new_dim'), options.new_dim = dim; end
if ~isfield(options,'verb'), options.verb = 0; end

% greedy algorithm to select a subset of the training data
%-------------------------------------------------------
[inx,Alpha,Z,kercnt,MsErr,MaxErr] = ...
    greedyappx(X,options.ker,options.arg,...
    options.m,options.p,options.mserr,options.maxerr,options.verb);

% apply ordinary PCA on the approximated data
%------------------------------
mu = sum(Z,2)/num_data;
Z = Z - mu*ones(1,num_data);

S = Z*Z';
[U,D,V] = svd(S);

model.eigval = diag(D);
sum_eig = triu(ones(size(Z,1),size(Z,1)),1)*model.eigval;
model.MsErr = MsErr(end) + sum_eig/num_data;

options.new_dim = min([options.new_dim,size(Z,1)]);

V = V(:,1:options.new_dim);

% fill up the output model
%-------------------------------------
model.Alpha = Alpha'*V;
model.nsv = length(inx);
model.b = -V'*mu;
model.sv.X = X(:,inx);
model.sv.inx = inx;
model.kercnt = kercnt;
model.GreedyMaxErr = MaxErr;
model.GreedyMsErr = MsErr;
model.options = options;
model.cputime = cputime - start_time;
model.fun = 'kernelproj';

return;
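The PCA step above (center the approximated data Z, eigendecompose the scatter matrix Z*Z' via SVD, and keep the leading eigenvectors) can be sketched in numpy as follows. This is an illustrative sketch only, not part of the toolbox; the function name `pca_step` and the toy data are made up, and the kernel/greedy-approximation machinery is omitted.

```python
import numpy as np

def pca_step(Z, new_dim):
    """Mirror the PCA step of greedykpca.m (illustrative sketch):
    center Z [m x num_data], SVD the m x m scatter matrix, keep the
    leading new_dim eigenvectors, and form the projection bias."""
    m, num_data = Z.shape
    mu = Z.sum(axis=1, keepdims=True) / num_data  # mean over data points
    Zc = Z - mu                                   # centered data
    S = Zc @ Zc.T                                 # m x m scatter matrix
    U, d, _ = np.linalg.svd(S)                    # S symmetric PSD: columns
                                                  # of U are its eigenvectors
    new_dim = min(new_dim, m)                     # as in greedykpca.m
    V = U[:, :new_dim]                            # leading eigenvectors
    b = -V.T @ mu                                 # bias of the projection
    return V, b, d                                # basis, bias, eigenvalues

# toy usage: project 2-D data onto its first principal direction
Z = np.array([[1.0, 2.0, 3.0, 4.0],
              [1.1, 1.9, 3.2, 3.8]])
V, b, eigval = pca_step(Z, 1)
proj = V.T @ Z + b  # 1 x 4 projected coordinates, zero-mean by construction
```

Because the projection subtracts the mean via the bias `b`, the projected coordinates average to zero, which matches how `model.b = -V'*mu` is used by `kernelproj` in the MATLAB code.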