<html><head> <meta HTTP-EQUIV="Content-Type" CONTENT="text/html;charset=ISO-8859-1"> <title>greedykpca.m</title><link rel="stylesheet" type="text/css" href="../../../m-syntax.css"></head><body><code><span class=defun_kw>function</span> <span class=defun_out>[model,Z]</span>=<span class=defun_name>greedykpca</span>(<span class=defun_in>X,options</span>)<br><span class=h1>% GREEDYKPCA Greedy Kernel Principal Component Analysis.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Synopsis:</span></span><br><span class=help>% [model,Z] = greedykpca(X)</span><br><span class=help>% [model,Z] = greedykpca(X,options)</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Description:</span></span><br><span class=help>% This function implements a greedy version of the kernel PCA </span><br><span class=help>% algorithm [Franc03b]. The input data X are first approximated by </span><br><span class=help>% greedyappx.m and then ordinary PCA is applied to the </span><br><span class=help>% approximated data. This algorithm has the same objective as </span><br><span class=help>% ordinary Kernel PCA, with an extra condition imposed on the </span><br><span class=help>% maximal number of data in the resulting kernel projection </span><br><span class=help>% (expansion). Greedy KPCA is useful when a sparse kernel projection </span><br><span class=help>% is desired and when the input data set is large.</span><br><span class=help>% </span><br><span class=help>% <span class=help_field>Input:</span></span><br><span class=help>% X [dim x num_data] Input data.</span><br><span class=help>% </span><br><span class=help>% options [struct] Control parameters:</span><br><span class=help>% .ker [string] Kernel identifier. 
See 'help kernel' for more info.</span><br><span class=help>% .arg [1 x narg] Kernel argument.</span><br><span class=help>% .m [1x1] Maximal number of base vectors (default m=0.25*num_data).</span><br><span class=help>% .p [1x1] Depth of search for the best basis vector (default p=m).</span><br><span class=help>% .mserr [1x1] Desired mean squared reconstruction error.</span><br><span class=help>% .maxerr [1x1] Desired maximal reconstruction error.</span><br><span class=help>% See 'help greedyappx' for more info about the stopping conditions.</span><br><span class=help>% .new_dim [1x1] Output dimension, i.e., the number of principal</span><br><span class=help>% components extracted (no default; must be given).</span><br><span class=help>% .verb [1x1] If 1 then some info is displayed (default 0).</span><br><span class=help>% </span><br><span class=help>% <span class=help_field>Output:</span></span><br><span class=help>% model [struct] Kernel projection:</span><br><span class=help>% .Alpha [nsv x new_dim] Multipliers defining the kernel projection.</span><br><span class=help>% .b [new_dim x 1] Bias of the kernel projection.</span><br><span class=help>% .sv.X [dim x nsv] Selected subset of the training vectors.</span><br><span class=help>% .nsv [1x1] Number of basis vectors.</span><br><span class=help>% .kercnt [1x1] Number of kernel evaluations.</span><br><span class=help>% .options [struct] Copy of the used options.</span><br><span class=help>% .GreedyMaxErr [1 x nsv] Maximal reconstruction error for the corresponding</span><br><span class=help>% number of base vectors.</span><br><span class=help>% .GreedyMsErr [1 x nsv] Mean square reconstruction error for the corresponding</span><br><span class=help>% number of base vectors.</span><br><span class=help>%</span><br><span class=help>% Z [m x num_data] Training data projected by the found kernel projection.</span><br><span class=help>% </span><br><span class=help>% <span class=help_field>Example:</span></span><br><span class=help>% X = gencircledata([1;1],5,250,1);</span><br><span class=help>% model = greedykpca(X,struct('ker','rbf','arg',4,'new_dim',2));</span><br><span class=help>% XR = kpcarec(X,model); 
</span><br><span class=help>% figure; </span><br><span class=help>% ppatterns(X); ppatterns(XR,'+r');</span><br><span class=help>% ppatterns(model.sv.X,'ob',12);</span><br><span class=help>%</span><br><span class=help>% See also </span><br><span class=help>% KERNELPROJ, KPCA, GREEDYAPPX.</span><br><span class=help>%</span><br><hr><span class=help1>% <span class=help1_field>About:</span> Statistical Pattern Recognition Toolbox</span><br><span class=help1>% (C) 1999-2003, Written by Vojtech Franc and Vaclav Hlavac</span><br><span class=help1>% <a href="http://www.cvut.cz">Czech Technical University Prague</a></span><br><span class=help1>% <a href="http://www.feld.cvut.cz">Faculty of Electrical Engineering</a></span><br><span class=help1>% <a href="http://cmp.felk.cvut.cz">Center for Machine Perception</a></span><br><br><span class=help1>% <span class=help1_field>Modifications:</span></span><br><span class=help1>% 10-jun-2004, VF</span><br><span class=help1>% 05-may-2004, VF</span><br><span class=help1>% 14-mar-2004, VF</span><br><br><hr>start_time = cputime;<br>[dim,num_data]=size(X);<br><br><span class=comment>% process input arguments</span><br><span class=comment>%------------------------------------</span><br><span class=keyword>if</span> <span class=stack>nargin</span> < 2, options = []; <span class=keyword>else</span> options=c2s(options); <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'ker'</span>), options.ker = <span class=quotes>'linear'</span>; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'arg'</span>), options.arg = 1; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'m'</span>), options.m = fix(0.25*num_data); <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'p'</span>), options.p = options.m; <span class=keyword>end</span><br><span 
class=keyword>if</span> ~isfield(options,<span class=quotes>'maxerr'</span>), options.maxerr = 1e-6; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'mserr'</span>), options.mserr = 1e-6; <span class=keyword>end</span>
<br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'verb'</span>), options.verb = 0; <span class=keyword>end</span><br><br><span class=comment>% greedy algorithm to select subset of training data</span><br><span class=comment>%-------------------------------------------------------</span><br>[inx,Alpha,Z,kercnt,MsErr,MaxErr] = ...<br> greedyappx(X,options.ker,options.arg,...<br> options.m,options.p,options.mserr,options.maxerr,options.verb); <br> <br><span class=comment>% apply ordinary PCA</span><br><span class=comment>%------------------------------</span><br>mu = sum(Z,2)/num_data;<br>Z=Z-mu*ones(1,num_data);<br><br>S = Z*Z';<br>[U,D,V]=svd(S);<br><br>model.eigval=diag(D);<br>sum_eig = triu(ones(size(Z,1),size(Z,1)),1)*model.eigval;<br>model.MsErr = MsErr(<span class=keyword>end</span>)+sum_eig/num_data;<br><br>options.new_dim = min([options.new_dim,size(Z,1)]);<br><br>V = V(:,1:options.new_dim);<br><br><span class=comment>% fill up the output model</span><br><span class=comment>%-------------------------------------</span><br>model.Alpha = Alpha'*V;<br>model.nsv = length(inx); <br>model.b = -V'*mu;<br>model.sv.X= X(:,inx);<br>model.sv.inx = inx;<br>model.kercnt = kercnt;<br>model.GreedyMaxErr = MaxErr;<br>model.GreedyMsErr = MsErr;<br>model.options = options;<br>model.cputime = cputime - start_time;<br>model.fun = <span class=quotes>'kernelproj'</span>;<br><br><span class=jump>return</span>;<br><br><br></code>
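The two-step construction described in the help text above — greedy selection of a small basis in feature space (the role of greedyappx) followed by ordinary PCA on the coordinates of the approximated data — can be sketched outside MATLAB as well. The sketch below is illustrative only, not the toolbox API: it assumes an RBF kernel, replaces the depth-p search with the simplest pivot rule (always take the currently worst-approximated point), and all function names (`rbf_kernel`, `greedy_kpca`) are made up for this example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.25):
    """K[i, j] = exp(-gamma * ||A[:, i] - B[:, j]||^2); columns are samples."""
    d2 = (A**2).sum(0)[:, None] + (B**2).sum(0)[None, :] - 2.0 * (A.T @ B)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def greedy_kpca(X, m=10, new_dim=2, gamma=0.25):
    """Greedy KPCA sketch: (1) greedily pick m basis points via incomplete
    Cholesky with pivoting on the currently worst-approximated point,
    (2) run ordinary (linear) PCA on the resulting coordinates."""
    dim, n = X.shape
    err = np.ones(n)              # k(x, x) = 1 for the RBF kernel
    Z = np.zeros((0, n))          # Gram-Schmidt coordinates, one row per basis vector
    inx, ms_err = [], []
    for _ in range(m):
        i = int(np.argmax(err))   # simplified pivot rule (no depth-p search)
        k = rbf_kernel(X, X[:, [i]], gamma)[:, 0]
        z = (k - Z.T @ Z[:, i]) / np.sqrt(err[i])  # new orthonormal coordinate
        Z = np.vstack([Z, z])
        err = np.maximum(err - z**2, 0.0)          # reconstruction errors only shrink
        inx.append(i)
        ms_err.append(err.mean())
    # step 2: ordinary PCA on the m x n coordinate matrix Z
    mu = Z.mean(axis=1, keepdims=True)
    Zc = Z - mu                   # center within the span of the selected basis
    U, s, _ = np.linalg.svd(Zc @ Zc.T)
    V = U[:, :min(new_dim, Z.shape[0])]
    proj = V.T @ Zc               # [new_dim x n] projected training data
    return inx, proj, np.array(ms_err)
```

As in the MATLAB code, the expensive kernel eigenproblem is never formed on all num_data points: the SVD runs on an m x m matrix, so the cost of the PCA step depends only on the basis size m, and the mean squared reconstruction error is non-increasing as basis vectors are added.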
</body></html>