<html><head> <meta HTTP-EQUIV="Content-Type" CONTENT="text/html;charset=ISO-8859-1"> <title>kpca.m</title><link rel="stylesheet" type="text/css" href="../../../m-syntax.css"></head><body><code><span class=defun_kw>function</span> <span class=defun_out>model </span>= <span class=defun_name>kpca</span>(<span class=defun_in>X,options</span>)<br><span class=h1>% KPCA Kernel Principal Component Analysis.</span><br><span class=help>% </span><br><span class=help>% <span class=help_field>Synopsis:</span></span><br><span class=help>% model = kpca(X)</span><br><span class=help>% model = kpca(X,options)</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Description:</span></span><br><span class=help>% This function is an implementation of Kernel Principal </span><br><span class=help>% Component Analysis (KPCA) [Schol98b]. The input data X are</span><br><span class=help>% non-linearly mapped to a new high-dimensional space induced</span><br><span class=help>% by a prescribed kernel function. PCA is then applied to the </span><br><span class=help>% non-linearly mapped data. 
The result is a model describing the non-linear data projection.</span><br><span class=help>% See 'help kernelproj' for information on how to project data.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Input:</span></span><br><span class=help>% X [dim x num_data] Training data.</span><br><span class=help>% </span><br><span class=help>% options [struct] Describes the kernel and output dimension:</span><br><span class=help>% .ker [string] Kernel identifier (see 'help kernel'); </span><br><span class=help>% (default 'linear').</span><br><span class=help>% .arg [1 x narg] Kernel argument; (default 1).</span><br><span class=help>% .new_dim [1x1] Output dimension (number of used principal </span><br><span class=help>% components); (default dim).</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Output:</span></span><br><span class=help>% model [struct] Kernel projection:</span><br><span class=help>% .Alpha [num_data x new_dim] Multipliers.</span><br><span class=help>% .b [new_dim x 1] Bias.</span><br><span class=help>% .sv.X [dim x num_data] Training vectors.</span><br><span class=help>% </span><br><span class=help>% .nsv [1x1] Number of training data.</span><br><span class=help>% .eigval [1 x num_data] Eigenvalues of the centered kernel matrix.</span><br><span class=help>% .mse [1x1] Mean square representation error of mapped data.</span><br><span class=help>% .MsErr [num_data x 1] MSE with respect to used basis vectors;</span><br><span class=help>% mse=MsErr(new_dim).</span><br><span class=help>% .kercnt [1x1] Number of used kernel evaluations.</span><br><span class=help>% .options [struct] Copy of used options.</span><br><span class=help>% .cputime [1x1] CPU time used for training.</span><br><span class=help>%</span><br><span class=help>% <span class=help_field>Example:</span></span><br><span class=help>% X = gencircledata([1;1],5,250,1);</span><br><span class=help>% model = kpca( X, 
struct('ker','rbf','arg',4,'new_dim',2));</span><br><span class=help>% XR = kpcarec( X, model );</span><br><span class=help>% figure; </span><br><span class=help>% ppatterns( X ); ppatterns( XR, '+r' );</span><br><span class=help>% </span><br><span class=help>% See also </span><br><span class=help>% KERNELPROJ, PCA, GDA.</span><br><span class=help>% </span><br><hr><span class=help1>% <span class=help1_field>About:</span> Statistical Pattern Recognition Toolbox</span><br><span class=help1>% (C) 1999-2003, Written by Vojtech Franc and Vaclav Hlavac</span><br><span class=help1>% <a href="http://www.cvut.cz">Czech Technical University Prague</a></span><br><span class=help1>% <a href="http://www.feld.cvut.cz">Faculty of Electrical Engineering</a></span><br><span class=help1>% <a href="http://cmp.felk.cvut.cz">Center for Machine Perception</a></span><br><br><span class=help1>% <span class=help1_field>Modifications:</span></span><br><span class=help1>% 4-may-2004, VF</span><br><span class=help1>% 10-july-2003, VF, computation of kercnt added</span><br><span class=help1>% 22-jan-2003, VF</span><br><span class=help1>% 11-july-2002, VF, mistake "Jt=zeros(N,L)/N" repaired </span><br><span class=help1>% (reported by SH_Srinivasan@Satyam.com).</span><br><span class=help1>% 5-July-2001, V.Franc, comments changed</span><br><span class=help1>% 20-dec-2000, V.Franc, algorithm was implemented</span><br><br><hr><span class=comment>% timer</span><br>start_time = cputime;<br><br><span class=comment>% get dimensions</span><br>[dim,num_data] = size(X); <br><br><span class=comment>% process input arguments</span><br><span class=comment>%-----------------------------------</span><br><span class=keyword>if</span> <span class=stack>nargin</span> &lt; 2, options = []; <span class=keyword>else</span> options=c2s(options); <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'ker'</span>), options.ker = <span class=quotes>'linear'</span>; <span 
class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'arg'</span>), options.arg = 1; <span class=keyword>end</span><br><span class=keyword>if</span> ~isfield(options,<span class=quotes>'new_dim'</span>), options.new_dim = dim; <span class=keyword>end</span><br><br><span class=comment>% compute kernel matrix</span><br>K = kernel(X,options.ker,options.arg);<br><br><span class=comment>% Centering the kernel matrix (non-linearly mapped data).</span><br>J = ones(num_data,num_data)/num_data;<br>Kc = K - J*K - K*J + J*K*J;<br><br><span class=comment>% eigendecomposition of the centered kernel matrix</span><br>[U,D] = eig(Kc);<br>Lambda=real(diag(D));<br><br><span class=comment>% normalize the eigenvectors to be orthonormal </span><br><span class=keyword>for</span> k = 1:num_data,<br> <span class=keyword>if</span> Lambda(k) ~= 0,<br> U(:,k)=U(:,k)/sqrt(Lambda(k));<br> <span class=keyword>end</span><br><span class=keyword>end</span><br><br><span class=comment>% Sort the eigenvalues and the eigenvectors in descending order.</span><br>[Lambda,ordered]=sort(-Lambda); <br>Lambda=-Lambda;<br>U=U(:,ordered); <br><br><span class=comment>% use the first new_dim principal components</span><br>A=U(:,1:options.new_dim); <br><br><span class=comment>% compute Alpha and the bias (implicit centering</span><br><span class=comment>% of the kernel projection)</span><br>model.Alpha = (eye(num_data,num_data)-J)*A;<br>Jt=ones(num_data,1)/num_data;<br>model.b = A'*(J'*K*Jt-K*Jt);<br><br><span class=comment>% fill output structure</span><br>model.sv.X = X;<br>model.nsv = num_data;<br>model.options = options;<br>model.eigval = Lambda;<br>model.kercnt = num_data*(num_data+1)/2;<br>model.MsErr = triu(ones(num_data,num_data),1)*model.eigval/num_data;<br>model.mse = model.MsErr(options.new_dim);<br>model.cputime = cputime - start_time;<br>model.fun = <span class=quotes>'kernelproj'</span>;<br><br><span class=jump>return</span>;<br></code>
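<p>For readers porting this routine outside MATLAB, here is a minimal NumPy sketch of the same pipeline: kernel matrix, double centering, eigendecomposition, eigenvector scaling, and the Alpha/bias pair used for projection. This is an illustrative re-implementation, not part of the toolbox; the function name <code>kpca_sketch</code> is invented, the RBF kernel is hard-coded (the toolbox's generic <code>kernel</code> and <code>kernelproj</code> helpers are not reproduced), and the extra outputs (<code>mse</code>, <code>kercnt</code>, etc.) are omitted.</p>

```python
import numpy as np

def kpca_sketch(X, new_dim=2, gamma=1.0):
    """Kernel PCA with an RBF kernel, mirroring kpca.m (columns of X are samples)."""
    n = X.shape[1]
    # RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=0)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X.T @ X))
    # double centering of the kernel matrix: Kc = K - J*K - K*J + J*K*J, J = ones/n
    J = np.full((n, n), 1.0 / n)
    Kc = K - J @ K - K @ J + J @ K @ J
    # eigendecomposition of the symmetric centered kernel matrix
    lam, U = np.linalg.eigh(Kc)
    order = np.argsort(lam)[::-1]          # descending order, as in kpca.m
    lam, U = lam[order], U[:, order]
    # scale eigenvectors so the feature-space principal directions are orthonormal
    pos = lam > 1e-12
    U[:, pos] /= np.sqrt(lam[pos])
    A = U[:, :new_dim]
    # multipliers and bias of the implicitly centered projection, as in kpca.m
    jt = np.full(n, 1.0 / n)
    Alpha = (np.eye(n) - J) @ A
    b = A.T @ (J @ K @ jt - K @ jt)
    return Alpha, b, lam
```

<p>A quick consistency check: projecting the training data as <code>Z = Alpha.T @ K + b[:, None]</code> reproduces <code>A.T @ Kc</code>, so the projected training samples have zero mean, which is exactly what the implicit-centering bias in <code>kpca.m</code> guarantees.</p>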