%NMC Nearest Mean Classifier
%
%   W = NMC(A)
%   W = A*NMC
%
% INPUT
%   A   Dataset
%
% OUTPUT
%   W   Nearest Mean Classifier
%
% DESCRIPTION
% Computation of the nearest mean classifier between the classes in the
% dataset A. The use of soft labels is supported. Prior probabilities are
% not used.
%
% The difference with NMSC is that NMSC is based on an assumption of normal
% distributions and thereby automatically scales the features and is
% sensitive to class priors. NMC is a plain nearest mean classifier that
% is sensitive to feature scaling and insensitive to class priors.
%
% SEE ALSO
% DATASETS, MAPPINGS, NMSC, LDC, FISHERC, QDC, UDC

% Copyright: R.P.W. Duin, r.p.w.duin@prtools.org
% Faculty EWI, Delft University of Technology
% P.O. Box 5031, 2600 GA Delft, The Netherlands

% $Id: nmc.m,v 1.10 2007/11/30 16:29:49 duin Exp $

function W = nmc(a)

	prtrace(mfilename);

	if nargin < 1 || isempty(a)
		% No data given: return an untrained mapping
		W = mapping(mfilename);
		W = setname(W,'Nearest Mean');
		return
	end

	islabtype(a,'crisp','soft');
	isvaldfile(a,1,2); % at least 1 object per class, 2 classes
	[m,k,c] = getsize(a);

	if c == 2   % 2-class case: store linear classifier
		u = meancov(a);
		u1 = +u(1,:);
		u2 = +u(2,:);
		R = (u1-u2)';
		offset = (u2*u2' - u1*u1')/2;
		W = affine([R -R],[offset -offset],a,getlablist(a));
		W = cnormc(W,a);
		W = setname(W,'Nearest Mean');
	else
		if all(classsizes(a) > 1)
			a = setprior(a,0); % NMC should be independent of priors: make them equal
			p = getprior(a);
			U = zeros(c,k);
			V = zeros(c,k);
			for j=1:c
				b = seldat(a,j);
				U(j,:) = +mean(b);
				V(j,:) = +var(b);
			end
			%G = mean(V'*p') * eye(k);
			G = mean(V(:)); % single pooled variance: spherical covariance for all classes
			w.mean  = +U;
			w.cov   = G;
			w.prior = p;
			W = normal_map(w,getlablist(a),k,c);
			W = setname(W,'Nearest Mean');
			W = setcost(W,a);
		else
			% Singleton classes: fall back to 1-nearest-neighbour on the class means
			u = meancov(a);
			W = knnc(u,1);
			W = setname(W,'Nearest Mean');
		end
	end

	W = setcost(W,a);

return
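The 2-class branch stores the nearest-mean rule as a single affine map: with class means u1 and u2, the score x*(u1-u2)' + (u2*u2' - u1*u1')/2 is positive exactly when x is closer to u1 than to u2, because ||x-u2||^2 - ||x-u1||^2 = 2x*(u1-u2)' + u2*u2' - u1*u1', i.e. twice that score. A minimal NumPy sketch of this equivalence, independent of PRTools (function names `nmc_fit` and `nmc_predict` are illustrative, not part of any library):

```python
import numpy as np

def nmc_fit(X, y):
    """Nearest mean classifier: store one mean per class (no priors, no scaling)."""
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, means

def nmc_predict(X, classes, means):
    """Assign each sample to the class with the nearest mean (Euclidean distance)."""
    d = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]

# 2-class case: the nearest-mean rule equals the linear rule stored by nmc.m.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

classes, means = nmc_fit(X, y)
u1, u2 = means
R = u1 - u2                          # weight vector, as R in nmc.m
offset = (u2 @ u2 - u1 @ u1) / 2     # bias, as offset in nmc.m
linear = np.where(X @ R + offset > 0, classes[0], classes[1])

assert (linear == nmc_predict(X, classes, means)).all()
```

The sketch omits what PRTools adds on top: `cnormc` calibrates the affine output to posterior-like confidences, and the multiclass branch routes through `normal_map` with a single pooled spherical variance, which under the equalized priors again reduces to picking the nearest mean.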