function [Labels, scores]= osuSVMClass(Samples, Ns, AlphaY, SVs, Bias, Parameters)
% USAGE:
% [Labels, scores]= osuSVMClass(Samples, Ns, AlphaY, SVs, Bias, Parameters)
%
% DESCRIPTION:
% Classify a group of input patterns given a trained svm classifier.
%
% INPUTS:
% Samples: all the input patterns (a matrix whose columns are the pattern vectors)
% Ns: number of SVs for each class (a row vector). This parameter is valid only
% in the multi-class case; it is 0 for the 1-SVM and 2-class cases.
% AlphaY: the element-wise product Alpha .* Y, where Alpha holds the non-zero
% Lagrange coefficients and Y the corresponding {1 -1} labels.
% in multi-class case:
% [AlphaY_Class1, AlphaY_Class2, ..., AlphaY_ClassM]
% +----Ns(1)----+----Ns(2)-----+----+---Ns(M)------+
% SVs : support vectors, i.e. the patterns corresponding to the non-zero
% Alphas.
% in multi-class case:
% [SVs_Class1, SVs_Class2, ..., SVs_ClassM]
% +--Ns(1)---+---Ns(2)---+----+---Ns(M)---+
% Bias : the bias in the decision function, which is AlphaY*Kernel(SVs',x)-Bias.
% in multi-class case:
% [Bias_Class1, Bias_Class2, ..., Bias_ClassM]
% Parameters: the parameters required by the training algorithm.
% (a 10-element row vector)
% +-----------+--------+-------+-------------+---+----------+-------+
% |Kernel Type| Degree | Gamma | Coefficient | C |Cache size|epsilon|
% +-----------+--------+-------+-------------+---+----------+-------+
% +----------+-------------+----------------+
% | SVM Type | nu (nu-svm) | loss tolerance |
% +----------+-------------+----------------+
% where Kernel Type:
% 0 --- Linear
% 1 --- Polynomial: (Gamma*<X(:,i),X(:,j)>+Coefficient)^Degree
% 2 --- RBF: (exp(-Gamma*|X(:,i)-X(:,j)|^2))
% 3 --- Sigmoid: tanh(Gamma*<X(:,i),X(:,j)>+Coefficient)
% Gamma: if the input value is zero, Gamma defaults to
% 1/(max_pattern_dimension) inside the function; a non-zero
% input value is used unchanged.
% C: cost of the constraint violation (for C-SVC & C-SVR)
% Cache Size: size of the buffer holding the kernel values <X(:,i),X(:,j)> (in MB)
% epsilon: tolerance of termination criterion
% SVM Type:
% 0 --- c-SVM classifier
% 1 --- nu-SVM classifier
% 2 --- 1-SVM
% 3 --- c-SVM regression
% nu: the nu used in the nu-SVM classifier (for 1-SVM and nu-SVM)
% loss tolerance: the epsilon in epsilon-insensitive loss function
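% Example (illustrative values only, chosen for this sketch): a Parameters
% vector for an RBF-kernel c-SVM, following the field order documented above:
% % KType Deg Gamma Coef C Cache Eps SVMType nu LossTol
% Parameters = [ 2 0 0.5 0 1 40 0.001 0 0.5 0.1 ];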
%
% OUTPUTS
% Labels: the estimated class labels for the input patterns in Samples.
% In the multi-class case, a row vector with labels in {1, 2, ..., M}.
% In the 2-class case, a row vector; for c-SVM and nu-SVM the
% {1 -1} outputs are remapped to {1 2}.
% scores: in the multi-class case, the decision function output for each class
% (an M-row matrix; each row is the decision function of one class).
% In the 2-class case, the decision function output of the 2-class problem.
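% EXAMPLE (sketch; the training call below assumes a companion osuSVM
% trainer -- the name osuSVMTrain is hypothetical, substitute the training
% routine shipped with your copy of the toolbox, which must return
% AlphaY, SVs, Bias, Parameters, and Ns in the multi-class case):
% X = [randn(2,20), randn(2,20)+3]; % patterns as column vectors
% Y = [ones(1,20), 2*ones(1,20)]; % class labels
% [AlphaY, SVs, Bias, Parameters, Ns] = osuSVMTrain(X, Y); % hypothetical
% [Labels, scores] = osuSVMClass(X, Ns, AlphaY, SVs, Bias, Parameters);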
%
if nargin ~= 6
    disp(' Incorrect number of input variables.'); % disp does not expand \n
    help osuSVMClass;
else
    svDim = size(SVs, 1);
    [sampleDim, nSamples] = size(Samples);
    if sampleDim ~= svDim
        Samples = DimFit(Samples, svDim);
    end
    if Ns == 0 % the 1-SVM or 2-class case (c-SVM and nu-SVM)
        [Labels, scores] = SVMClass(Samples, AlphaY, SVs, Bias, Parameters);
        if (Parameters(8) == 0) || (Parameters(8) == 1)
            Labels = (-1*Labels)/2 + 1.5; % map labels from {1 -1} to {1 2}
        end
    else % the multi-class case
        M = length(Ns);
        scores = zeros(M, nSamples); % row i holds the decision values of class i
        pos = 1;
        for i = 1:M
            [~, scores(i,:)] = SVMClass(Samples, AlphaY(pos:pos+Ns(i)-1), ...
                SVs(:, pos:pos+Ns(i)-1), Bias(i), Parameters);
            pos = pos + Ns(i);
        end
        [~, Labels] = max(scores, [], 1); % pick the class with the largest score
    end
end