

svmtrain.m

Multi-class support vector machine based on error-correcting output codes (ECOC), with a self-built, completely random codebook; works well for 7-15 classes (a rough sketch of such a codebook follows below).
M
Page 1 of 2
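The ECOC wrapper mentioned in the description is not part of svmtrain.m itself, which trains a single binary SVM. As a rough illustration only, the sketch below shows one way a completely random +1/-1 codebook for K classes could be generated and decoded by minimum Hamming distance in Matlab; the variable names and the choice of K and L are hypothetical and not taken from the package.

% Sketch only (not from svmtrain.m): random ECOC codebook and decoding.
K = 10;                                  % number of classes (7-15 per the description)
L = 15;                                  % code length = number of binary SVMs
code = sign(rand(K, L) - 0.5);           % random +1/-1 codebook, one row per class
code(code==0) = 1;                       % guard against the unlikely exact zero
% Suppose OUT holds the signs of the L binary classifier outputs for one example
out = sign(randn(1, L));                 % placeholder outputs, for illustration only
% Predicted class = codebook row with smallest Hamming distance to OUT
[dummy, predicted] = min(sum(repmat(out, K, 1) ~= code, 2));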
function net = svmtrain(net, X, Y, alpha0, dodisplay)
% SVMTRAIN - Train a Support Vector Machine classifier
%
%   NET = SVMTRAIN(NET, X, Y)
%   Train the SVM given by NET using the training data X with target values
%   Y. X is a matrix of size (N,NET.nin) with N training examples (one per
%   row). Y is a column vector containing the target values (classes) for
%   each example in X. Each element of Y that is >=0 is treated as class
%   +1, each element <0 is treated as class -1.
%   SVMTRAIN normally uses the L1-norm of all training set errors in the
%   objective function. If NET.use2norm==1, the L2-norm is used.
%
%   All training parameters are given in the structure NET. Relevant
%   parameters are mainly NET.c, for fine-tuning also NET.qpsize,
%   NET.alphatol and NET.kkttol. See function SVM for a description of
%   these fields.
%
%   NET.c is a weight for misclassifying a particular example. NET.c may
%   either be a scalar (where all errors have the same weight), or it may
%   be a column vector (size [N 1]) where entry NET.c(i) corresponds to the
%   error weight for example X(i,:). If NET.c is a vector of length 2,
%   NET.c(1) specifies the error weight for all positive examples, NET.c(2)
%   is the error weight for all negative examples. Specifying a different
%   weight for each example may be used for imbalanced data sets.
%
%   NET = SVMTRAIN(NET, X, Y, ALPHA0) uses the column vector ALPHA0 as
%   the initial values for the coefficients NET.alpha. ALPHA0 may result
%   from a previous training with different parameters.
%   NET = SVMTRAIN(NET, X, Y, ALPHA0, 1) displays information on the
%   training progress (number of errors in the current iteration, etc.)
%   SVMTRAIN uses either the function LOQO (Matlab interface to Smola's
%   LOQO code) or the routines QP/QUADPROG from the Matlab Optimization
%   Toolbox to solve the quadratic programming problem.
%
%   See also:
%   SVM, SVMKERNEL, SVMFWD
%
% Copyright (c) Anton Schwaighofer (2001)
% $Revision: 1.19 $ $Date: 2002/01/09 12:11:41 $
% mailto:anton.schwaighofer@gmx.net
%
% This program is released under the GNU General Public License.
%
% Training an SVM involves solving a quadratic programming problem that
% scales quadratically with the number of examples. SVMTRAIN uses the
% decomposition training algorithm proposed by Osuna, Freund and Girosi,
% where the maximum size of a quadratic program is constant.
% (ftp://ftp.ai.mit.edu/pub/cbcl/nnsp97-svm.ps)
% For selecting the working set, the approximation proposed by Joachims
% (http://www-ai.cs.uni-dortmund.de/DOKUMENTE/joachims_99a.ps.gz) is used.

% Check arguments for consistency
errstring = consist(net, 'svm', X, Y);
if ~isempty(errstring)
  error(errstring);
end
[N, d] = size(X);
if N==0,
  error('No training examples given');
end
net.nbexamples = N;
if nargin<5,
  dodisplay = 0;
end
if nargin<4,
  alpha0 = [];
elseif (~isempty(alpha0)) & (~all(size(alpha0)==[N 1])),
  error(['Initial values ALPHA0 must be a column vector with the same length' ...
         ' as X']);
end

% Find the indices of examples from class +1 and -1
class1 = logical(uint8(Y>=0));
class0 = logical(uint8(Y<0));
if length(net.c(:))==1,
  % The same upper bound for all examples
  C = repmat(net.c, [N 1]);
elseif length(net.c(:))==2,
  % Different upper bounds C for the positive and negative examples
  C = zeros([N 1]);
  C(class1) = net.c(1);
  C(class0) = net.c(2);
else
  C = net.c;
  if ~all(size(C)==[N 1]),
    error(['Upper bound C must be a column vector with the same length' ...
           ' as X']);
  end
end
if min(C)<net.alphatol,
  error('NET.C must be positive and larger than NET.alphatol');
end
if ~isfield(net, 'use2norm'),
  net.use2norm = 0;
end
if ~isfield(net, 'qpsolver'),
  net.qpsolver = '';
end
qpsolver = net.qpsolver;
if isempty(qpsolver),
  % QUADPROG is the fastest solver for both 1norm and 2norm SVMs, if
  % qpsize is around 10-70 (LOQO is best for large 1norm SVMs)
  checkseq = {'quadprog', 'loqo', 'qp'};
  i = 1;
  while (i <= length(checkseq)),
    e = exist(checkseq{i});
    if (e==2) | (e==3),
      qpsolver = checkseq{i};
      break;
    end
    i = i+1;
  end
  if isempty(qpsolver),
    error('No quadratic programming solver (QUADPROG,LOQO,QP) found.');
  end
end
% Mind that problems may occur with the QUADPROG solver: at least in early
% versions of Matlab 5.3 there are severe numerical problems somewhere
% deep in QUADPROG.

% Turn off all messages coming from quadprog, increase the maximum number
% of iterations from 200 to 500 - good for low-dimensional problems
if strcmp(qpsolver, 'quadprog') & (dodisplay==0),
  quadprogopt = optimset('Display', 'off', 'MaxIter', 500);
else
  quadprogopt = [];
end

% Actual size of the quadratic program during training may not be larger
% than the number of examples
QPsize = min(N, net.qpsize);
chsize = net.chunksize;

% SVMout contains the output of the SVM decision function for each
% example. This is updated iteratively during training.
SVMout = zeros(N, 1);

% Make sure there are no other values in Y than +1 and -1
Y(class1) = 1;
Y(class0) = -1;
if dodisplay>0,
  fprintf('Training set: %i examples (%i positive, %i negative)\n', ...
          length(Y), length(find(class1)), length(find(class0)));
end

% Start with a vector of zeros for the coefficients alpha, or the
% parameter ALPHA0, if it is given. Those values will be used to perform
% an initial working set selection, by assuming they are the true weights
% for the training set at hand.
if ~any(alpha0),
  net.alpha = zeros([N 1]);
  % If starting with a zero vector: randomize the first working set search
  randomWS = 1;
else
  randomWS = 0;
  % for the 1norm SVM: make the initial values conform to the upper bounds
  if ~net.use2norm,
    net.alpha = min(C, alpha0);
  end
end
alphaOld = net.alpha;

if length(find(Y>0))==N,
  % only positive examples
  net.bias = 1;
  net.svcoeff = [];
  net.sv = [];
  net.svind = [];
  net.alpha = zeros([N 1]);
  return;
elseif length(find(Y<0))==N,
  % only negative examples
  net.bias = -1;
  net.svcoeff = [];
  net.sv = [];
  net.svind = [];
  net.alpha = zeros([N 1]);
  return;
end

iteration = 0;
workset = logical(uint8(zeros(N, 1)));
sameWS = 0;
net.bias = 0;

while 1,
  if dodisplay>0,
    fprintf('\nIteration %i: ', iteration+1);
  end

  % Step 1: Determine the Support Vectors.
  [net, SVthresh, SV, SVbound, SVnonbound] = findSV(net, C);
  if dodisplay>0,
    fprintf(['Working set of size %i: %i Support Vectors, %i of them at' ...
             ' bound C\n'], length(find(workset)), length(find(workset & SV)), ...
            length(find(workset & SVbound)));
    fprintf(['Whole training set: %i Support Vectors, %i of them at upper' ...
             ' bound C.\n'], length(net.svind), length(find(SVbound)));
    if dodisplay>1,
      fprintf('The Support Vectors (threshold %g) are the examples\n', ...
              SVthresh);
      fprintf(' %i', net.svind);
      fprintf('\n');
    end
  end

  % Step 2: Find the output of the SVM for all training examples
  if (iteration==0) | (mod(iteration, net.recompute)==0),
    % Every NET.recompute iterations the SVM output is built from
    % scratch. Use all Support Vectors for determining the output.
    changedSV = net.svind;
    changedAlpha = net.alpha(changedSV);
    SVMout = zeros(N, 1);
    if strcmp(net.kernel, 'linear'),
      net.normalw = zeros([1 d]);
    end
  else
    % A normal iteration: Find the coefficients that changed and adjust
    % the SVM output only by the difference of old and new alpha
    changedSV = find(net.alpha~=alphaOld);
    changedAlpha = net.alpha(changedSV)-alphaOld(changedSV);
  end

  if strcmp(net.kernel, 'linear'),
    % Linear kernel: Build the normal vector of the separating
    % hyperplane by computing the weighted sum of all Support Vectors
    chunks = ceil(length(changedSV)/chsize);
    for ch = 1:chunks,
      ind = (1+(ch-1)*chsize):min(length(changedSV), ch*chsize);
      temp = changedAlpha(ind).*Y(changedSV(ind));
      net.normalw = net.normalw+temp'*X(changedSV(ind), :);
    end
    % Find the output of the SVM by multiplying the examples with the
    % normal vector
    SVMout = zeros(N, 1);
    chunks = ceil(N/chsize);
    for ch = 1:chunks,
      ind = (1+(ch-1)*chsize):min(N, ch*chsize);
      SVMout(ind) = X(ind,:)*(net.normalw');
    end
  else
    % A normal kernel function: Split both the examples and the Support
    % Vectors into small chunks
    chunks1 = ceil(N/chsize);
    chunks2 = ceil(length(changedSV)/chsize);
    for ch1 = 1:chunks1,
      ind1 = (1+(ch1-1)*chsize):min(N, ch1*chsize);
      for ch2 = 1:chunks2,
        % Compute the kernel function for a chunk of Support Vectors and
        % a chunk of examples
        ind2 = (1+(ch2-1)*chsize):min(length(changedSV), ch2*chsize);
        K12 = svmkernel(net, X(ind1, :), X(changedSV(ind2), :));
        % Add the weighted kernel matrix to the SVM output. In update
        % cycles, the kernel matrix is weighted by the difference of
        % alphas, in other cycles it is weighted by the value alpha alone.
        coeff = changedAlpha(ind2).*Y(changedSV(ind2));
        SVMout(ind1) = SVMout(ind1)+K12*coeff;
      end
      if dodisplay>2,
        K1all = svmkernel(net, X(ind1,:), X(net.svind,:));
        coeff2 = net.alpha(net.svind).*Y(net.svind);
        fprintf('Maximum error due to matrix partitioning: %g\n', ...
                max((SVMout(ind1)-K1all*coeff2)'));
      end
    end
  end

  % Step 3: Compute the bias of the SVM decision function.
  if net.use2norm,
    % The bias can be found from the SVM output for Support Vectors. For
    % those vectors, the output should be 1-alpha/C resp. -1+alpha/C.
    workSV = find(SV & workset);
    if ~isempty(workSV),
      net.bias = mean((1-net.alpha(workSV)./C(workSV)).*Y(workSV)- ...
                      SVMout(workSV));
    end
  else
    % Normal 1norm SVM:
    % The bias can be found from Support Vectors whose value alpha is not
    % at the upper bound. For those vectors, the SVM output should be +1
    % resp. -1.
    workSV = find(SVnonbound & workset);
    if ~isempty(workSV),
      net.bias = mean(Y(workSV)-SVMout(workSV));
    end
  end
  % The nasty case that no SVs to determine the bias have been found:
  % the only sensible thing to do is to leave the bias unchanged.
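Based only on the help text at the top of svmtrain.m, a minimal usage sketch might look as follows. The constructor call svm(...) and its argument order, as well as the svmfwd call, are assumptions about the accompanying SVM and SVMFWD functions listed under "See also", not confirmed by this page.

% Hedged usage sketch; svm(...) and svmfwd(...) signatures are assumed.
X = [randn(20, 2) + 1; randn(20, 2) - 1];   % 40 examples with 2 features
Y = [ones(20, 1); -ones(20, 1)];            % labels: >=0 means class +1, <0 means class -1
net = svm(size(X, 2), 'rbf', [1], 10);      % assumed arguments: nin, kernel, kernel parameter, C
net = svmtrain(net, X, Y);                  % train with default options
net = svmtrain(net, X, Y, net.alpha, 1);    % warm start from previous alphas, verbose progress
Ypred = svmfwd(net, X);                     % evaluate the trained classifier (see SVMFWD)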
