svmtrain.asv
SVM Data Mining Classification Experiment System (page 1 of 2)
function net = svmtrain(net, X, Y, alpha0, dodisplay)
% Train a Support Vector Machine classifier
%
%   NET = SVMTRAIN(NET, X, Y)
%   Train the SVM given by NET using the training data X with target values
%   Y. X is a matrix of size (N,NET.nin) with N training examples (one per
%   row). Y is a column vector containing the target values (classes) for
%   each example in X. Each element of Y that is >=0 is treated as class
%   +1, each element <0 is treated as class -1.
%   SVMTRAIN normally uses the L1-norm of all training set errors in the
%   objective function. If NET.use2norm==1, the L2-norm is used.
%
%   All training parameters are given in the structure NET. Relevant
%   parameters are mainly NET.c, for fine-tuning also NET.qpsize,
%   NET.alphatol and NET.kkttol. See function SVM for a description of
%   these fields.
%
%   NET.c is a weight for misclassifying a particular example. NET.c may
%   either be a scalar (where all errors have the same weight), or it may
%   be a column vector (size [N 1]) where entry NET.c(i) corresponds to the
%   error weight for example X(i,:). If NET.c is a vector of length 2,
%   NET.c(1) specifies the error weight for all positive examples, NET.c(2)
%   is the error weight for all negative examples. Specifying a different
%   weight for each example may be used for imbalanced data sets.
%
%   NET = SVMTRAIN(NET, X, Y, ALPHA0) uses the column vector ALPHA0 as
%   the initial values for the coefficients NET.alpha. ALPHA0 may result
%   from a previous training with different parameters.
%   NET = SVMTRAIN(NET, X, Y, ALPHA0, 1) displays information on the
%   training progress (number of errors in the current iteration, etc.)
%   SVMTRAIN uses either the function LOQO (Matlab interface to Smola's
%   LOQO code) or the routines QP/QUADPROG from the Matlab Optimization
%   Toolbox to solve the quadratic programming problem.
%
%   See also:
%   SVM, SVMKERNEL, SVMFWD
%
% Copyright (c) Anton Schwaighofer (2001)
% $Revision: 1.19 $ $Date: 2002/01/09 12:11:41 $
% mailto:anton.schwaighofer@gmx.net
%
% This program is released under the GNU General Public License.
%
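% Usage sketch (illustrative): the calls below assume a NET structure that
% was created beforehand with the toolbox function SVM, training data X and
% labels Y as described above, and test examples Xtest. The exact argument
% lists of SVM and SVMFWD are assumptions; see their own help texts.
%
%   net = svm(size(X, 2), 'rbf', 1, 10);    % hypothetical constructor call
%   net.c = [10; 1];                        % weight errors on positive examples 10 times higher
%   net = svmtrain(net, X, Y);              % train the classifier
%   Ypred = svmfwd(net, Xtest);             % classify new examples
%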
% Training an SVM involves solving a quadratic programming problem that
% scales quadratically with the number of examples. SVMTRAIN uses the
% decomposed training algorithm proposed by Osuna, Freund and Girosi, where
% the maximum size of a quadratic program is constant.
% (ftp://ftp.ai.mit.edu/pub/cbcl/nnsp97-svm.ps)
% For selecting the working set, the approximation proposed by Joachims
% (http://www-ai.cs.uni-dortmund.de/DOKUMENTE/joachims_99a.ps.gz) is used.

% Check that the parameters are consistent
errstring = consist(net, 'svm', X, Y);
if ~isempty(errstring),
  error(errstring);
end
[N, d] = size(X);
if N==0,
  error('No training examples given');
end
net.nbexamples = N;
if nargin<5,
  dodisplay = 0;
end
if nargin<4,
  alpha0 = [];
elseif (~isempty(alpha0)) & (~all(size(alpha0)==[N 1])),
  error('The initial values ALPHA0 must be a column vector with the same length as X');
end
% Find the indices of the examples in class +1 and class -1
class1 = logical(uint8(Y>=0));
class0 = logical(uint8(Y<0));
if length(net.c(:))==1,
  C = repmat(net.c, [N 1]);  % the same upper bound for all examples
elseif length(net.c(:))==2,
  C = zeros([N 1]);
  C(class1) = net.c(1);
  C(class0) = net.c(2);  % different upper bounds C for class +1 and class -1
else
  C = net.c;
  if ~all(size(C)==[N 1]),
    error('The upper bound C must be a column vector with the same length as X');
  end
end
if min(C)<net.alphatol,
  error('NET.C must be positive and larger than NET.alphatol');
end
if ~isfield(net, 'use2norm'),
  net.use2norm = 0;
end
if ~isfield(net, 'qpsolver'),
  net.qpsolver = '';
end
qpsolver = net.qpsolver;
if isempty(qpsolver),
  checkseq = {'quadprog', 'loqo', 'qp'};
  i = 1;
  while (i <= length(checkseq)),
    e = exist(checkseq{i});
    if (e==2) | (e==3),
      qpsolver = checkseq{i};
      break;
    end
    i = i+1;
  end
  if isempty(qpsolver),
    error('No quadratic programming solver (QUADPROG,LOQO,QP) found.');
  end
end
% Mind that there may occur problems with the QUADPROG solver. At least in
% early versions of Matlab 5.3 there are severe numerical problems somewhere
% deep in QUADPROG.
% Turn off all messages coming from quadprog, increase the maximum number
% of iterations from 200 to 500 - good for low-dimensional problems.
if strcmp(qpsolver, 'quadprog') & (dodisplay==0),
  quadprogopt = optimset('Display', 'off', 'MaxIter', 500);
else
  quadprogopt = [];
end
% The actual size of the quadratic program during training may not be larger
% than the number of examples
QPsize = min(N, net.qpsize);
chsize = net.chunksize;
% SVMout contains the output of the SVM decision function for each
% example. This is updated iteratively during training.
SVMout = zeros(N, 1);
% Make sure Y contains only the values +1 and -1
Y(class1) = 1;
Y(class0) = -1;
if dodisplay>0,
  fprintf('Training set: %i examples (%i positive, %i negative)\n', ...
          length(Y), length(find(class1)), length(find(class0)));
end
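% Warm-start sketch (illustrative; it assumes NET has already been trained
% once on the same X and Y): as described in the help text, the coefficients
% of a previous run can be passed as ALPHA0 when retraining with different
% parameters.
%
%   net.c = 100;                            % change the error weight
%   net = svmtrain(net, X, Y, net.alpha);   % reuse the previous coefficients
%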
% Start with a vector of zeros for the coefficients alpha, or the
% parameter ALPHA0, if it is given. Those values will be used to perform
% an initial working set selection, by assuming they are the true weights
% for the training set at hand.
if ~any(alpha0),
  net.alpha = zeros([N 1]);
  % If starting with a zero vector: randomize the first working set search
  randomWS = 1;
else
  randomWS = 0;
  % for 1norm SVM: make the initial values conform to the upper bounds
  if ~net.use2norm,
    net.alpha = min(C, alpha0);
  end
end
alphaOld = net.alpha;
if length(find(Y>0))==N,
  % only positive examples: return a classifier that always outputs +1
  net.bias = 1;
  net.svcoeff = [];
  net.sv = [];
  net.svind = [];
  net.alpha = zeros([N 1]);
  return;
elseif length(find(Y<0))==N,
  % only negative examples: return a classifier that always outputs -1
  net.bias = -1;
  net.svcoeff = [];
  net.sv = [];
  net.svind = [];
  net.alpha = zeros([N 1]);
  return;
end
iteration = 0;
workset = logical(uint8(zeros(N, 1)));
sameWS = 0;
net.bias = 0;
while 1,
  if dodisplay>0,
    fprintf('\nIteration %i: ', iteration+1);
  end
  % Step 1: Determine the Support Vectors
  [net, SVthresh, SV, SVbound, SVnonbound] = findSV(net, C);
  if dodisplay>0,
    fprintf(['Working set of size %i: %i Support Vectors, %i of them at' ...
             ' bound C\n'], length(find(workset)), length(find(workset & SV)), ...
            length(find(workset & SVbound)));
    fprintf(['Whole training set: %i Support Vectors, %i of them at upper' ...
             ' bound C.\n'], length(net.svind), length(find(SVbound)));
    if dodisplay>1,
      fprintf('The Support Vectors (threshold %g) are the examples\n', ...
              SVthresh);
      fprintf(' %i', net.svind);
      fprintf('\n');
    end
  end

  % Step 2: Compute the SVM output for all examples
  if (iteration==0) | (mod(iteration, net.recompute)==0),
    % Every NET.recompute iterations the SVM output is built from
    % scratch. Use all Support Vectors for determining the output.
    changedSV = net.svind;
    changedAlpha = net.alpha(changedSV);
    SVMout = zeros(N, 1);
    if strcmp(net.kernel, 'linear'),
      net.normalw = zeros([1 d]);
    end
  else
    % A normal iteration: Find the coefficients that changed and adjust
    % the SVM output only by the difference of old and new alpha
    changedSV = find(net.alpha~=alphaOld);
    changedAlpha = net.alpha(changedSV)-alphaOld(changedSV);
  end

  if strcmp(net.kernel, 'linear'),
    chunks = ceil(length(changedSV)/chsize);
    % Linear kernel: Build the normal vector of the separating
    % hyperplane by computing the weighted sum of all Support Vectors
    for ch = 1:chunks,
      ind = (1+(ch-1)*chsize):min(length(changedSV), ch*chsize);
      temp = changedAlpha(ind).*Y(changedSV(ind));
      net.normalw = net.normalw+temp'*X(changedSV(ind), :);  % accumulate this chunk's contribution
    end
    % Find the output of the SVM by multiplying the examples with the
    % normal vector
    SVMout = zeros(N, 1);
    chunks = ceil(N/chsize);
    for ch = 1:chunks,
      ind = (1+(ch-1)*chsize):min(N, ch*chsize);
      SVMout(ind) = X(ind,:)*(net.normalw');
    end
  else
    % A normal kernel function: Split both the examples and the Support
    % Vectors into small chunks
    chunks1 = ceil(N/chsize);
    chunks2 = ceil(length(changedSV)/chsize);
    for ch1 = 1:chunks1,
      ind1 = (1+(ch1-1)*chsize):min(N, ch1*chsize);
      for ch2 = 1:chunks2,
        % Compute the kernel function for a chunk of Support Vectors and
        % a chunk of examples
        ind2 = (1+(ch2-1)*chsize):min(length(changedSV), ch2*chsize);
        K12 = svmkernel(net, X(ind1, :), X(changedSV(ind2), :));
        % Add the weighted kernel matrix to the SVM output. In update
        % cycles, the kernel matrix is weighted by the difference of
        % alphas, in other cycles it is weighted by the value alpha alone.
        coeff = changedAlpha(ind2).*Y(changedSV(ind2));
        SVMout(ind1) = SVMout(ind1)+K12*coeff;
      end
      if dodisplay>2,
        K1all = svmkernel(net, X(ind1,:), X(net.svind,:));
        coeff2 = net.alpha(net.svind).*Y(net.svind);
        fprintf('Maximum error due to matrix partitioning: %g\n', ...
                max((SVMout(ind1)-K1all*coeff2)'));
      end
    end
  end

  % Step 3: Compute the bias of the SVM decision function.
  if net.use2norm,
    % The bias can be found from the SVM output for Support Vectors. For
    % those vectors, the output should be 1-alpha/C resp. -1+alpha/C.
    workSV = find(SV & workset);
    if ~isempty(workSV),
      net.bias = mean((1-net.alpha(workSV)./C(workSV)).*Y(workSV)- ...
                      SVMout(workSV));
    end
  else
    % Normal 1norm SVM:
    % The bias can be found from Support Vectors whose value alpha is not at
    % the upper bound. For those vectors, the SVM output should be +1
    % resp. -1.
    workSV = find(SVnonbound & workset);
    if ~isempty(workSV),
      net.bias = mean(Y(workSV)-SVMout(workSV));
    end
  end
  % The nasty case that no SVs to determine the bias have been found.
  % The only sensible thing to do is to leave the bias unchanged.
  if isempty(workSV) & (dodisplay>0),
    disp('No Support Vectors in the current working set.');
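% Prediction sketch (illustrative; Xtest denotes new examples and is an
% assumption): once training has finished, the decision function combines
% the kernel between test points and Support Vectors with the coefficients
% alpha.*Y and the bias from Step 3, in the same way as the SVMout
% computation above. The toolbox routine SVMFWD implements this forward
% pass; the lines below only sketch the idea.
%
%   coeff = net.alpha(net.svind).*Y(net.svind);     % weighted SV coefficients
%   Ktest = svmkernel(net, Xtest, X(net.svind, :)); % kernel between test points and SVs
%   Ypred = sign(Ktest*coeff + net.bias);           % predicted +1 / -1 labels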
