svmcv.m
LibSVM toolbox
function [net, CVErr, paramSeq] = svmcv(net, X, Y, range, step, nfold, Xv, Yv, dodisplay)
% SVMCV - Kernel parameter selection for SVM via cross validation
%
%   NET = SVMCV(NET, X, Y, RANGE)
%   Given an initialised Support Vector Machine structure NET, the best
%   setting of the kernel parameters is computed via 10-fold cross
%   validation. CV is done on the data points X (one point per row) with
%   target values Y (+1 or -1). The kernel parameters that are tested lie
%   between MIN(RANGE) and MAX(RANGE), starting with MIN(RANGE) for 'rbf'
%   kernels and MAX(RANGE) for all other kernel functions.
%   RANGE may also be a vector of length >2; in this case RANGE is taken
%   as the explicit sequence of kernel parameters that are tested.
%   SVMCV only works for kernel functions that require one parameter.
%
%   [NET, CVERR, PARAMSEQ] = SVMCV(NET, X, Y, RANGE, STEP, NFOLD)
%   The sequence of test parameters is generated by Param(t+1) =
%   Param(t)*STEP for 'rbf' kernels, and Param(t+1) = Param(t)+STEP for
%   all other kernels. Default value: SQRT(2) for 'rbf', -1 otherwise.
%   Determine the parameters based on NFOLD cross validation.
%   If STEP==[], RANGE is again interpreted as the explicit sequence of
%   kernel parameters.
%
%   The tested parameter sequence is returned in PARAMSEQ. For each entry
%   PARAMSEQ(i), there is one line CVERR(i,:) that describes the
%   estimated test set error. CVERR(i,1) is the mean, CVERR(i,2) is the
%   variance of the test set error over all NFOLD runs.
%
%   [NET, CVERR, PARAMSEQ] = SVMCV(NET, X, Y, RANGE, STEP, 1, XV, YV)
%   does parameter selection based on one fixed validation set XV and
%   YV. CVERR(i,2)==0 for all tested parameter settings.
%   NET = SVMCV(NET, X, Y, RANGE, STEP, 1, XV, YV, DODISPLAY) displays
%   error information for all tested parameters. DODISPLAY==0 shows
%   nothing, DODISPLAY==1 shows a final CV summary (default),
%   DODISPLAY==2 also shows the test set error for each trained SVM,
%   DODISPLAY==3 includes the output produced by SVMTRAIN.
%
%   See also
%   SVM, SVMTRAIN
%
% Copyright (c) by Anton Schwaighofer (2001)
% $Revision: 1.6 $ $Date: 2001/06/05 19:20:00 $
% mailto:anton.schwaighofer@gmx.net
%
% This program is released under the GNU General Public License.
%

% Check arguments for consistency
errstring = consist(net, 'svm', X, Y);
if ~isempty(errstring);
  error(errstring);
end
if nargin<9,
  dodisplay = 1;
end
if nargin<8,
  Xv = [];
end
if nargin<7,
  Yv = [];
end
if nargin<6,
  nfold = 10;
end
if (~isempty(Xv)) & (~isempty(Yv)),
  errstring = consist(net, 'svm', Xv, Yv);
  if ~isempty(errstring);
    error(errstring);
  end
  if (nfold~=1),
    error('Input parameters XV and YV may only be used with NFOLD==1');
  end
end
if nargin<5,
  step = 0;
end
range = range(:)';
N = size(X, 1);
if N<nfold,
  error('At least NFOLD (default 10) training examples must be given');
end

if (length(range)>2) | isempty(step),
  % If range parameter has more than only min/max entries: Use this as
  % the sequence of parameters to test
  paramSeq = range;
else
  paramSeq = [];
  switch net.kernel
    case 'rbf'
      if step==0,
        step = sqrt(2);
      end
      if abs(step)<1,
        % Multiplicative update, step size < 1 : start with max value
        param = max(range);
        while (param>=min(range)),
          paramSeq = [paramSeq param];
          param = param*abs(step);
        end
      else
        % Multiplicative update, step size > 1 : start with min value
        param = min(range);
        while (param<=max(range)),
          paramSeq = [paramSeq param];
          param = param*abs(step);
        end
      end
    otherwise
      % Additive update for kernels other than 'rbf'
      if step==0,
        step = -1;
      end
      if step<0,
        paramSeq = max(range):step:min(range);
      else
        paramSeq = min(range):step:max(range);
      end
  end
end

% Storing all validation set errors for each parameter choice
allErr = zeros(nfold, length(paramSeq));
% Storing the confusion matrices for each parameter choice
cm = cell(1, length(paramSeq));
for j = 1:length(paramSeq),
  cm{j} = zeros(2);
end
% shuffle X and Y in the same way
perm = randperm(N);
X = X(perm,:);
Y = Y(perm,:);
% size of one test set
chsize = floor(N/nfold);
% the training set is not exactly the whole data minus the one test set,
% but it is the union of the other test sets. So only effsize examples
% of the data set will ever be used
effsize = nfold*chsize;
% check if leave-one-out CV (or almost such) is required
usePrev = (nfold>=(N/2));
prevInd = [];
for i = 1:nfold,
  % currentX/Y is the current training set
  if (nfold == 1),
    currentX = X;
    currentY = Y;
    testX = Xv;
    testY = Yv;
  else
    % start and end index of current test set
    ind1 = 1+(i-1)*chsize;
    ind2 = i*chsize;
    currentInd = [1:(ind1-1), (ind2+1):effsize];
    currentX = X(currentInd, :);
    currentY = Y(currentInd, :);
    testX = X(ind1:ind2,:);
    testY = Y(ind1:ind2,:);
  end;
  % We start out with the most powerful kernel (smallest sigma for RBF
  % kernel, highest degree for polynomial). We assume that all training
  % examples will be support vectors due to overfitting, thus we start
  % the optimization with a value of C/2 for each example.
  if length(net.c(:))==1,
    alpha0 = repmat(net.c, [length(currentY) 1]);
    % The same upper bound for all examples
  elseif length(net.c(:))==2,
    alpha0 = zeros([length(currentY) 1]);
    alpha0(currentY>0) = net.c(1);
    alpha0(currentY<=0) = net.c(2);
    % Different upper bounds C for the positive and negative examples
  else
    net2.c = net.c(perm);
    alpha0 = net2.c(currentInd);
    % Use different C for each example: permute the original C's
  end
  alpha0 = alpha0/2;
  % Start out with alpha = C/2 for the optimization routine.
  % Another little trick for leave-one-out CV: training sets will only
  % slightly differ, thus use the alphas from the previous iteration,
  % even if it resulted from a different parameter setting, as initial
  % values alpha0
  if usePrev & ~isempty(prevInd),
    a = zeros(N, 1);
    a(currentInd) = alpha0;
    a(prevInd) = prevAlpha;
    alpha0 = a(currentInd);
  end
  % Now loop over all parameter settings and train the SVM on currentX/Y
  net2 = net;
  if (dodisplay>0),
    fprintf('Split %i of the training data:\n', i);
  end
  for j = 1:length(paramSeq),
    param = paramSeq(j);
    net2.kernelpar = param;
    % Plug the current parameter settings into the SVM and train on the
    % current training set
    net2 = svmtrain(net2, currentX, currentY, alpha0, max(0,dodisplay-2));
    % Evaluate on the non-training data
    testPred = svmfwd(net2, testX);
    allErr(i, j) = mean(testY ~= testPred);
    % Compute the confusion matrix
    for k = [1 2],
      for l = [1 2],
        c(k,l) = sum(((testPred>=0)==(l-1)).*((testY>=0)==(k-1)));
      end
    end
    cm{j} = cm{j}+c;
    % take out the computed coefficients alpha and use them as starting
    % values for the next iteration (next parameter choice)
    alpha0 = net2.alpha;
    if (dodisplay>1),
      fprintf('Split %i with parameter %g:\n', i, param);
      fprintf('  Test set error = %2.3f%%\n', allErr(i, j)*100);
      [fracSV, normW] = svmstat(net2, (dodisplay>2));
      fprintf('  Norm of the separating hyperplane: %g\n', normW);
      fprintf('  Fraction of support vectors: %2.3f%%\n', fracSV*100);
    end
  end
  if usePrev,
    prevAlpha = net2.alpha;
    prevInd = currentInd;
  end
end
% Compute mean and standard deviation over all nfold runs
meanErr = mean(allErr, 1);
stdErr = std(allErr, 0, 1);
CVErr = [meanErr; stdErr]';
% Find the point of minimum mean error and plug that parameter into the
% output SVM structure. If there should be several points of minimal
% error, select the one with minimal standard deviation
[sortedMean, sortedInd] = sort(meanErr);
minima = find(sortedMean(1)==meanErr);
[dummy, sortedInd2] = sort(stdErr(minima));
net.kernelpar = paramSeq(minima(sortedInd2(1)));
if (dodisplay>0),
  for j = 1:length(paramSeq),
    fprintf('kernelpar=%g: Avg CV error %2.3f%% with stddev %1.4f\n', ...
            paramSeq(j), meanErr(j)*100, stdErr(j)*100);
    if any(cm{j}~=0),
      fprintf('  Confusion matrix, averaged over all runs:\n');
      fprintf('                  Predicted class:\n');
      fprintf('               %5i         %5i\n', -1, +1);
      c1 = cm{j}';
      c2 = 100*c1./repmat(sum(c1), [2 1]);
      c3 = [c1(:) c2(:)]';
      fprintf(['  True -1: %5i (%3.2f%%)  %5i (%3.2f%%)\n  True +1: %5i' ...
               ' (%3.2f%%)  %5i (%3.2f%%)\n'], c3(:));
    end
  end
end
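A minimal usage sketch, following the calling conventions in the help text above. It assumes the toolbox provides an SVM constructor of the form net = svm(nin, kernel, kernelpar, C) (as in Schwaighofer's SVM toolbox); the constructor name and signature are assumptions, so adapt them to your local version.

% Usage sketch (assumed constructor: net = svm(nin, kernel, kernelpar, C))
X = [randn(50,2)+1; randn(50,2)-1];   % 100 two-dimensional training points
Y = [ones(50,1); -ones(50,1)];        % target labels +1 / -1
net = svm(2, 'rbf', 1, 10);           % RBF kernel, upper bound C = 10
% Test RBF kernel widths between 0.1 and 10, multiplying by sqrt(2) each
% step, selected by 5-fold cross validation:
[net, CVErr, paramSeq] = svmcv(net, X, Y, [0.1 10], sqrt(2), 5);
fprintf('Selected kernelpar: %g\n', net.kernelpar);

Because the 'rbf' parameter sequence is generated multiplicatively, the tested widths here form a geometric progression 0.1, 0.1*sqrt(2), 0.2, ... up to 10; for other kernels the same RANGE would be swept additively.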
