function demsvm1()
% DEMSVM1 - Demonstrate classification with a Support Vector Machine
%
% This demo trains SVMs with a linear kernel and with RBF kernels of
% different widths on a small 2D toy data set, and plots the resulting
% decision boundaries together with the Support Vectors.

X = [2 7; 3 6; 2 2; 8 1; 6 4; 4 8; 9 5; 9 9; 9 4; 6 9; 7 4];
Y = [ +1;  +1;  +1;  +1;  +1;  -1;  -1;  -1;  -1;  -1;  -1];
% define a simple artificial data set

x1ran = [0 10];
x2ran = [0 10];
% range for plotting the data set and the decision boundary

% disp(' ');
% disp('This demonstration illustrates the use of a Support Vector Machine');
% disp('(SVM) for classification. The data is a set of 2D points, together');
% disp('with target values (class labels) +1 or -1.');
% disp(' ');
% disp('The data set consists of the points');

ind = [1:length(Y)]';
% fprintf('X%2i = (%2i, %2i) with label Y%2i = %2i\n', [ind, X, ind, Y]');
% disp(' ')
% disp('Press any key to plot the data set');
%  
% pause
% 
% f1 = figure;
% plotdata(X, Y, x1ran, x2ran);
% title('Data from class +1 (squares) and class -1 (crosses)');
% 
% % fprintf('\n\n\n\n');
% % fprintf('The data is plotted in figure %i, where\n', f1);
% % disp('  squares stand for points with label Yi = +1');
% % disp('  crosses stand for points with label Yi = -1');
% % disp(' ')
% % disp(' ');
% % disp('Now we train a Support Vector Machine classifier on this data set.');
% % disp('We use the most simple kernel function, namely the inner product');
% % disp('of points Xi, Xj (linear kernel K(Xi,Xj) = Xi''*Xj )');
% % disp(' ');
% % disp('Press any key to start training')
%  pause

% net = svm(size(X, 2), 'linear', [], 10);
% net = svmtrain(net, X, Y);
% 
% f2 = figure;
% plotboundary(net, x1ran, x2ran);
% plotdata(X, Y, x1ran, x2ran);
% plotsv(net, X, Y);
% title(['SVM with linear kernel: decision boundary (black) plus Support' ...
%        ' Vectors (red)']);

% fprintf('\n\n\n\n');
% fprintf('The resulting decision boundary is plotted in figure %i.\n', f2);
% disp('The contour plotted in black separates class +1 from class -1');
% disp('(this is the actual decision boundary)');
% disp('The contour plotted in green are the points at distance +1 from the');
% disp('decision boundary, the blue contour are the points at distance -1.');
% disp(' ');
% disp('All examples plotted in red are found to be Support Vectors.');
% disp('Support Vectors are the examples at distance +1 or -1 from the ');
% disp('decision boundary and all the examples that cannot be classified');
% disp('correctly.');
% disp(' ');
% disp('The data set shown can be correctly classified using a linear');
% disp('kernel. This can be seen from the coefficients alpha associated');
% disp('with each example: The coefficients are');
ind = [1:length(Y)]';
% fprintf('  Example %2i: alpha%2i = %5.2f\n', [ind, ind, net.alpha]');
% disp('The upper bound C for the coefficients has been set to');
% fprintf('C = %5.2f. None of the coefficients are at the bound,\n', ...
% 	net.c(1));
% disp('this means that all examples in the training set can be correctly');
% disp('classified by the SVM.')
% disp(' ');
% disp('Press any key to continue')
%  pause

X = [X; [4 4]];
Y = [Y; -1];
% net = svm(size(X, 2), 'linear', [], 10);
% net = svmtrain(net, X, Y);
% 
% f1 = figure;
% plotboundary(net, x1ran, x2ran);
% plotdata(X, Y, x1ran, x2ran);
% plotsv(net, X, Y);
% title(['SVM with linear kernel: decision boundary (black) plus Support' ...
%        ' Vectors (red)']);

% fprintf('\n\n\n\n');
% disp('Adding an additional point X12 with label -1 gives a data set');
% disp('that cannot be linearly separated. The SVM handles this case by');
% disp('allowing training points to be misclassified.');
% disp(' ');
% disp('Training the SVM on this modified data set we see that the points');
% disp('X5, X11 and X12 cannot be correctly classified. The decision');
% fprintf('boundary is shown in figure %i.\n', f1);
% disp('The coefficients alpha associated with each example are');
ind = [1:length(Y)]';
% fprintf('  Example %2i: alpha%2i = %5.2f\n', [ind, ind, net.alpha]');
% disp('The coefficients of the misclassified points are at the upper');
% disp('bound C.');
% disp(' ')
% disp('Press any key to continue')



% fprintf('\n\n\n\n');
% disp('Adding the new point X12 has led to a more difficult data set');
% disp('that can no longer be separated by a simple linear kernel.');
% disp('We can now switch to a more powerful kernel function, namely');
% disp('the Radial Basis Function (RBF) kernel.');
% disp(' ')
% disp('The RBF kernel has an associated parameter, the kernel width.');
% disp('We will now show the decision boundary obtained from a SVM with');
% disp('RBF kernel for 3 different values of the kernel width.');
% disp(' ');
% disp('Press any key to continue')
% pause

net = svm(size(X, 2), 'rbf', [8], 100);
net = svmtrain(net, X, Y);

f1 = figure;
plotboundary(net, x1ran, x2ran);
plotdata(X, Y, x1ran, x2ran);
plotsv(net, X, Y);
title(['SVM with RBF kernel, width 8: decision boundary (black)' ...
       ' plus Support Vectors (red)']); 
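
% As an optional check (an addition to the original demo), the trained
% RBF SVM can be evaluated on a few individual test points with SVMFWD,
% the forward function also used by PLOTBOUNDARY below. This assumes
% SVMFWD returns the predicted class label as its first output and the
% real-valued SVM output as its second, as suggested by its use in
% PLOTBOUNDARY:
%
% Xtest = [2 7; 4 4; 9 9];
% [Ytest, Youttest] = svmfwd(net, Xtest);
% fprintf('Point (%g, %g): predicted class %+i, SVM output %5.2f\n', ...
%         [Xtest, Ytest, Youttest]');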

% fprintf('\n\n\n\n');
% fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
% 	f1);
% disp('with Radial Basis Function kernel, the kernel width has been');
% disp('set to 8.');
% disp('The SVM now interprets the new point X12 as evidence for a');
% disp('cluster of points from class -1, the SVM builds a small ''island''');
% disp('around X12.');
% disp(' ')
% disp('Press any key to continue')
% pause
% 
% 
% net = svm(size(X, 2), 'rbf', [1], 100);
% net = svmtrain(net, X, Y);
% 
% f3 = figure;
% plotboundary(net, x1ran, x2ran);
% plotdata(X, Y, x1ran, x2ran);
% plotsv(net, X, Y);
% title(['SVM with RBF kernel, width 1: decision boundary (black)' ...
%        ' plus Support Vectors (red)']); 

% fprintf('\n\n\n\n');
% fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
% 	f3);
% disp('with radial basis function kernel, kernel width 1.');
% disp('The decision boundary is now highly shattered, since a smaller');
% disp('kernel width allows the decision boundary to be more curved.');
% disp(' ')
% disp('Press any key to continue')
% pause
% 
% 
% net = svm(size(X, 2), 'rbf', [36], 100);
% net = svmtrain(net, X, Y);
% 
% f4 = figure;
% plotboundary(net, x1ran, x2ran);
% plotdata(X, Y, x1ran, x2ran);
% plotsv(net, X, Y);
% title(['SVM with RBF kernel, width 36: decision boundary (black)' ...
%        ' plus Support Vectors (red)']); 

% fprintf('\n\n\n\n');
% fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
% 	f4);
% disp('with radial basis function kernel, kernel width 36.');
% disp('This gives a decision boundary similar to the one shown in');
% fprintf('Figure %i for the SVM with linear kernel.\n', f2);
% 
% 
% fprintf('\n\n\n\n');
% disp('Press any key to end the demo')
% pause



function plotdata(X, Y, x1ran, x2ran)
% PLOTDATA - Plot 2D data set
% 

hold on;
ind = find(Y>0);
plot(X(ind,1), X(ind,2), 'ks');
ind = find(Y<0);
plot(X(ind,1), X(ind,2), 'kx');
text(X(:,1)+.2,X(:,2), int2str([1:length(Y)]'));
axis([x1ran x2ran]);
axis xy;


function plotsv(net, X, Y)
% PLOTSV - Plot Support Vectors
% 

hold on;
ind = find(Y(net.svind)>0);
plot(X(net.svind(ind),1),X(net.svind(ind),2),'rs');
ind = find(Y(net.svind)<0);
plot(X(net.svind(ind),1),X(net.svind(ind),2),'rx');


function [x11, x22, x1x2out] = plotboundary(net, x1ran, x2ran)
% PLOTBOUNDARY - Plot SVM decision boundary on range X1RAN and X2RAN
% 

hold on;
nbpoints = 100;
x1 = x1ran(1):(x1ran(2)-x1ran(1))/nbpoints:x1ran(2);
x2 = x2ran(1):(x2ran(2)-x2ran(1))/nbpoints:x2ran(2);
[x11, x22] = meshgrid(x1, x2);
[dummy, x1x2out] = svmfwd(net, [x11(:),x22(:)]);
x1x2out = reshape(x1x2out, [length(x1) length(x2)]);
contour(x11, x22, x1x2out, [-0.99 -0.99], 'b-');
contour(x11, x22, x1x2out, [0 0], 'k-');
contour(x11, x22, x1x2out, [0.99 0.99], 'g-');

function net = svmtrain(net, X, Y, alpha0, dodisplay)
% SVMTRAIN - Train a Support Vector Machine on data X with labels Y,
% using a chunking/decomposition scheme with quadratic programming on
% working sets. ALPHA0 gives optional initial coefficients, DODISPLAY
% controls the amount of diagnostic output.

% Check arguments for consistency
errstring = consist(net, 'svm', X, Y);
if ~isempty(errstring);
  error(errstring);
end
[N, d] = size(X);
if N==0,
  error('No training examples given');
end
net.nbexamples = N;
if nargin<5,
  dodisplay = 0;
end
if nargin<4,
  alpha0 = [];
elseif (~isempty(alpha0)) & (~all(size(alpha0)==[N 1])),
  error(['Initial values ALPHA0 must be a column vector with the same length' ...
	 ' as X']); 
end

% Find the indices of examples from class +1 and -1
class1 = logical(uint8(Y>=0));
class0 = logical(uint8(Y<0));

if length(net.c(:))==1,
  C = repmat(net.c, [N 1]);
  % The same upper bound for all examples
elseif length(net.c(:))==2,
  C = zeros([N 1]);
  C(class1) = net.c(1);
  C(class0) = net.c(2);
  % Different upper bounds C for the positive and negative examples
else
  C = net.c;
  if ~all(size(C)==[N 1]),
    error(['Upper bound C must be a column vector with the same length' ...
	   ' as X']); 
  end
end
if min(C)<net.alphatol,
  error('NET.C must be positive and larger than NET.alphatol');
end

if ~isfield(net, 'use2norm'),
  net.use2norm = 0;
end

if ~isfield(net, 'qpsolver'),
  net.qpsolver = '';
end
qpsolver = net.qpsolver;
if isempty(qpsolver),
  % QUADPROG is the fastest solver for both 1norm and 2norm SVMs, if
  % qpsize is around 10-70 (loqo is best for large 1norm SVMs)
  checkseq = {'quadprog', 'loqo', 'qp'};
  i = 1;
  while (i <= length(checkseq)),
    e = exist(checkseq{i});
    if (e==2) | (e==3),
      qpsolver = checkseq{i};
      break;
    end
    i = i+1;
  end
  if isempty(qpsolver),
    error('No quadratic programming solver (QUADPROG,LOQO,QP) found.');
  end
end
% Mind that there may occur problems with the QUADPROG solver. At least in
% early versions of Matlab 5.3 there are severe numerical problems somewhere
% deep in QUADPROG

% Turn off all messages coming from quadprog, increase the maximum number
% of iterations from 200 to 500 - good for low-dimensional problems
if strcmp(qpsolver, 'quadprog') & (dodisplay==0),
  quadprogopt = optimset('Display', 'off', 'MaxIter', 500);
else
  quadprogopt = [];
end

% Actual size of quadratic program during training may not be larger than
% the number of examples
QPsize = min(N, net.qpsize);
chsize = net.chunksize;

% SVMout contains the output of the SVM decision function for each
% example. This is updated iteratively during training.
SVMout = zeros(N, 1);

% Make sure there are no other values in Y than +1 and -1
Y(class1) = 1;
Y(class0) = -1;
if dodisplay>0,
  fprintf('Training set: %i examples (%i positive, %i negative)\n', ...
	  length(Y), length(find(class1)), length(find(class0)));
end

% Start with a vector of zeros for the coefficients alpha, or the
% parameter ALPHA0, if it is given. Those values will be used to perform
% an initial working set selection, by assuming they are the true weights
% for the training set at hand.
if ~any(alpha0),
  net.alpha = zeros([N 1]);
  % If starting with a zero vector: randomize the first working set search
  randomWS = 1;
else
  randomWS = 0;
  % for 1norm SVM: make the initial values conform to the upper bounds
  if ~net.use2norm,
    net.alpha = min(C, alpha0);
  end
end
alphaOld = net.alpha;

if length(find(Y>0))==N,
  % only positive examples
  net.bias = 1;
  net.svcoeff = [];
  net.sv = [];
  net.svind = [];
  net.alpha = zeros([N 1]);
  return;
elseif length(find(Y<0))==N,
  % only negative examples
  net.bias = 1;
  net.svcoeff = [];
  net.sv = [];
  net.svind = [];
  net.alpha = zeros([N 1]);
  return;
end

iteration = 0;
workset = logical(uint8(zeros(N, 1)));
sameWS = 0;
net.bias = 0;

while 1,

  if dodisplay>0,
    fprintf('\nIteration %i: ', iteration+1);
  end

  % Step 1: Determine the Support Vectors.
  [net, SVthresh, SV, SVbound, SVnonbound] = findSV(net, C);
  if dodisplay>0,
    fprintf(['Working set of size %i: %i Support Vectors, %i of them at' ...
	     ' bound C\n'], length(find(workset)), length(find(workset & SV)), ...
	    length(find(workset & SVbound))); 
    fprintf(['Whole training set: %i Support Vectors, %i of them at upper' ...
	     ' bound C.\n'], length(net.svind), length(find(SVbound)));
    if dodisplay>1,
      fprintf('The Support Vectors (threshold %g) are the examples\n', ...
	      SVthresh);
      fprintf(' %i', net.svind);
      fprintf('\n');
    end
  end

  
  % Step 2: Find the output of the SVM for all training examples
  if (iteration==0) | (mod(iteration, net.recompute)==0),
    % Every NET.recompute iterations the SVM output is built from
    % scratch. Use all Support Vectors for determining the output.
    changedSV = net.svind;
    changedAlpha = net.alpha(changedSV);
    SVMout = zeros(N, 1);
    if strcmp(net.kernel, 'linear'),
      net.normalw = zeros([1 d]);
    end
  else
