% example43_run_b2.m  (page 1 of 5)
% From the package: "Principles of Artificial Neural Networks and Simulation
% Examples" (人工神經網絡原理及仿真實例)
function demsvm1()
% DEMSVM1 - Demonstrate Support Vector Machine classification
%
% A 2D toy data set is classified first with a linear kernel, then with
% RBF kernels of three different widths.

X = [2 7; 3 6; 2 2; 8 1; 6 4; 4 8; 9 5; 9 9; 9 4; 6 9; 7 4];
Y = [ +1;  +1;  +1;  +1;  +1;  -1;  -1;  -1;  -1;  -1;  -1];
% define a simple artificial data set

x1ran = [0 10];
x2ran = [0 10];
% range for plotting the data set and the decision boundary

% disp(' ');
% disp('This demonstration illustrates the use of a Support Vector Machine');
% disp('(SVM) for classification. The data is a set of 2D points, together');
% disp('with target values (class labels) +1 or -1.');
% disp(' ');
% disp('The data set consists of the points');

ind = [1:length(Y)]';
% fprintf('X%2i = (%2i, %2i) with label Y%2i = %2i\n', [ind, X, ind, Y]');
% disp(' ')
% disp('Press any key to plot the data set');
%  
% pause
% 
% f1 = figure;
% plotdata(X, Y, x1ran, x2ran);
% title('Data from class +1 (squares) and class -1 (crosses)');
% 
% % fprintf('\n\n\n\n');
% % fprintf('The data is plotted in figure %i, where\n', f1);
% % disp('  squares stand for points with label Yi = +1');
% % disp('  crosses stand for points with label Yi = -1');
% % disp(' ')
% % disp(' ');
% % disp('Now we train a Support Vector Machine classifier on this data set.');
% % disp('We use the most simple kernel function, namely the inner product');
% % disp('of points Xi, Xj (linear kernel K(Xi,Xj) = Xi''*Xj )');
% % disp(' ');
% % disp('Press any key to start training')
%  pause

% net = svm(size(X, 2), 'linear', [], 10);
% net = svmtrain(net, X, Y);
% 
% f2 = figure;
% plotboundary(net, x1ran, x2ran);
% plotdata(X, Y, x1ran, x2ran);
% plotsv(net, X, Y);
% title(['SVM with linear kernel: decision boundary (black) plus Support' ...
%        ' Vectors (red)']);

% fprintf('\n\n\n\n');
% fprintf('The resulting decision boundary is plotted in figure %i.\n', f2);
% disp('The contour plotted in black separates class +1 from class -1');
% disp('(this is the actual decision boundary)');
% disp('The contour plotted in green marks the points at distance +1 from');
% disp('the decision boundary; the blue contour marks the points at distance -1.');
% disp(' ');
% disp('All examples plotted in red are found to be Support Vectors.');
% disp('Support Vectors are the examples at distance +1 or -1 from the ');
% disp('decision boundary and all the examples that cannot be classified');
% disp('correctly.');
% disp(' ');
% disp('The data set shown can be correctly classified using a linear');
% disp('kernel. This can be seen from the coefficients alpha associated');
% disp('with each example: The coefficients are');
ind = [1:length(Y)]';
% fprintf('  Example %2i: alpha%2i = %5.2f\n', [ind, ind, net.alpha]');
% disp('The upper bound C for the coefficients has been set to');
% fprintf('C = %5.2f. None of the coefficients are at the bound,\n', ...
% 	net.c(1));
% disp('this means that all examples in the training set can be correctly');
% disp('classified by the SVM.')
% disp(' ');
% disp('Press any key to continue')
%  pause

X = [X; [4 4]];
Y = [Y; -1];
% net = svm(size(X, 2), 'linear', [], 10);
% net = svmtrain(net, X, Y);
% 
% f3 = figure;
% plotboundary(net, x1ran, x2ran);
% plotdata(X, Y, x1ran, x2ran);
% plotsv(net, X, Y);
% title(['SVM with linear kernel: decision boundary (black) plus Support' ...
%        ' Vectors (red)']);

% fprintf('\n\n\n\n');
% disp('Adding an additional point X12 with label -1 gives a data set');
% disp('that cannot be linearly separated. The SVM handles this case by');
% disp('allowing training points to be misclassified.');
% disp(' ');
% disp('Training the SVM on this modified data set we see that the points');
% disp('X5, X11 and X12 cannot be correctly classified. The decision');
% fprintf('boundary is shown in figure %i.\n', f3);
% disp('The coefficients alpha associated with each example are');
ind = [1:length(Y)]';
% fprintf('  Example %2i: alpha%2i = %5.2f\n', [ind, ind, net.alpha]');
% disp('The coefficients of the misclassified points are at the upper');
% disp('bound C.');
% disp(' ')
% disp('Press any key to continue')
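% Background on the soft margin: the SVM permits misclassification by
% bounding each coefficient, 0 <= alpha_i <= C. Training examples whose
% alpha_i reaches the upper bound C are exactly those lying inside the
% margin or on the wrong side of the decision boundary.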



% fprintf('\n\n\n\n');
% disp('Adding the new point X12 has led to a more difficult data set');
% disp('that can no longer be separated by a simple linear kernel.');
% disp('We can now switch to a more powerful kernel function, namely');
% disp('the Radial Basis Function (RBF) kernel.');
% disp(' ')
% disp('The RBF kernel has an associated parameter, the kernel width.');
% disp('We will now show the decision boundary obtained from a SVM with');
% disp('RBF kernel for 3 different values of the kernel width.');
% disp(' ');
% disp('Press any key to continue')
% pause

net = svm(size(X, 2), 'rbf', [8], 100);
net = svmtrain(net, X, Y);

f4 = figure;
plotboundary(net, x1ran, x2ran);
plotdata(X, Y, x1ran, x2ran);
plotsv(net, X, Y);
title(['SVM with RBF kernel, width 8: decision boundary (black)' ...
       ' plus Support Vectors (red)']); 

% fprintf('\n\n\n\n');
% fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
% 	f4);
% disp('with a Radial Basis Function kernel; the kernel width has been');
% disp('set to 8.');
% disp('The SVM now interprets the new point X12 as evidence for a');
% disp('cluster of points from class -1, the SVM builds a small ''island''');
% disp('around X12.');
% disp(' ')
% disp('Press any key to continue')
% pause
% 
% 
% net = svm(size(X, 2), 'rbf', [1], 100);
% net = svmtrain(net, X, Y);
% 
% f5 = figure;
% plotboundary(net, x1ran, x2ran);
% plotdata(X, Y, x1ran, x2ran);
% plotsv(net, X, Y);
% title(['SVM with RBF kernel, width 1: decision boundary (black)' ...
%        ' plus Support Vectors (red)']); 

% fprintf('\n\n\n\n');
% fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
% 	f5);
% disp('with radial basis function kernel, kernel width 1.');
% disp('The decision boundary is now highly fragmented, since a smaller');
% disp('kernel width allows the decision boundary to be more curved.');
% disp(' ')
% disp('Press any key to continue')
% pause
% 
% 
% net = svm(size(X, 2), 'rbf', [36], 100);
% net = svmtrain(net, X, Y);
% 
% f6 = figure;
% plotboundary(net, x1ran, x2ran);
% plotdata(X, Y, x1ran, x2ran);
% plotsv(net, X, Y);
% title(['SVM with RBF kernel, width 36: decision boundary (black)' ...
%        ' plus Support Vectors (red)']); 

% fprintf('\n\n\n\n');
% fprintf('Figure %i shows the decision boundary obtained from a SVM\n', ...
% 	f6);
% disp('with radial basis function kernel, kernel width 36.');
% disp('This gives a decision boundary similar to the one shown in');
% fprintf('Figure %i for the SVM with linear kernel.\n', f2);
% 
% 
% fprintf('\n\n\n\n');
% disp('Press any key to end the demo')
% pause



function plotdata(X, Y, x1ran, x2ran)
% PLOTDATA - Plot 2D data set
% 

hold on;
ind = find(Y>0);
plot(X(ind,1), X(ind,2), 'ks');
ind = find(Y<0);
plot(X(ind,1), X(ind,2), 'kx');
text(X(:,1)+.2,X(:,2), int2str([1:length(Y)]'));
axis([x1ran x2ran]);
axis xy;


function plotsv(net, X, Y)
% PLOTSV - Plot Support Vectors
% 

hold on;
ind = find(Y(net.svind)>0);
plot(X(net.svind(ind),1),X(net.svind(ind),2),'rs');
ind = find(Y(net.svind)<0);
plot(X(net.svind(ind),1),X(net.svind(ind),2),'rx');


function [x11, x22, x1x2out] = plotboundary(net, x1ran, x2ran)
% PLOTBOUNDARY - Plot SVM decision boundary on range X1RAN and X2RAN
% 

hold on;
nbpoints = 100;
x1 = x1ran(1):(x1ran(2)-x1ran(1))/nbpoints:x1ran(2);
x2 = x2ran(1):(x2ran(2)-x2ran(1))/nbpoints:x2ran(2);
[x11, x22] = meshgrid(x1, x2);
[dummy, x1x2out] = svmfwd(net, [x11(:),x22(:)]);
x1x2out = reshape(x1x2out, size(x11));
contour(x11, x22, x1x2out, [-0.99 -0.99], 'b-');
contour(x11, x22, x1x2out, [0 0], 'k-');
contour(x11, x22, x1x2out, [0.99 0.99], 'g-');

function net = svmtrain(net, X, Y, alpha0, dodisplay)
% SVMTRAIN - Train a Support Vector Machine by working-set decomposition
%
% NET = SVMTRAIN(NET, X, Y) trains NET on examples X with labels Y
% (+1/-1). ALPHA0 optionally supplies initial coefficients, DODISPLAY
% controls the verbosity of progress output.

% Check arguments for consistency
errstring = consist(net, 'svm', X, Y);
if ~isempty(errstring),
  error(errstring);
end
[N, d] = size(X);
if N==0,
  error('No training examples given');
end
net.nbexamples = N;
if nargin<5,
  dodisplay = 0;
end
if nargin<4,
  alpha0 = [];
elseif (~isempty(alpha0)) & (~all(size(alpha0)==[N 1])),
  error(['Initial values ALPHA0 must be a column vector with the same length' ...
	 ' as X']); 
end

% Find the indices of examples from class +1 and -1
class1 = logical(uint8(Y>=0));
class0 = logical(uint8(Y<0));

if length(net.c(:))==1,
  C = repmat(net.c, [N 1]);
  % The same upper bound for all examples
elseif length(net.c(:))==2,
  C = zeros([N 1]);
  C(class1) = net.c(1);
  C(class0) = net.c(2);
  % Different upper bounds C for the positive and negative examples
else
  C = net.c;
  if ~all(size(C)==[N 1]),
    error(['Upper bound C must be a column vector with the same length' ...
	   ' as X']); 
  end
end
if min(C)<net.alphatol,
  error('NET.C must be positive and larger than NET.alphatol');
end

if ~isfield(net, 'use2norm'),
  net.use2norm = 0;
end

if ~isfield(net, 'qpsolver'),
  net.qpsolver = '';
end
qpsolver = net.qpsolver;
if isempty(qpsolver),
  % QUADPROG is the fastest solver for both 1norm and 2norm SVMs, if
  % qpsize is around 10-70 (loqo is best for large 1norm SVMs)
  checkseq = {'quadprog', 'loqo', 'qp'};
  i = 1;
  while (i <= length(checkseq)),
    e = exist(checkseq{i});
    if (e==2) | (e==3),
      qpsolver = checkseq{i};
      break;
    end
    i = i+1;
  end
  if isempty(qpsolver),
    error('No quadratic programming solver (QUADPROG,LOQO,QP) found.');
  end
end
% Note that problems may occur with the QUADPROG solver: at least in
% early versions of Matlab 5.3 there are severe numerical problems
% somewhere deep inside QUADPROG.

% Turn off all messages coming from quadprog, increase the maximum number
% of iterations from 200 to 500 - good for low-dimensional problems
if strcmp(qpsolver, 'quadprog') & (dodisplay==0),
  quadprogopt = optimset('Display', 'off', 'MaxIter', 500);
else
  quadprogopt = [];
end

% Actual size of quadratic program during training may not be larger than
% the number of examples
QPsize = min(N, net.qpsize);
chsize = net.chunksize;

% SVMout contains the output of the SVM decision function for each
% example. This is updated iteratively during training.
SVMout = zeros(N, 1);
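
% For reference, the standard SVM decision function evaluated here is
%   f(x) = sum_i alpha_i * Y_i * K(X_i, x) + bias
% Caching f(X_j) for every training example in SVMout means that after a
% working-set step only the terms whose alpha changed must be recomputed.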

% Make sure there are no other values in Y than +1 and -1
Y(class1) = 1;
Y(class0) = -1;
if dodisplay>0,
  fprintf('Training set: %i examples (%i positive, %i negative)\n', ...
	  length(Y), length(find(class1)), length(find(class0)));
end

% Start with a vector of zeros for the coefficients alpha, or the
% parameter ALPHA0, if it is given. Those values will be used to perform
% an initial working set selection, by assuming they are the true weights
% for the training set at hand.
if ~any(alpha0),
  net.alpha = zeros([N 1]);
  % If starting with a zero vector: randomize the first working set search
  randomWS = 1;
else
  randomWS = 0;
  if ~net.use2norm,
    % for 1norm SVM: clip the initial values to the upper bounds C
    net.alpha = min(C, alpha0);
  else
    % for 2norm SVM: the coefficients are unbounded above
    net.alpha = alpha0;
  end
end
alphaOld = net.alpha;

if length(find(Y>0))==N,
  % only positive examples: the SVM always predicts class +1
  net.bias = 1;
  net.svcoeff = [];
  net.sv = [];
  net.svind = [];
  net.alpha = zeros([N 1]);
  return;
elseif length(find(Y<0))==N,
  % only negative examples: the SVM always predicts class -1
  net.bias = -1;
  net.svcoeff = [];
  net.sv = [];
  net.svind = [];
  net.alpha = zeros([N 1]);
  return;
end

iteration = 0;
workset = logical(uint8(zeros(N, 1)));
sameWS = 0;
net.bias = 0;

while 1,

  if dodisplay>0,
    fprintf('\nIteration %i: ', iteration+1);
  end

  % Step 1: Determine the Support Vectors.
  [net, SVthresh, SV, SVbound, SVnonbound] = findSV(net, C);
  if dodisplay>0,
    fprintf(['Working set of size %i: %i Support Vectors, %i of them at' ...
	     ' bound C\n'], length(find(workset)), length(find(workset & SV)), ...
	    length(find(workset & SVbound))); 
    fprintf(['Whole training set: %i Support Vectors, %i of them at upper' ...
	     ' bound C.\n'], length(net.svind), length(find(SVbound)));
    if dodisplay>1,
      fprintf('The Support Vectors (threshold %g) are the examples\n', ...
	      SVthresh);
      fprintf(' %i', net.svind);
      fprintf('\n');
    end
  end

  
  % Step 2: Find the output of the SVM for all training examples
  if (iteration==0) | (mod(iteration, net.recompute)==0),
    % Every NET.recompute iterations the SVM output is built from
    % scratch. Use all Support Vectors for determining the output.
    changedSV = net.svind;
    changedAlpha = net.alpha(changedSV);
    SVMout = zeros(N, 1);
    if strcmp(net.kernel, 'linear'),
      net.normalw = zeros([1 d]);
    end
  else

% (listing truncated here: pages 2-5 of the original file are not included)