learn_params.m — MATLAB implementation of Bayesian networks; can create Bayesian networks and train models.
function CPD = learn_params(CPD, fam, data, ns, cnodes, varargin)
% LEARN_PARAMS Construct classification/regression tree given complete data
% CPD = learn_params(CPD, fam, data, ns, cnodes)
%
% fam(i) is the node id of the i-th node in the family of nodes; the self node is the last one
% data(i,m) is the value of node i in case m (can be cell array).
% ns(i) is the node size for the i-th node in the whole bnet
% cnodes(i) is the node id for the i-th continuous node in the whole bnet
%  
% The following optional arguments can be specified in the form of name/value pairs:
% stop_cases: for early stopping (pre-pruning). A node is not split if it has fewer than stop_cases cases. Default is 0.
% min_gain: for early stopping (pre-pruning).
%     For discrete output: a node is not split when the gain of the best split is less than min_gain. Default is 0.
%     For continuous (cts) output: a node is not split when the gain of the best split is less than min_gain*score(root)
%                                  (we denote this cts_min_gain). Default is 0.006.
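%
% Example call (a hedged sketch; the node ids, sizes, and thresholds below are
% hypothetical, not taken from this file): learn the CPD of node 3 with parents
% 1 and 2, stopping early when a node has fewer than 5 cases or the best split
% gains less than 0.01:
%   CPD = learn_params(CPD, [1 2 3], data, ns, cnodes, 'stop_cases', 5, 'min_gain', 0.01);
%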
% %%%%%%%%%%%%%%%%%%% Structure definition of dtree_CPD.tree %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% tree.num_node               the last position in the tree.nodes array for adding new nodes;
%                             it is not always the same as the number of nodes in the tree, because some positions
%                             in the tree.nodes array can be marked unused (e.g. in tree pruning)
% tree.nodes is the array of nodes in the tree, plus some unused nodes.
% tree.nodes(1) is the root for the tree.
%
% Below are the attributes for each node:
% tree.nodes(i).used;     % flag: 1 means this node is used (0 means unused; it can be removed from the tree to save memory)
% tree.nodes(i).is_leaf;  % 1 means this node is a leaf, 0 means it is not a leaf.
% tree.nodes(i).children; % children(i) is the index in the tree.nodes array of the i-th child node
% tree.nodes(i).split_id; % the attribute id used to split this node
% tree.nodes(i).split_threshhold; % the threshold used to split this node on a continuous attribute
% %%%%% attributes specific to classification trees (discrete output)
% tree.nodes(i).probs     % probs(i) is the probability of the i-th value of the class node.
%                         % For a three-class output, probs = [0.9 0.1 0.0] means the probability of
%                         % class 1 is 0.9, of class 2 is 0.1, and of class 3 is 0.0.
% %%%%% attributes specific to regression trees (continuous output)
% tree.nodes(i).mean      % mean output value for this node
% tree.nodes(i).std       % standard deviation of the output values in this node
%
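% Illustrative traversal of this structure (a minimal sketch, not part of the
% original file): classify one case x, where x(i) is the value of the i-th family
% member, by walking from the root to a leaf. It assumes node_types as used
% elsewhere in this file (0 = discrete, 1 = cts).
%   cur = tree.root;
%   while (~tree.nodes(cur).is_leaf)
%     id = tree.nodes(cur).split_id;                 % attribute tested at this node
%     if (node_types(id)==1)                         % cts attribute: 2 children
%       if (x(id) <= tree.nodes(cur).split_threshhold)
%         cur = tree.nodes(cur).children(1);
%       else
%         cur = tree.nodes(cur).children(2);
%       end
%     else                                           % discrete attribute: one child per value
%       cur = tree.nodes(cur).children(x(id));
%     end
%   end
%   probs = tree.nodes(cur).probs;                   % class distribution at the leaf
%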
% Author: yimin.zhang@intel.com
% Last updated: Jan. 19, 2002

% Want list:
% (1) more efficient handling of cts attributes: collect the values of cts attributes once (at the beginning of the build_dtree function), then do a binary search when finding the threshold
% (2) prune classification trees using Pessimistic Error Pruning
% (3) bi_search for strings (used to transform data into BNT format)

global tree %tree must be global so that it can be accessed in the recursive splitting function
global cts_min_gain
tree=[]; % clear the tree
tree.num_node=0;
cts_min_gain=0;

stop_cases=0;
min_gain=0;

args = varargin;
nargs = length(args);
if (nargs>0)
  if isstr(args{1})
    for i=1:2:nargs
      switch args{i},
        case 'stop_cases', stop_cases = args{i+1};   
        case 'min_gain', min_gain = args{i+1};
      end
    end
  else
    error('error in input parameters');
  end
end

if iscell(data)
  local_data = cell2num(data(fam,:));
else
  local_data = data(fam, :);
end
%counts = compute_counts(local_data, CPD.sizes);
%CPD.CPT = mk_stochastic(counts + CPD.prior); % bug fix 11/5/01
node_types = zeros(1,size(ns,2)); %all nodes are discrete by default
node_types(cnodes)=1;
%make the data be BNT compliant (values for discrete nodes are from 1-n, here n is the node size)
%trans_data=transform_data(local_data,'tmp.dat',[]); %here no cts nodes

build_dtree (CPD, local_data, ns(fam), node_types(fam),stop_cases,min_gain);
%CPD.tree=copy_tree(tree);
CPD.tree=tree; %copy the tree constructed to CPD


function new_tree = copy_tree(tree)
% copy the tree to new_tree
new_tree.num_node=tree.num_node;
new_tree.root = tree.root;
for i=1:tree.num_node
  new_tree.nodes(i)=tree.nodes(i);
end


function build_dtree (CPD, fam_ev, node_sizes, node_types,stop_cases,min_gain)
global tree
global cts_min_gain

tree.num_node=0; %the current number of nodes in the tree
tree.root=1;

T = 1:size(fam_ev,2); %all cases
candidate_attrs = 1:(size(node_sizes,2)-1); %all attributes
node_id=1;  %the root node
lastnode=size(node_sizes,2); %the last node in the family is the dependent variable (category node)
num_cat=node_sizes(lastnode);

% get the minimum gain for cts output (used to decide when to stop splitting)
if (node_types(size(fam_ev,1))==1) %cts output
  N = size(fam_ev,2);
  output_id = size(fam_ev,1);
  cases_T = fam_ev(output_id,:); %get all the output value for cases T
  std_T = std(cases_T);
  avg_y_T = mean(cases_T);
  sqr_T = cases_T - avg_y_T;
  cts_min_gain = min_gain*(sum(sqr_T.*sqr_T)/N);  % min_gain * R(root), where R(root) = 1/N * sum((y - avg_y)^2)
end  
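% Worked example (hypothetical numbers, for illustration only): if the N=4 output
% values are [1 2 3 6], then avg_y_T = 3, sum(sqr_T.*sqr_T) = 4+1+0+9 = 14, so
% R(root) = 14/4 = 3.5 and, with min_gain = 0.006, cts_min_gain = 0.021.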

split_dtree (CPD, fam_ev, node_sizes, node_types, stop_cases,min_gain, T, candidate_attrs, num_cat);
  


% pruning methods
% (1) Restriction on minimum node size: a node is not split if it has fewer than k cases.
% (2) Threshold on impurity: a threshold is imposed on the splitting test score. The threshold can be
%     imposed on a local goodness measure (the gain ratio of a node) or on a global goodness measure.
% (3) Minimum Error Pruning (MEP) (needs no pruning set)
%     Prune if static error <= backed-up error
%       Static error at node v: e(v) = (N - Nc + k - 1)/(N + k) (Laplace estimate, equal prior for each class),
%         where N is the # of all examples, Nc is the # of majority-class examples, and k is the number of classes
%       Backed-up error at node v (Ti is the root of the i-th subtree):
%         E(T) = Sum_{i=1..n}(pi*e(Ti))
% (4) Pessimistic Error Pruning (PEP), used in Quinlan's C4.5 (needs no pruning set; efficient because it prunes top-down)
%       Probability of error (apparent error rate)
%           q = (N - Nc + 0.5)/N
%         where N = #examples and Nc = #examples in the majority class
%     Error of a node v (if pruned):  q(v) = (Nv - Nc,v + 0.5)/Nv
%     Error of a subtree:  q(T) = Sum_over_leaves_l(Nl - Nc,l + 0.5) / Sum_over_leaves_l(Nl)
%     Prune if q(v) <= q(T)
%     (see the worked example after the implementation-status note below)
%
% Implementation status:
% (1) and (2) have been implemented as input parameters of learn_params.
% (4) is implemented in this function.
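% Worked PEP example (hypothetical counts, for illustration only): suppose node v
% covers Nv = 10 cases with 8 in the majority class, so q(v) = (10-8+0.5)/10 = 0.25.
% Suppose its subtree has two leaves of 5 cases each, with 4 classified correctly
% at each leaf, so q(T) = ((5-4+0.5)+(5-4+0.5))/(5+5) = 3/10 = 0.3.
% Since q(v) = 0.25 <= q(T) = 0.3, the subtree would be pruned to a leaf.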
function pruning(fam_ev,node_sizes,node_types)
% PRUNING prune the constructed tree using PEP
% pruning(fam_ev,node_sizes,node_types)
%
% fam_ev(i,j) is the value of attribute i in the j-th training case (for the whole tree); the last row is the class label (self_ev)
% node_sizes(i) is the node size for the i-th node in the family
% node_types(i) is the node type for the i-th node in the family, 0 for a discrete node, 1 for a continuous node
% the global variable 'tree' stores both the input tree and the pruned tree


function split_T = split_cases(fam_ev,node_sizes,node_types,T,node_i, threshhold)
% SPLIT_CASES split the cases T according to the values of node_i in the family
% split_T = split_cases(fam_ev,node_sizes,node_types,T,node_i,threshhold)
%
% fam_ev(i,j) is the value of attribute i in the j-th training case (for the whole tree); the last row is the class label (self_ev)
% node_sizes(i) is the node size for the i-th node in the family
% node_types(i) is the node type for the i-th node in the family, 0 for a discrete node, 1 for a continuous node
% node_i is the attribute on which to split; threshhold applies only when node_i is continuous

if (node_types(node_i)==0) %discrete attribute
  %init the subsets of T
  split_T = cell(1,node_sizes(node_i)); %T will be separated into |node_size of i| subsets according to different values of node i
  for i=1:node_sizes(node_i)   % here we assume that the value of an attribute is 1:node_size
    split_T{i}=zeros(1,0);
  end

  size_t = size(T,2);
  for i=1:size_t
    case_id = T(i);
    %put this case into one subset of split_T according to its value for node_i
    value = fam_ev(node_i,case_id); 
    pos = size(split_T{value},2)+1;
    split_T{value}(pos)=case_id;  % here assumes the value of an attribute is 1:node_size 
  end
else %continuous attribute
  %init the subsets of T
  split_T = cell(1,2); %T will be separated into 2 subsets (<=threshhold) (>threshhold)
  for i=1:2   
    split_T{i}=zeros(1,0);
  end

  size_t = size(T,2);
  for i=1:size_t
    case_id = T(i);
    %put this case into one subset of split_T according to its value for node_i
    value = fam_ev(node_i,case_id); 
    subset_num=1;
    if (value>threshhold)
      subset_num=2;
    end  
    pos = size(split_T{subset_num},2)+1;
    split_T{subset_num}(pos)=case_id;  
  end
end
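
% Illustrative call (hypothetical data, not from the original file):
%   fam_ev = [1 2 1 2; 5 6 7 8];   % attribute 1 is binary; the last row is the output
%   split_T = split_cases(fam_ev, [2 4], [0 1], 1:4, 1, 0);
% Discrete attribute 1 partitions the cases into split_T{1} = [1 3] and split_T{2} = [2 4].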


  
function new_node = split_dtree (CPD, fam_ev, node_sizes, node_types, stop_cases, min_gain, T, candidate_attrs, num_cat)
% SPLIT_DTREE Split the tree node containing cases T (actually just indexes into the family evidence).
% new_node = split_dtree (CPD, fam_ev, node_sizes, node_types, stop_cases, min_gain, T, candidate_attrs, num_cat)
%
% fam_ev(i,j) is the value of attribute i in the j-th training case (for the whole tree); the last row is the class label (self_ev)
% node_sizes(i) is the node size for the i-th node in the family
% node_types(i) is the node type for the i-th node in the family, 0 for a discrete node, 1 for a continuous node
% stop_cases is the threshold on the number of cases below which splitting stops
% min_gain is the minimum gain needed to split a node
% T(i) is the index of the i-th case in the current decision tree node, which we may split further
% candidate_attrs(i) is the node id of the i-th attribute still to be considered as a split attribute
%%%%% node_id is the index of the current node considered for a split
% num_cat is the number of output categories for the decision tree
% output:
% new_node is the new node created
global tree
global cts_min_gain

size_fam = size(fam_ev,1);            %size of the family (number of nodes)
output_type = node_types(size_fam);   %the type of output for the tree (0 is discrete, 1 is continuous)
size_attrs = size(candidate_attrs,2); %number of candidate attributes
size_t = size(T,2);                   %number of training cases in this tree node

%(1) computeFrequencyForEachClass(T)
if (output_type==0) %discrete output
  class_freqs = zeros(1,num_cat);
  for i=1:size_t
    case_id = T(i);
    case_class = fam_ev(size_fam,case_id); %get the class label for this case
    class_freqs(case_class)=class_freqs(case_class)+1;
  end
else  %cts output
  N = size(fam_ev,2);
  cases_T = fam_ev(size(fam_ev,1),T); %get the output value for cases T
  std_T = std(cases_T);
end

%(2) if OneClass (for discrete output) or all outputs equal (for cts output) or #cases < stop_cases,
%         return a leaf;
%    otherwise create a decision node N;

% get majority class in this node
if (output_type == 0)
  top1_class = 0;       %the class with the largest number of cases
  top1_class_cases = 0; %the number of cases in top1_class
  [top1_class_cases,top1_class]=max(class_freqs);
end
  
if (size_t==0)     %impossible
  new_node=-1;
  fprintf('Fatal error: please contact the author. \n');
  return;
end

% stop splitting if needed:
  %for discrete output: only one class is present
  %for cts output: all output values in the cases are the same
  %too few cases
if ( (output_type==0 && top1_class_cases == size_t) || (output_type==1 && std_T == 0) || (size_t < stop_cases))
  %create one new leaf node
  tree.num_node=tree.num_node+1;
  tree.nodes(tree.num_node).used=1; %flag this node as used (0 means unused; it will be removed from the tree at the end to save memory)
  tree.nodes(tree.num_node).is_leaf=1;
  tree.nodes(tree.num_node).children=[];
  tree.nodes(tree.num_node).split_id=0;  %the attribute(parent) id to split this tree node
  tree.nodes(tree.num_node).split_threshhold=0;  
  if (output_type==0)
    tree.nodes(tree.num_node).probs=class_freqs/size_t; %the prob for each value of class node 

    %  tree.nodes(tree.num_node).probs=zeros(1,num_cat); %the prob for each value of class node 
    %  tree.nodes(tree.num_node).probs(top1_class)=1; %use the majority class of parent node, like for binary class, 
                                                   %and majority is class 2, then the CPT is [0 1]
                                                   %we may need to use prior to do smoothing, to get [0.001 0.999]
    tree.nodes(tree.num_node).error.self_error=1-top1_class_cases/size_t; %the classification error in this tree node when the default class is used
    tree.nodes(tree.num_node).error.all_error=1-top1_class_cases/size_t;  %the total classification error in this tree node and its subtree
    tree.nodes(tree.num_node).error.all_error_num=size_t - top1_class_cases;
    fprintf('Create leaf node(onecla) %d. Class %d Cases %d Error %d \n',tree.num_node, top1_class, size_t, size_t - top1_class_cases );
  else
    avg_y_T = mean(cases_T);
    tree.nodes(tree.num_node).mean = avg_y_T; 
    tree.nodes(tree.num_node).std = std_T;
    fprintf('Create leaf node(samevalue) %d. Mean %8.4f Std %8.4f Cases %d \n',tree.num_node, avg_y_T, std_T, size_t);
  end  
  new_node = tree.num_node;
  return;
end
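% For instance, in the leaf just created above (hypothetical counts): with size_t = 10
% cases and class_freqs = [7 3], the leaf stores probs = [0.7 0.3], and self_error =
% all_error = 1 - 7/10 = 0.3 with all_error_num = 3.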
    
%create one new node
tree.num_node=tree.num_node+1;
tree.nodes(tree.num_node).used=1; %flag this node as used (0 means unused; it will be removed from the tree at the end to save memory)
tree.nodes(tree.num_node).is_leaf=1;
tree.nodes(tree.num_node).children=[];
tree.nodes(tree.num_node).split_id=0;
tree.nodes(tree.num_node).split_threshhold=0;  
if (output_type==0)
  tree.nodes(tree.num_node).error.self_error=1-top1_class_cases/size_t; 
  tree.nodes(tree.num_node).error.all_error=0;
  tree.nodes(tree.num_node).error.all_error_num=0;
else
  avg_y_T = mean(cases_T);
  tree.nodes(tree.num_node).mean = avg_y_T; 
  tree.nodes(tree.num_node).std = std_T;
end
new_node = tree.num_node;

%Stop splitting if no attributes left in this node
if (size_attrs==0) 
  if (output_type==0)
    tree.nodes(tree.num_node).probs=class_freqs/size_t; %the prob for each value of class node 
    tree.nodes(tree.num_node).error.all_error=1-top1_class_cases/size_t;  
    tree.nodes(tree.num_node).error.all_error_num=size_t - top1_class_cases;
    fprintf('Create leaf node(noattr) %d. Class %d Cases %d Error %d \n',tree.num_node, top1_class, size_t, size_t - top1_class_cases );
  else
    fprintf('Create leaf node(noattr) %d. Mean %8.4f Std %8.4f Cases %d \n',tree.num_node, avg_y_T, std_T, size_t);
  end
  return;
end
      
  
%(3) for each attribute A
%        ComputeGain(A);
max_gain=0;  %the max gain score (for discrete output: information gain or gain ratio; for cts output: R(T))
