package id3;
import java.util.*;
import shared.*;
import shared.Error;

/** Top-down decision-tree (TDDT) inducer induces decision trees
 * top-down by building smaller training sets and inducing trees
 * for them recursively. The decision tree built has categorizers
 * at each node, and these determine how to branch, i.e., to
 * which child to branch, or whether to classify.  The common
 * cases are: AttrCategorizers, which simply return a given value
 * for an attribute in the node, and ThresholdCategorizers, which
 * return 0 or 1 based on whether an attribute is less than or
 * greater than a given threshold (valid only for real attributes).
 * The leaves are usually constant categorizers, i.e., they just
 * return a constant value independent of the instance.			<P>
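 * As a rough illustration (hypothetical names, not the exact interface
 * of this package), classifying an instance with such a tree amounts to:	<PRE>
 *    while (!node.isLeaf())
 *       node = node.child(node.categorizer().categorize(instance));
 *    return node.categorizer().categorize(instance); // leaf: constant value
 * </PRE>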
 * The induction algorithm calls best_split, a pure virtual
 * function, to determine the best root split.  Once the split has
 * been chosen, the data in the node is split according to the
 * categorizer best_split returns.  A node is formed, and the
 * algorithm is called recursively with each of the children.
 * Once each child returns with a subtree, we connect them to the
 * root we split. ID3Inducer, for example, implements the
 * best_split using information gain, but other methods are
 * possible. best_split() can return any categorizer, thus opening
 * the possibility for oblique trees with perceptrons at nodes,
 * recursive trees, etc.  The leaves can also be of any
 * classifier, thus perceptron-trees (Utgoff) can be created,
 * or a nearest-neighbor within a leaf, etc.					<P>
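 * In outline, the recursion just described is (pseudocode with
 * hypothetical helper names; the real work is done by this class and
 * the best_split of its subclasses):					<PRE>
 *    induce(trainSet):
 *       if the stopping criterion holds for trainSet:
 *          return a leaf categorizer built from trainSet
 *       split = best_split(trainSet)         // supplied by the subclass
 *       root  = node labeled with split
 *       for each subset produced by splitting trainSet with split:
 *          connect root to induce(subset)    // recurse on each child
 *       return root
 * </PRE>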
 * Complexity   :									<P>
 * The complexity of train() is proportional to the number of
 * nodes in the resulting tree times the time for deciding on
 * the split() categorizer (done by the derived classes).
 * predict() takes time proportional to the sum of the
 * categorizers time over the path from the root to a leaf node.	<P>
 * Enhancements :									<P>
 * We may speed things up by having an option to test only
 * splits where the class label changes.  For some measures
 * (e.g., entropy), it can be shown that a split will never be
 * made between two instances with the same class label
 * (Fayyad IJCAI 93 page 1022, Machine Learning journal Vol 8,
 * no 1, page 87, 1992). We may wish to discretize the real values
 * first. By making them linear discrete, we can use the regular
 * counters and things will be faster (note however that the split
 * will usually remain special since it's a binary threshold split,
 * not a multi-way split).								<P>
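 * A sketch of that enhancement (hypothetical names; not code in this
 * package): after sorting the instances on a real attribute, candidate
 * thresholds are generated only where the class label changes:		<PRE>
 *    sort instances by attribute value
 *    for each pair of consecutive instances (a, b):
 *       if label(a) != label(b)                    // class boundary
 *          candidates.add((value(a) + value(b)) / 2)
 * </PRE>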
 * Another problem is with attributes that have many values, for
 * example social-security-number.  Computing all cut points can
 * be very expensive.  We may want to skip such attributes by
 * claiming that each value must have at least some number of
 * instances.  Utgoff in ML94 (page 322) mentions that ID slows
 * his system down considerably.  The problem of course is that if
 * you threshold, it sometimes makes sense to split on such
 * attributes.  Taken to an extreme, if we had a real "real-value,"
 * all values would be different with probability 1, and hence we
 * would skip such an attribute.							<P>
 * To speed things up, we may want to have an Inducer that
 * accepts a decision tree and builds stuff in it (vs. getting
 * a graph). Other options allow for doing the recursion by
 * calling a function instead of creating the actual class.
 * The advantage of the current method is that it allows a
 * subclass to keep track of the number of levels (useful for
 * lookahead or something). Yet another option is to "recycle"
 * inducers by using our "this" and just changing the training set.	<P>
 * We currently split instances but keep the original structure,
 * that is, we don't actually delete the attribute tested on. It
 * may be faster in some cases to actually create a new List
 * without the attribute.  The disadvantage is that for multi-valued
 * attributes we may wish to branch again, so we can't always delete.
 * The same goes for tests which are not attributes (e.g.,
 * conjunctions).
 * @author James Louis 12/07/2000 Ported to Java.
 * @author Steve Gustafson 12/07/2000 Ported to Java.
 * @author Chia-Hsin Li 1/03/95 Added Options.
 * @author Ronny Kohavi 9/06/93 Initial revision (.h,.c)
 */
abstract public class TDDTInducer extends Inducer
{
   //ENUMS
    /** Pruning method value.
     */    
      public static final byte none = 0;           /*  PruningMethod enum */
      /** Pruning method value.
       */      
      public static final byte confidence = 1;     /*                     */ 
      /** Pruning method value.
       */      
      public static final byte penalty = 2;        /*                     */ 
      /** Pruning method value.
       */      
      public static final byte linear  = 3;        /*                     */ 
      /** Pruning method value.
       */      
      public static final byte KLdistance = 4;     /*                     */ 
      /** Pruning method value.
       */      
      public static final byte lossConfidence = 5; /*                     */ 
      /** Pruning method value.
       */      
      public static final byte lossLaplace = 6;    /*                     */ 
      
      /** LeafDistType value.
       */      
      public static final byte allOrNothing = 0;      /* LeafDistType enum */ 
      /** LeafDistType value.
       */      
      public static final byte frequencyCounts = 1;   /*                   */ 
      /** LeafDistType value.
       */      
      public static final byte laplaceCorrection = 2; /*                   */ 
      /** LeafDistType value.
       */      
      public static final byte evidenceProjection = 3;/*                   */ 

      /** Evaluation metric value.
       */      
      public static final byte error = 0;    /* EvalMetric enum */ 
      /** Evaluation metric value.
       */      
      public static final byte MSE = 1;      /*                 */ 
      /** Evaluation metric value.
       */      
      public static final byte logLoss = 2;  /*                 */ 
   //END ENUMS


   private static int MIN_INSTLIST_DRIBBLE = 5000;
   private static String MAX_LEVEL_HELP = "The maximum number of levels to grow.  0 "
   +"implies no limit.";
   private static int DEFAULT_MAX_LEVEL = 0;

   private static String LB_MSW_HELP = "This option specifies the value of lower bound "
  +"of the weight while calculating the minimum split "
  +"(overrides weight option).  Set to 0 to have the value determined "
  +"automatically depending on the total weight of the training set.";
   private static double DEFAULT_LB_MSW = 1;

   private static String UB_MSW_HELP = "This option specifies the value of upper bound "
  +"of the weight while calculating the minimum split (overrides lower bound).";
    private static double DEFAULT_UB_MSW = 25;

   private static String MS_WP_HELP = "This option chooses the value of "
  +"the weight percent while calculating the minimum split.";
    private static double DEFAULT_MS_WP = 0;

   private static String NOM_LBO_HELP = "This option specifies if only the lower bound "
  +"will be used for calculating the minimum split for nominal-valued "
  +"attributes.";
    private static boolean DEFAULT_NOM_LBO = false;
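
   // Illustrative sketch only (not part of the original inducer): one plausible
   // way the bounds and weight-percent options above could combine into a
   // minimum split weight, i.e. a percentage of the total training weight
   // clamped to [lowerBound, upperBound].  The exact rule used elsewhere in
   // this package may differ.
   private static double sketchMinSplitWeight(double totalWeight,
                                              double weightPercent,
                                              double lowerBound,
                                              double upperBound)
   {
      double w = totalWeight * weightPercent / 100.0; // percent of total weight
      return Math.max(lowerBound, Math.min(upperBound, w));
   }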

   private static String DEBUG_HELP = "This option specifies whether to display the "
   +"debug information while displaying the graph.";
   private static boolean DEFAULT_DEBUG = false;

   /** Indicates whether edges representing unknown values should be processed.
       TRUE indicates unknown edges should be allowed, FALSE otherwise. **/
   private static String UNKNOWN_EDGES_HELP = "This option specifies whether or not to "
   +"allow outgoing UNKNOWN edges from each node.";
   private static boolean DEFAULT_UNKNOWN_EDGES = true;

   // This option is currently not enabled.
   //private static String EMPTY_NODE_PARENT_DIST_HELP = "Should empty nodes get the "
   //+"distribution of the parent or zeros";
   private static boolean DEFAULT_EMPTY_NODE_PARENT_DIST = false;

   private static String PARENT_TIE_BREAKING_HELP = "Should ties be broken in favor of "
   +"the majority category of the parent";
   private static boolean DEFAULT_PARENT_TIE_BREAKING = true;

   // The following option is currently not enabled.
   //const MString PRUNING_BRANCH_REPLACEMENT_HELP = "Should replacing a node "
   //"with its largest subtree be allowed during pruning";
   private static boolean DEFAULT_PRUNING_BRANCH_REPLACEMENT = false;

   private static String ADJUST_THRESHOLDS_HELP = "Should threshold values be adjusted "
   +"to the values of instances";
   private static boolean DEFAULT_ADJUST_THRESHOLDS = false;

   private static String PRUNING_METHOD_HELP = "Which algorithm should be used for "
   +"pruning.  (If not NONE and PRUNING_FACTOR is 0, a node will be made a leaf "
   +"if its potential children would not improve the error count)";
   private static byte DEFAULT_PRUNING_METHOD = confidence;

   private static String PRUNING_FACTOR_HELP = "Pruning factor in standard deviations "
   +"(high value -> more pruning); zero is no pruning, 2.5 is heavy pruning.";
   private static double DEFAULT_PRUNING_FACTOR = 0.0; //change this to .6925
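
   // Illustrative sketch only (not the pruning code of this inducer): a common
   // reading of a "standard deviations" pruning factor is to inflate a node's
   // observed error count by that many standard deviations of a binomial
   // estimate; the node is pruned to a leaf when its inflated error is no
   // worse than that of its subtree.
   private static double sketchPessimisticErrorCount(double errorCount,
                                                     double totalWeight,
                                                     double stdFactor)
   {
      double p = errorCount / totalWeight;                  // observed error rate
      double std = Math.sqrt(p * (1.0 - p) * totalWeight);  // std. dev. of the count
      return errorCount + stdFactor * std;                  // inflated error count
   }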

   
   
   private static String CONT_MDL_ADJUST_HELP = "When TRUE, mutual information for "
   +"real attributes is lowered based on MDL";
   private static boolean DEFAULT_CONT_MDL_ADJUST = false;

   private static String SMOOTH_INST_HELP = "Set the number of values on each side "
   +"of a given entropy value to use for smoothing (0 turns off smoothing).";
   private static int DEFAULT_SMOOTH_INST = 0;

   private static String SMOOTH_FACTOR_HELP = "Set the constant for the exponential "
   +"distribution used to smooth.";
   private static double DEFAULT_SMOOTH_FACTOR = 0.75;
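
   // Illustrative sketch only: exponentially weighted smoothing of entropy (or
   // other split-score) values, using "smoothInst" neighbors on each side and
   // weights that decay by smoothFactor per step, in the spirit of the two
   // options above.  The actual smoothing in the split-score code may differ.
   private static double sketchSmoothedValue(double[] values, int center,
                                             int smoothInst, double smoothFactor)
   {
      double weighted = 0.0, weightSum = 0.0;
      int first = Math.max(0, center - smoothInst);
      int last  = Math.min(values.length - 1, center + smoothInst);
      for (int i = first; i <= last; i++)
      {
         double w = Math.pow(smoothFactor, Math.abs(i - center)); // decays with distance
         weighted += w * values[i];
         weightSum += w;
      }
      return weighted / weightSum;
   }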

   private static String LEAF_DIST_TYPE_HELP = "This option selects the type of "
   +"distribution to create at leaf nodes.  All-or-nothing picks the majority "
   +"category and places all weight there.  This is the default.  "
   +"Frequency-counts computes the distribution as normalized counts of the "
   +"occurrence of each class at this leaf.  Laplace-correction uses the "
   +"frequency counts but applies a laplace correction of M_ESTIMATE_FACTOR.  "
   +"Evidence-projection uses the evidence projection algorithm to correct "
   +"the frequency counts.  EVIDENCE_FACTOR is the evidence scaling factor "
   +"for the correction.";
   private static byte defaultLeafDistType = allOrNothing;
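
   // Illustrative sketch only: the "frequency-counts" and "laplace-correction"
   // leaf distributions described in the help text above.  With correction m,
   // class i receives (count[i] + m) / (total + m * numClasses); m = 0 gives
   // plain normalized frequency counts.
   private static double[] sketchLeafDistribution(double[] classCounts, double m)
   {
      double total = 0.0;
      for (int i = 0; i < classCounts.length; i++)
         total += classCounts[i];
      double[] dist = new double[classCounts.length];
      for (int i = 0; i < classCounts.length; i++)
         dist[i] = (classCounts[i] + m) / (total + m * classCounts.length);
      return dist;
   }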

   private static String M_ESTIMATE_FACTOR_HELP = "This option determines the laplace "
   +"correction factor applied to the frequency counts when LEAF_DIST_TYPE is "
   +"laplace-correction";
   private static double DEFAULT_LEAF_M_ESTIMATE_FACTOR = 0.0;

   private static String EVIDENCE_FACTOR_HELP = "This option determines the factor "
   +"by which to scale the log number of instances when computing an "
   +"evidence projection";
   private static double DEFAULT_LEAF_EVIDENCE_FACTOR = 1.0;

   private static String EVAL_METRIC_HELP = "The measure by which the induced tree will "
   +"be evaluated. Changing this may affect induction and pruning.";
   private static byte DEFAULT_EVALUATION_METRIC = error;
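
   // Illustrative sketch only: the log-loss metric named above for a single
   // instance, i.e. -log2 of the probability the tree assigned to the true
   // class (error and MSE are the usual 0/1 loss and squared error).
   private static double sketchLogLoss(double[] predictedDist, int trueClass)
   {
      return -Math.log(predictedDist[trueClass]) / Math.log(2.0); // in bits
   }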

   private static int totalNodesNum = 0;

   private static int callCount = 0;

   private static int totalAttr = 0;


   private int level; 
   private CGraph cgraph; 
   private DTCategorizer decisionTreeCat;
   private double totalInstWeight;
   private boolean haveContinuousAttributes, errorprune = false;
   private TDDTOptions tddtOptions;

   /** Constructor.
    * @param dscr	The description of the inducer.
    */
   public TDDTInducer(String dscr)
   { 
      super(dscr); 

      tddtOptions = new TDDTOptions();

      level = 0;
      cgraph = null;
      decisionTreeCat = null;
      totalInstWeight = -1; // illegal value
      
      // this is arbitrary - no schema yet
      haveContinuousAttributes = false;

      tddtOptions.maxLevel = DEFAULT_MAX_LEVEL;
      tddtOptions.lowerBoundMinSplitWeight = DEFAULT_LB_MSW;
      tddtOptions.upperBoundMinSplitWeight = DEFAULT_UB_MSW;
      tddtOptions.minSplitWeightPercent = DEFAULT_MS_WP;
      tddtOptions.nominalLBoundOnly = DEFAULT_NOM_LBO;
      tddtOptions.debug = DEFAULT_DEBUG;
      tddtOptions.unknownEdges = DEFAULT_UNKNOWN_EDGES;
      tddtOptions.splitScoreCriterion = SplitScore.defaultSplitScoreCriterion;
      tddtOptions.emptyNodeParentDist = DEFAULT_EMPTY_NODE_PARENT_DIST;
      tddtOptions.parentTieBreaking = DEFAULT_PARENT_TIE_BREAKING;
      tddtOptions.pruningMethod = DEFAULT_PRUNING_METHOD;
      tddtOptions.pruningBranchReplacement = DEFAULT_PRUNING_BRANCH_REPLACEMENT;
      tddtOptions.adjustThresholds = DEFAULT_ADJUST_THRESHOLDS;
      tddtOptions.pruningFactor = DEFAULT_PRUNING_FACTOR;
      tddtOptions.contMDLAdjust = DEFAULT_CONT_MDL_ADJUST;
      tddtOptions.smoothInst = DEFAULT_SMOOTH_INST;
      tddtOptions.smoothFactor = DEFAULT_SMOOTH_FACTOR;
      tddtOptions.leafDistType = defaultLeafDistType;
      tddtOptions.MEstimateFactor = DEFAULT_LEAF_M_ESTIMATE_FACTOR;
      tddtOptions.evidenceFactor = DEFAULT_LEAF_EVIDENCE_FACTOR;
      tddtOptions.evaluationMetric = DEFAULT_EVALUATION_METRIC;

   }

   /** Constructor.
    * @param descr The description of this inducer.
    * @param aCgraph The CGraph that will hold the decision tree.
    */
   public TDDTInducer(String descr, CGraph aCgraph)
   {
      super(descr);
      level = 0;

      tddtOptions = new TDDTOptions();