tddtinducer.java
ID3 algorithm implementation code (data warehouse mining and development)
Java (page 1 of 5)

package id3;
import java.util.*;
import shared.*;
import shared.Error;

/** Top-down decision-tree (TDDT) inducer induces decision trees
 * top-down by building smaller training sets and inducing trees
 * for them recursively. The decision tree built has categorizers
 * at each node, and these determine how to branch, i.e., to
 * which child to branch, or whether to classify.  The common
 * cases are: AttrCategorizers, which simply return a given value
 * for an attribute in the node, and ThresholdCategorizers, which
 * return 0 or 1 based on whether an attribute is less than or
 * greater than a given threshold (valid only for real attributes).
 * The leaves are usually constant categorizers, i.e., they just
 * return a constant value independent of the instance.			<P>
 * The induction algorithm calls best_split, a pure virtual
 * function, to determine the best root split.  Once the split has
 * been chosen, the data in the node is split according to the
 * categorizer best_split returns.  A node is formed, and the
 * algorithm is called recursively with each of the children.
 * Once each child returns with a subtree, we connect them to the
 * root we split. ID3Inducer, for example, implements the
 * best_split using information gain, but other methods are
 * possible. best_split() can return any categorizer, thus opening
 * the possibility for oblique trees with perceptrons at nodes,
 * recursive trees, etc.  The leaves can also be of any
 * classifier, thus perceptron-trees (Utgoff) can be created,
 * or a nearest-neighbor within a leaf, etc.					<P>
 * Complexity   :									<P>
 * The complexity of train() is proportional to the number of
 * nodes in the resulting tree times the time for deciding on
 * the split() categorizer (done by the derived classes).
 * predict() takes time proportional to the sum of the
 * categorizers time over the path from the root to a leaf node.	<P>
 * Enhancements :									<P>
 * We may speed things up by having an option to test only
 * splits where the class label changes.  For some measures
 * (e.g., entropy), it can be shown that a split will never be
 * made between two instances with the same class label
 * (Fayyad IJCAI 93 page 1022, Machine Learning journal Vol 8,
 * no 1, page 87, 1992). We may wish to discretize the real values
 * first. By making them linear discrete, we can use the regular
 * counters and things will be faster (note however that the split
 * will usually remain special since it's a binary threshold split,
 * not a multi-way split).								<P>
 * Another problem is with attributes that have many values, for
 * example social-security-number.  Computing all cut points can
 * be very expensive.  We may want to skip such attributes by
 * claiming that each value must have at least some number of
 * instances.  Utgoff in ML94 (page 322) mentions that ID slows
 * his system down considerably.  The problem of course is that if
 * you threshold, it sometimes makes sense to split on such
 * attributes.  Taken to an extreme, if we had a real "real-value,"
 * all values would be different with probability 1, and hence we
 * would skip such an attribute.							<P>
 * To speed things up, we may want to have an Inducer that
 * accepts a decision tree and builds stuff in it (vs. getting
 * a graph). Other options allow for doing the recursion by
 * calling a function instead of creating the actual class.
 * The advantage of the current method is that it allows a
 * subclass to keep track of the number of levels (useful for
 * lookahead or something). Yet another option is to "recycle"
 * inducers by using our "this" and just changing the training set.	<P>
 * We currently split instances but keep the original structure,
 * that is, we don't actually delete the attribute tested on. It
 * may be faster in some cases to actually create a new List
 * without the attribute.  The disadvantage is that for multi-valued
 * attributes we may wish to branch again, so we can't always delete.
 * The same goes for tests which are not attributes (e.g.,
 * conjunctions).
 * @author James Louis 12/07/2000 Ported to Java.
 * @author Steve Gustafson 12/07/2000 Ported to Java.
 * @author Chia-Hsin Li 1/03/95 Added Options.
 * @author Ronny Kohavi 9/06/93 Initial revision (.h,.c)
 */
abstract public class TDDTInducer extends Inducer
{
   //ENUMS
    /** Pruning method value.
     */    
      public static final byte none = 0;           /*  PruningMethod enum */
      /** Pruning method value.
       */      
      public static final byte confidence = 1;     /*                     */ 
      /** Pruning method value.
       */      
      public static final byte penalty = 2;        /*                     */ 
      /** Pruning method value.
       */      
      public static final byte linear  = 3;        /*                     */ 
      /** Pruning method value.
       */      
      public static final byte KLdistance = 4;     /*                     */ 
      /** Pruning method value.
       */      
      public static final byte lossConfidence = 5; /*                     */ 
      /** Pruning method value.
       */      
      public static final byte lossLaplace = 6;    /*                     */ 
      
      /** LeafDistType value.
       */      
      public static final byte allOrNothing = 0;      /* LeafDistType enum */ 
      /** LeafDistType value.
       */      
      public static final byte frequencyCounts = 1;   /*                   */ 
      /** LeafDistType value.
       */      
      public static final byte laplaceCorrection = 2; /*                   */ 
      /** LeafDistType value.
       */      
      public static final byte evidenceProjection = 3;/*                   */ 

      /** Evaluation metric value.
       */      
      public static final byte error = 0;    /* EvalMetric enum */ 
      /** Evaluation metric value.
       */      
      public static final byte MSE = 1;      /*                 */ 
      /** Evaluation metric value.
       */      
      public static final byte logLoss = 2;  /*                 */ 
   //END ENUMS


   private static int MIN_INSTLIST_DRIBBLE = 5000;
   private static String MAX_LEVEL_HELP = "The maximum number of levels to grow.  0 "
   +"implies no limit.";
   private static int DEFAULT_MAX_LEVEL = 0;

   private static String LB_MSW_HELP = "This option specifies the value of lower bound "
  +"of the weight while calculating the minimum split "
  +"(overrides weight option).  Set to 0 to have the value determined "
  +"automatically depending on the total weight of the training set.";
   private static double DEFAULT_LB_MSW = 1;

   private static String UB_MSW_HELP = "This option specifies the value of upper bound "
  +"of the weight while calculating the minimum split (overrides lower bound).";
    private static double DEFAULT_UB_MSW = 25;

   private static String MS_WP_HELP = "This option chooses the value of "
  +"the weight percent while calculating the minimum split.";
    private static double DEFAULT_MS_WP = 0;

   private static String NOM_LBO_HELP = "This option specifies if only the lower bound "
  +"will be used for calculating the minimum split for nominal-valued "
  +"attributes.";
    private static boolean DEFAULT_NOM_LBO = false;
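
   /* Illustrative sketch (not part of the original port): one plausible way the
    * three minimum-split options above could combine, consistent with their help
    * text: take a fraction of the total training-set weight and clamp it between
    * the lower and upper bounds.  The exact formula used by the inducer may differ. */
   private static double sketchMinSplitWeight(double totalWeight, double weightPercent,
                                              double lowerBound, double upperBound)
   {
      // weightPercent is treated as a fraction of the total weight (an assumption).
      double minSplit = weightPercent * totalWeight;
      // Apply the upper bound last so it overrides the lower bound, as UB_MSW_HELP states.
      return Math.min(upperBound, Math.max(lowerBound, minSplit));
   }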

   private static String DEBUG_HELP = "This option specifies whether to display the "
  +"debug information while displaying the graph.";
private static boolean DEFAULT_DEBUG = false;

/** Indicates whether edges representing unknown values should be processed. TRUE
      indicates unknown edges should be processed, FALSE otherwise. */
   private static String UNKNOWN_EDGES_HELP = "This option specifies whether or not to "
  +"allow outgoing UNKNOWN edges from each node. ";
    private static boolean DEFAULT_UNKNOWN_EDGES = true;

   // This option is currently not enabled.
//private static String EMPTY_NODE_PARENT_DIST_HELP = "Should empty nodes get the "
//+"distribution of the parent or zeros";
private static boolean DEFAULT_EMPTY_NODE_PARENT_DIST = false;

   private static String PARENT_TIE_BREAKING_HELP = "Should ties be broken in favor of "
+"the majority category of the parent";
private static boolean DEFAULT_PARENT_TIE_BREAKING = true;

// The following option is currently not enabled.
//const MString PRUNING_BRANCH_REPLACEMENT_HELP = "Should replacing a node "
//"with its largest subtree be allowed during pruning";
   private static boolean DEFAULT_PRUNING_BRANCH_REPLACEMENT = false;

private static String ADJUST_THRESHOLDS_HELP = "Should threshold values be adjusted "
+"to the values of instances";
   private static boolean DEFAULT_ADJUST_THRESHOLDS = false;

private static String PRUNING_METHOD_HELP = "Which algorithm should be used for "
+"pruning.  (If not NONE and PRUNING_FACTOR is 0, a node will be made a leaf "
+"if its potential children would not improve the error count)";
   private static byte DEFAULT_PRUNING_METHOD = confidence;

   private static String PRUNING_FACTOR_HELP = "Pruning factor in standard deviations. "
   +"(high value -> more pruning), zero is no pruning, 2.5 is heavy pruning";
   private static double DEFAULT_PRUNING_FACTOR = 0.0; //change this to .6925
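
   /* Illustrative sketch (not part of the original port): the kind of
    * standard-deviation based pessimistic error estimate that a "pruning factor
    * in standard deviations" suggests.  A subtree would be pruned to a leaf when
    * the leaf's pessimistic error count is no worse than the total over its
    * children; the statistic actually used by the pruning code may differ. */
   private static double sketchPessimisticErrors(double errorCount, double totalWeight,
                                                 double pruningFactor)
   {
      if (totalWeight <= 0)
         return 0;
      double errorRate = errorCount / totalWeight;
      // Inflate the observed errors by pruningFactor standard deviations of a
      // binomial error count; a factor of 0 leaves the raw count unchanged.
      return errorCount
         + pruningFactor * Math.sqrt(totalWeight * errorRate * (1 - errorRate));
   }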

   
   
private static String CONT_MDL_ADJUST_HELP = "When TRUE, mutual information for "
+"real attributes is lowered based on MDL";
  private static boolean DEFAULT_CONT_MDL_ADJUST = false;

private static String SMOOTH_INST_HELP = "Set the number of values on each side "
+"of a given entropy value to use for smoothing (0 turns off smoothing).";
   private static int DEFAULT_SMOOTH_INST = 0;

private static String SMOOTH_FACTOR_HELP = "Set the constant for the exponential "
+"distribution used to smooth.";
   private static double DEFAULT_SMOOTH_FACTOR = 0.75;

private static String LEAF_DIST_TYPE_HELP = "This option selects the type of "
+"distribution to create at leaf nodes.  All-or-nothing picks the majority "
+"category and places all weight there.  This is the default.  "
+"Frequency-counts compute the distribution as normalized counts of the "
+"occurance of each class at this leaf.  Laplace-correction uses the "
+"frequency counts but applies a laplace correction of M_ESTIMATE_FACTOR.  "
+"Evidence-projection uses the evidence projection algorithm to correct "
+"the frequency counts.  EVIDENCE_FACTOR is the evidence scaling factor "
+"for the correction.";
   private static byte defaultLeafDistType = allOrNothing;

private static String M_ESTIMATE_FACTOR_HELP = "This option determines the m-estimate "
+"factor used for the laplace correction applied at leaf nodes when "
+"LEAF_DIST_TYPE is laplace-correction";
  private static double DEFAULT_LEAF_M_ESTIMATE_FACTOR = 0.0;
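
   /* Illustrative sketch (not part of the original port): how the frequency-count
    * and laplace-correction leaf distributions described above might be computed,
    * with m playing the role of M_ESTIMATE_FACTOR.  With m == 0 this reduces to
    * plain normalized frequency counts; the exact correction used by the real
    * leaf categorizers may differ. */
   private static double[] sketchLaplaceLeafDist(double[] classCounts, double m)
   {
      double total = 0;
      for (int c = 0; c < classCounts.length; c++)
         total += classCounts[c];
      double[] dist = new double[classCounts.length];
      if (total + m * classCounts.length == 0)
         return dist;   // empty leaf and no correction: leave the distribution at zero
      for (int c = 0; c < classCounts.length; c++)
         // Add m "virtual" instances of every class before normalizing.
         dist[c] = (classCounts[c] + m) / (total + m * classCounts.length);
      return dist;
   }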

private static String EVIDENCE_FACTOR_HELP = "This option determines the factor "
+"by which to scale the log number of instances when computing an "
+"evidence projection";
   private static double DEFAULT_LEAF_EVIDENCE_FACTOR = 1.0;

private static String EVAL_METRIC_HELP = "The measure by which the induced tree will "
+"be evaluated. Changing this may affect induction and pruning.";
   private static byte DEFAULT_EVALUATION_METRIC = error;

   private static int totalNodesNum = 0;

   private static int callCount = 0;

   private static int totalAttr = 0;


   private int level; 
   private CGraph cgraph; 
   private DTCategorizer decisionTreeCat;
   private double totalInstWeight;
   private boolean haveContinuousAttributes, errorprune = false;
   private TDDTOptions tddtOptions;

   /** Constructor.
    * @param dscr	The description of the inducer.
    */
   public TDDTInducer(String dscr)
   { 
      super(dscr); 

      tddtOptions = new TDDTOptions();

      CGraph aCgraph = null;
      level = 0;
      cgraph = aCgraph;
      decisionTreeCat = null;
      totalInstWeight = -1; //illegal value;
      
      // this is arbitrary - no schema yet
      haveContinuousAttributes = false;

      tddtOptions.maxLevel = DEFAULT_MAX_LEVEL;
      tddtOptions.lowerBoundMinSplitWeight = DEFAULT_LB_MSW;
      tddtOptions.upperBoundMinSplitWeight = DEFAULT_UB_MSW;
      tddtOptions.minSplitWeightPercent = DEFAULT_MS_WP;
      tddtOptions.nominalLBoundOnly = DEFAULT_NOM_LBO;
      tddtOptions.debug = DEFAULT_DEBUG;
      tddtOptions.unknownEdges = DEFAULT_UNKNOWN_EDGES;
      tddtOptions.splitScoreCriterion = SplitScore.defaultSplitScoreCriterion;
      tddtOptions.emptyNodeParentDist = DEFAULT_EMPTY_NODE_PARENT_DIST;
      tddtOptions.parentTieBreaking = DEFAULT_PARENT_TIE_BREAKING;
      tddtOptions.pruningMethod = DEFAULT_PRUNING_METHOD;
      tddtOptions.pruningBranchReplacement = DEFAULT_PRUNING_BRANCH_REPLACEMENT;
      tddtOptions.adjustThresholds = DEFAULT_ADJUST_THRESHOLDS;
      tddtOptions.pruningFactor = DEFAULT_PRUNING_FACTOR;
      tddtOptions.contMDLAdjust = DEFAULT_CONT_MDL_ADJUST;
      tddtOptions.smoothInst = DEFAULT_SMOOTH_INST;
      tddtOptions.smoothFactor = DEFAULT_SMOOTH_FACTOR;
      tddtOptions.leafDistType = defaultLeafDistType;
      tddtOptions.MEstimateFactor = DEFAULT_LEAF_M_ESTIMATE_FACTOR;
      tddtOptions.evidenceFactor = DEFAULT_LEAF_EVIDENCE_FACTOR;
      tddtOptions.evaluationMetric = DEFAULT_EVALUATION_METRIC;

   }
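
   /* Illustrative sketch (not part of the original port): a minimal, self-contained
    * version of the recursive top-down loop described in the class comment,
    * specialized to nominal attributes encoded as small ints and to an
    * information-gain split, which is what ID3Inducer supplies as best_split().
    * SketchNode and the int[][]/int[] data layout are assumptions made for this
    * sketch only; the real inducer builds CGraph nodes whose categorizers decide
    * the branching, and its stopping and splitting criteria are richer. */
   private static class SketchNode
   {
      int splitAttr = -1;     // attribute tested at this node; -1 marks a leaf
      int label;              // majority class, used when this node acts as a leaf
      SketchNode[] children;  // one child per value of splitAttr
   }

   /** Entropy (in bits) of a class-count vector whose counts sum to total. */
   private static double sketchEntropy(int[] counts, int total)
   {
      double e = 0.0;
      for (int c = 0; c < counts.length; c++)
         if (counts[c] > 0) {
            double p = (double) counts[c] / total;
            e -= p * Math.log(p) / Math.log(2);
         }
      return e;
   }

   private static SketchNode sketchInduce(int[][] data, int[] labels,
                                          int[] attrSizes, int numClasses,
                                          int parentMajority)
   {
      SketchNode node = new SketchNode();
      if (labels.length == 0) {          // empty subset: fall back to the parent's majority
         node.label = parentMajority;
         return node;
      }
      int[] classCounts = new int[numClasses];
      for (int i = 0; i < labels.length; i++)
         classCounts[labels[i]]++;
      int majority = 0;
      for (int c = 1; c < numClasses; c++)
         if (classCounts[c] > classCounts[majority])
            majority = c;
      node.label = majority;

      // best_split(): pick the attribute whose split yields the highest information gain.
      double rootEntropy = sketchEntropy(classCounts, labels.length);
      int bestAttr = -1;
      double bestGain = 0.0;
      for (int a = 0; a < attrSizes.length; a++) {
         double splitEntropy = 0.0;
         for (int v = 0; v < attrSizes[a]; v++) {
            int[] counts = new int[numClasses];
            int n = 0;
            for (int i = 0; i < data.length; i++)
               if (data[i][a] == v) {
                  counts[labels[i]]++;
                  n++;
               }
            if (n > 0)
               splitEntropy += (double) n / labels.length * sketchEntropy(counts, n);
         }
         if (rootEntropy - splitEntropy > bestGain) {
            bestGain = rootEntropy - splitEntropy;
            bestAttr = a;
         }
      }
      if (bestAttr == -1)                // pure node or no informative split: make a leaf
         return node;

      // Split the training set on the chosen attribute, induce a subtree for each
      // subset recursively, and connect the returned subtrees to the new node.
      node.splitAttr = bestAttr;
      node.children = new SketchNode[attrSizes[bestAttr]];
      for (int v = 0; v < attrSizes[bestAttr]; v++) {
         int n = 0;
         for (int i = 0; i < data.length; i++)
            if (data[i][bestAttr] == v)
               n++;
         int[][] subData = new int[n][];
         int[] subLabels = new int[n];
         int k = 0;
         for (int i = 0; i < data.length; i++)
            if (data[i][bestAttr] == v) {
               subData[k] = data[i];
               subLabels[k] = labels[i];
               k++;
            }
         node.children[v] = sketchInduce(subData, subLabels, attrSizes, numClasses, majority);
      }
      return node;
   }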

   /** Constructor.
    * @param descr The description of this inducer.
    * @param aCgraph The CGraph that will hold the decision tree.
    */
   public TDDTInducer(String descr, CGraph aCgraph)
{
   super(descr);
   level = 0;

   tddtOptions = new TDDTOptions();
