naivebayescat.java
Naive Bayes algorithm — Java code
Language: Java
Page 1 of 3
package nb;

import shared.AttrInfo;
import shared.AugCategory;
import shared.BagCounters;
import shared.CatDist;
import shared.Categorizer;
import shared.DisplayPref;
import shared.Entropy;
import shared.Error;
import shared.Globals;
import shared.Instance;
import shared.InstanceList;
import shared.MLJ;
import shared.NominalAttrInfo;
import shared.Schema;
import shared.StatData;
import java.io.BufferedWriter;
import java.io.IOException;

/** This categorizer returns the category (label) that had the
 * greatest relative probability of being correct, assuming
 * independence of attributes. The relative probability of a label
 * is calculated by multiplying the relative probability for
 * each attribute. The calculation of relative probability for a
 * label on a single attribute depends on whether the attribute
 * is discrete or continuous.
 *
 * By Bayes' Theorem, P(L=l | X1=x1, X2=x2, ... Xn=xn)
 * = P(X1=x1, X2=x2, ... Xn=xn | L=l)*P(L=l)/P(X)
 * where P(X) is P(X1=x1, ..., Xn=xn).
 * Since P(X) is a constant independent of the classes, we
 * can ignore it.
 * The Naive Bayesian approach assumes complete independence
 * of the attributes GIVEN the label, thus
 * P(X1=x1, X2=x2, ... Xn=xn | L=l) =
 * P(X1=x1|L=l)*P(X2=x2|L=l)*...*P(Xn=xn|L=l),
 * and P(X1=x1|L=l) = P(X1=x1 ^ L=l)/P(L=l), where this
 * quantity is approximated from the data.
 * When the computed probabilities for two labels have the same
 * value, we break the tie in favor of the most prevalent label.
 *
 * If the instance being categorized has the first attribute = 1,
 * and in the training set label A occurred 20 times, 10 of
 * which had value 1 for the first attribute, then the
 * relative probability is 10/20 = 0.5.
 *
 * For continuous (real) attributes, the relative probability
 * is based on the Normal distribution of the values of the
 * attribute on training instances with the label. The actual
 * calculation is done with the Normal density; constants,
 * which do not affect the relative probability between labels,
 * are ignored. For example, say 3 training instances have
 * label 1 and these instances have the following values for a
 * continuous attribute: 35, 50, 65. The program would use the
 * mean and variance of this "sample" along with the attribute
 * value of the instance that is being categorized in the
 * Normal density equation. The evaluation of the Normal
 * density equation, without constant factors, provides the
 * relative probability.
 *
 * Unknown attributes are skipped over.
 *
 * Assumptions: This method calculates the probability of a label as the
 * product of the probabilities of each attribute.
 * This assumes that the attributes are independent,
 * a condition not likely to correspond to reality.
 * Thus the "Naive" of the title.
 * This method assumes that all continuous attributes have a
 * Normal distribution for each label value.
 *
 * Comments: For nominal attributes, if a label does not have
 * any occurrences for a given attribute value
 * of the test instance, a probability of
 * noMatchesFactor * ( 1 / # instances in training set )
 * is used.
 *
 * For nominal attributes, if an attribute value does not
 * occur in the training set, the attribute is skipped
 * in the categorizer, since it does not serve to
 * differentiate the labels.
 *
 * The code can handle dealing with unknowns as a special
 * value by doing the is_unknown check only in the real attribute
 * case.
 *
 * Helper class NBNorm is a simple structure to hold the
 * parameters needed to calculate the Normal distribution
 * of each (Attribute, Label) pair. The NBNorms are stored in
 * an Array2 table "continNorm" which is indexed by attribute
 * number and label value.
 *
 * For continuous attributes the variance must not equal 0 since
 * it is in the denominator. If the variance is undefined for
 * a label value (e.g. if a label has only one instance
 * in the training set), NaiveBayesInd will declare the
 * variance to be defaultVariance, a static variable. In
 * cases where the variance is defined but equal to 0,
 * NaiveBayesInd will declare the variance to be epsilon,
 * a very small static variable.
 *
 * For continuous attributes, if a label does not occur in
 * the training set, a zero relative probability is
 * assigned. If a label occurs in the training set but only
 * has unknown values for the attribute, noMatchesFactor is
 * used as in the nominal attribute case above.
 *
 * Complexity: categorize() is O(ln) where l = the number of categories
 * and n = the number of attributes.
 *
 * @author James Plummer 5/15/2001 Ported to Java
 * @author Eric Bauer and Clay Kunz 5/24/1996 Added Laplace correction
 * @author Robert Allen 12/03/94 Initial revision
 */
public class NaiveBayesCat extends Categorizer {
  public static final String endl = "\n";
  // Member data (also see public data)
  private BagCounters nominCounts;		// hold data on nominal attributes
  private NBNorm[][] continNorm;		        // hold data on real attributes
  private double trainWeight;
  private int numAttributes;
  private boolean useLaplace;                     // turn on to activate Laplace correction
  private double mEstimateFactor;                // noise in Laplace correction
  private double[] attrImportance;               // importance values per attribute
  private boolean[] unkIsVal;               // per-attribute decisions: should unknowns be treated as special values?

  /** Ported from C++:
   *   enum UnknownIsValueEnum { unknownNo, unknownYes, unknownAuto };
   */
  public static final int unknownNo = 1;
  public static final int unknownYes = 2;
  public static final int unknownAuto = 3;
  private int unknownIsValue; // one of unknownNo, unknownYes, unknownAuto
  private double klThreshold;
  /** Fraction of a single occurrence to use in cases when a label
   * has no occurrences of a given nominal value in the training set:
   */
  private double noMatchesFactor;
  /** If true, Evidence Projection is used. */
  private boolean useEvidenceProjection;
  /** The scale factor to use with Evidence Projection. */
  private double evidenceFactor;
  /** Categorizer option defaults. */
  public static final double defaultMEstimateFactor = 1.0;
  public static final boolean defaultLaplaceCorrection = false;
  public static final int defaultUnknownIsValue = unknownNo;
  public static final double defaultKLThreshold = 0.1;
  public static final double defaultNoMatchesFactor = 0.0;
  public static final boolean defaultUseEvidenceProjection = false;
  public static final double defaultEvidenceFactor = 1.0;
  /** Value to use for the variance when the actual variance = 0: */
  public static final double epsilon = .01;
  /** Value to use for the variance when the actual variance is undefined
   * because there is only one occurrence.
   */
  public static final double defaultVariance = 1.0;
  /** Constructor.
   * @param dscr - the description of this Inducer.
   * @param instList - training data.
   */
  public NaiveBayesCat(String dscr, InstanceList instList) {
    super(instList.num_categories(), dscr, instList.get_schema());
    nominCounts = instList.counters();
    trainWeight = instList.total_weight();
    numAttributes = instList.num_attr();
    logOptions.LOG(3, "NBC . . numAttributes = "+numAttributes);
    useLaplace = defaultLaplaceCorrection;
    mEstimateFactor = defaultMEstimateFactor;
    unkIsVal = null;
    unknownIsValue = defaultUnknownIsValue;
    klThreshold = defaultKLThreshold;
    noMatchesFactor = defaultNoMatchesFactor;
    useEvidenceProjection = defaultUseEvidenceProjection;
    evidenceFactor = defaultEvidenceFactor;
    attrImportance = this.compute_importance(instList);
    continNorm = this.compute_contin_norm(instList);
  }

  /** Copy constructor.
   * @param source - the NaiveBayesCat to copy.
   */
  public NaiveBayesCat(NaiveBayesCat source) {
    super(source.num_categories(), source.description(), source.get_schema());
    nominCounts = new BagCounters(source.nominCounts);
    continNorm = source.copyContinNorm();
    attrImportance = source.copyAttrImportance();
    trainWeight = source.trainWeight;
    numAttributes = source.numAttributes;
    useLaplace = source.useLaplace;
    mEstimateFactor = source.mEstimateFactor;
    unkIsVal = null;
    unknownIsValue = source.unknownIsValue;
    klThreshold = source.klThreshold;
    noMatchesFactor = source.noMatchesFactor;
    useEvidenceProjection = source.useEvidenceProjection;
    evidenceFactor = source.evidenceFactor;
  }

  /** Categorizes a single instance based upon the training data.
   * @param instance - the instance to categorize.
   * @return the predicted category.
   */
  public AugCategory categorize(Instance instance) {
    CatDist cDist = score(instance);
    AugCategory cat = cDist.best_category();
    return cat;
  }

  /** Simple method to return an ID.
   * @return an int representing this Categorizer.
   * @deprecated CLASS_NB_CATEGORIZER has been deprecated
   */
  public int class_id() {return CLASS_NB_CATEGORIZER;}

  /** Returns a pointer to a deep copy of this NaiveBayesCat.
   * @return the copy of this Categorizer.
   */
  public Object clone() {
    if ( !(this instanceof NaiveBayesCat) ) {
      Error.fatalErr("NaiveBayesCat.clone: invoked for improper class");
    }
    return new NaiveBayesCat(this);
  }

  /** Compute the norms of the continuous attributes.
   * @param instList - the instances to calculate over.
   * @return the array[][] of NBNorms.
   */
  public static NBNorm[][] compute_contin_norm(InstanceList instList) {
    int contAttrCount = 0;
    int numCategories = instList.num_categories();
    Schema schema = instList.get_schema();
    int numAttributes = schema.num_attr();
   
    // start labels at -1 for unknown
    NBNorm[][] normDens = new NBNorm[numAttributes][numCategories + 1]; // no initial value
    for (int m=0; m<normDens.length;m++) {
      for (int n=0; n<normDens[m].length;n++) {
        normDens[m][n] = new NBNorm();
        normDens[m][n].set_mean_and_var(0,0);
      }
    }
      
    // loop through each attribute, and process all instances for each
    // continuous one
    for (int attrNum = 0; attrNum < numAttributes; attrNum++) {
      AttrInfo attrinfo = schema.attr_info(attrNum);
      if (attrinfo.can_cast_to_real()) {
        // this is a continuous attribute
        contAttrCount++;

        // read each occurrence in the list and feed the stats for this attribute
        StatData[] continStats = new StatData[numCategories + 1];
        for (int j=0; j<continStats.length;j++) {
          continStats[j]=new StatData();
        }
        // The C++ original iterated with: for (ILPix pix(instList); pix; ++pix)
        for (int i = 0; i < instList.num_instances(); i++) {
          Instance inst = new Instance((Instance)instList.instance_list().get(i));
          // Note: the label values stored on instances are one higher than the
          // actual nominal value, hence the conversion through the label's attr info.
          int labelVal = schema.label_info().cast_to_nominal().get_nominal_val(inst.get_label());
          MLJ.ASSERT(labelVal < numCategories, " NaiveBayesCat.compute_contin_norm()");

          // Ignore unknowns.
          if ( !attrinfo.is_unknown(inst.get_value(attrNum))) {
            double value = attrinfo.get_real_val(inst.get_value(attrNum));
            continStats[labelVal].insert( value );
          }
        }

        double mean;
        double var;
        // extract Normal density parameters into the normDens table
        for (int label = 0; label < numCategories; label++) {
          if (continStats[label].size() == 0 ) {
            mean = 0;
            var = defaultVariance;
          }
          else {
            mean = continStats[label].mean();
            if (continStats[label].size() == 1 )
              var = defaultVariance;
            else if ( (var = continStats[label].variance(0)) <= 0 )   // var == 0
              var = epsilon;
          }
          normDens[attrNum][label].set_mean_and_var(mean, var);

          //@@ pass in a log option?
          //LOG(3, " Continuous Attribute # " + attrNum +
          //    ", Label " + label + ": Mean = " + mean +
          //    ", Variance = " + var + endl);
        }
      } // end of handling this continuous attribute
    }    // end of loop through all attributes

    if (contAttrCount == 0) {  // no continuous attributes found
      normDens = null;
    }
    return normDens;
  }

  /** Computes importance values for each nominal attribute using
   * the mutual_info (entropy).
   * Static function; used as a helper by train() below.
   * @param instList - the instances to use.
   * @return the array[] of importance values.
   */
  public static double[] compute_importance(InstanceList instList) {
    double[] attrImp = new double[instList.num_attr()];
    for (int i = 0; i < attrImp.length; i++) {
      attrImp[i] = 0;
    }
   
    double ent = Entropy.entropy(instList);
    if (ent == Globals.UNDEFINED_REAL) {
      Error.fatalErr("compute_importance: undefined entropy");
    }
    if(ent < 0 && -ent < MLJ.realEpsilon) {
      ent = 0;
    }
    for (int i = 0; i < instList.num_attr(); i++) {
      if (instList.get_schema().attr_info(i).can_cast_to_real()) {
        attrImp[i] = 0;
      }
      else if (instList.get_schema().attr_info(i).can_cast_to_nominal()) {
        if (ent <= 0) {
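The per-attribute scoring rule described in the class comment at the top can be illustrated with a small standalone sketch. All names and numbers here are hypothetical, not part of the MLJ source: for a nominal attribute the per-label factor is count(attr=v ∧ label)/count(label), and for a continuous attribute it is the Normal density with constant factors dropped, exp(-(x-mean)^2/(2*var))/sqrt(var).

```java
// Illustrative sketch of the relative-probability factors described in
// the class comment. Hypothetical helper class, not part of MLJ.
public class NaiveBayesSketch {
    // Nominal attribute: P(X=x | L=l) estimated as
    // (# training instances with label l and value x) / (# with label l).
    static double nominalFactor(int countValueAndLabel, int countLabel) {
        return (double) countValueAndLabel / countLabel;
    }

    // Continuous attribute: Normal density without constant factors,
    // exp(-(x - mean)^2 / (2 * var)) / sqrt(var).
    static double normalFactor(double x, double mean, double var) {
        return Math.exp(-(x - mean) * (x - mean) / (2 * var)) / Math.sqrt(var);
    }

    static double mean(double[] xs) {
        double s = 0;
        for (double x : xs) s += x;
        return s / xs.length;
    }

    // Sample variance with divisor n-1 (assumed to match what
    // StatData.variance(0) computes in the listing above).
    static double variance(double[] xs) {
        double m = mean(xs), s = 0;
        for (double x : xs) s += (x - m) * (x - m);
        return s / (xs.length - 1);
    }

    public static void main(String[] args) {
        // Example from the class comment: label A occurred 20 times, 10 of
        // which had value 1 for the first attribute -> factor 10/20 = 0.5.
        System.out.println(nominalFactor(10, 20)); // prints 0.5

        // Example from the class comment: a continuous attribute with
        // training values 35, 50, 65 under one label.
        double[] sample = {35, 50, 65};
        double m = mean(sample);      // 50.0
        double v = variance(sample);  // 225.0
        // The factor peaks when the test value equals the sample mean.
        System.out.println(normalFactor(50, m, v));
    }
}
```

Multiplying one such factor per attribute (times the label's prior frequency) gives the relative score the categorizer maximizes; constants shared by all labels cancel, which is why they are dropped.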
