

naivebayescat.java

Java code for the Naive Bayes algorithm
Language: JAVA
Page 1 of 3
package nb;

import shared.AttrInfo;
import shared.AugCategory;
import shared.BagCounters;
import shared.CatDist;
import shared.Categorizer;
import shared.DisplayPref;
import shared.Entropy;
import shared.Error;
import shared.Globals;
import shared.Instance;
import shared.InstanceList;
import shared.MLJ;
import shared.NominalAttrInfo;
import shared.Schema;
import shared.StatData;
import java.io.BufferedWriter;
import java.io.IOException;

/** This categorizer returns the category (label) that had the
 * greatest relative probability of being correct, assuming
 * independence of attributes.  The relative probability of a label
 * is calculated by multiplying the relative probability for
 * each attribute.  The calculation of relative probability for a
 * label on a single attribute depends on whether the attribute
 * is discrete or continuous.
 *
 * By Bayes' Theorem, P(L=l | X1=x1, X2=x2, ... Xn=xn)
 * = P(X1=x1, X2=x2, ... Xn=xn | L=l)*P(L=l)/P(X)
 * where P(X) is P(X1=x1, ..., Xn=xn).
 * Since P(X) is a constant independent of the classes, we
 * can ignore it.
 * The Naive Bayesian approach assumes complete independence
 * of the attributes GIVEN the label, thus
 * P(X1=x1, X2=x2, ... Xn=xn | L=l) =
 * P(X1=x1|L=l)*P(X2=x2|L=l)*... P(Xn=xn|L=l)
 * and P(X1=x1|L=l) = P(X1=x1 ^ L=l)/P(L=l), where this
 * quantity is approximated from the data.
 * When the computed probabilities for two labels have the same
 * value, we break the tie in favor of the most prevalent label.
 *
 * If the instance being categorized has the first attribute = 1,
 * and in the training set label A occurred 20 times, 10 of
 * which had value 1 for the first attribute, then the
 * relative probability is 10/20 = 0.5.
 *
 * For continuous (real) attributes, the relative probability
 * is based on the Normal distribution of the values of the
 * attribute on training instances with the label.  The actual
 * calculation is done with the Normal density; constants,
 * which do not affect the relative probability between labels,
 * are ignored.  For example, say 3 training instances have
 * label 1 and these instances have the following values for a
 * continuous attribute: 35, 50, 65.  The program would use the
 * mean and variance of this "sample", along with the attribute
 * value of the instance being categorized, in the
 * Normal density equation.  The evaluation of the Normal
 * density equation, without constant factors, provides the
 * relative probability.
 *
 * Unknown attributes are skipped over.
 *
 * Assumptions : This method calculates the probability of a label as the
 * product of the probabilities of each attribute.
 * This assumes that the attributes are independent,
 * a condition not likely to correspond to reality.
 * Hence the "Naive" of the title.
 * This method assumes that all continuous attributes have a
 * Normal distribution for each label value.
 *
 * Comments : For nominal attributes, if a label does not have
 * any occurrences for a given attribute value
 * of the test instance, a probability of
 * noMatchesFactor * ( 1 / # instances in training set )
 * is used.
 *
 * For nominal attributes, if an attribute value does not
 * occur in the training set, the attribute is skipped
 * in the categorizer, since it does not serve to
 * differentiate the labels.
 *
 * The code can handle dealing with unknowns as a special
 * value by doing the is_unknown check only in the real attribute
 * case.
 *
 * Helper class NBNorm is a simple structure to hold the
 * parameters needed to calculate the Normal distribution
 * of each (attribute, label) pair.  The NBNorms are stored in
 * an Array2 table "continNorm" which is indexed by attribute
 * number and label value.
 *
 * For continuous attributes the variance must not equal 0, since
 * it is in the denominator.  If the variance is undefined for
 * a label value (e.g. if a label has only one instance
 * in the training set), NaiveBayesInd will declare the
 * variance to be defaultVariance, a static variable.  In
 * cases where the variance is defined but equal to 0,
 * NaiveBayesInd will declare the variance to be epsilon,
 * a very small static variable.
 *
 * For continuous attributes, if a label does not occur in
 * the training set, a zero relative probability is
 * assigned.  If a label occurs in the training set but only
 * has unknown values for the attribute, noMatchesFactor is
 * used as in the nominal attribute case above.
 *
 * Complexity : categorize() is O(ln) where l = the number of categories
 * and n = the number of attributes.
 *
 * @author James Plummer 5/15/2001 Ported to Java
 * @author Eric Bauer and Clay Kunz 5/24/1996 Added Laplace correction
 * @author Robert Allen 12/03/94 Initial revision
 */
public class NaiveBayesCat extends Categorizer {
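  // Worked example of the product rule described above (illustrative, not
  // from the original source).  With two nominal attributes and candidate
  // label l, the score is P(L=l) * P(X1=x1|L=l) * P(X2=x2|L=l).
  // If label A occurred 20 times in training, 10 of them with X1=1 and 5 of
  // them with X2=0, then for a test instance with X1=1 and X2=0:
  //   score(A) = (20/N) * (10/20) * (5/20) = (20/N) * 0.5 * 0.25
  // where N is the total training weight.  The label with the largest score
  // wins; ties go to the more prevalent label.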
  public static final String endl = "\n";
  // Member data (also see public data)
  private BagCounters nominCounts;      // holds data on nominal attributes
  private NBNorm[][] continNorm;        // holds data on real attributes
  private double trainWeight;
  private int numAttributes;
  private boolean useLaplace;           // turn on to activate Laplace correction
  private double mEstimateFactor;       // noise in Laplace correction
  private double[] attrImportance;      // importance values per attribute
  private boolean[] unkIsVal;           // per-attribute decisions: should unknowns be treated as special values?
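  // Background sketch (not from the original source): the classical Laplace
  // correction estimates P(X=x | L=l) as
  //   (count(x,l) + 1) / (count(l) + k)
  // where k is the number of values attribute X can take; it keeps a value
  // never seen with a label from forcing the whole product to zero.
  // mEstimateFactor generalizes the amount of phantom mass added.  The exact
  // expression this port uses is presumably in score(), which is not shown
  // on this page of the listing.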

  /** Ported from C++:
   *   enum UnknownIsValueEnum { unknownNo, unknownYes, unknownAuto }; // C++ equivalent
   */
  public static final int unknownNo = 1;
  public static final int unknownYes = 2;
  public static final int unknownAuto = 3;
  private int unknownIsValue; // one of unknownNo, unknownYes, unknownAuto
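  // Interpretation inferred from the names (the handling logic is on a later
  // page): never treat unknown as a value (unknownNo), always treat it as an
  // extra nominal value (unknownYes), or decide per attribute (unknownAuto),
  // with klThreshold below apparently governing the automatic decision.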
  private double klThreshold;
  /** Fraction of a single occurrence to use in cases when a label
   * has no occurrences of a given nominal value in the training set:
   */
  private double noMatchesFactor;

  /** If true, evidence projection is used. */
  private boolean useEvidenceProjection;

  /** The scale factor to use with evidence projection. */
  private double evidenceFactor;

  /** Categorizer option defaults. */
  public static final double defaultMEstimateFactor = 1.0;
  public static final boolean defaultLaplaceCorrection = false;
  public static final int defaultUnknownIsValue = unknownNo;
  public static final double defaultKLThreshold = 0.1;
  public static final double defaultNoMatchesFactor = 0.0;
  public static final boolean defaultUseEvidenceProjection = false;
  public static final double defaultEvidenceFactor = 1.0;
  /** Value to use for variance when the actual variance = 0: */
  public static final double epsilon = .01;

  /** Value to use for variance when the actual variance is undefined because
   * there is only one occurrence.
   */
  public static final double defaultVariance = 1.0;
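  /** Illustrative helper, NOT part of the original port: the unnormalized
   * Normal density the class comment describes.  The constant factor
   * 1/sqrt(2*PI) is identical for every label and is dropped; 1/sqrt(var)
   * must be kept because the variance differs per label.
   */
  private static double relativeNormalDensity(double x, double mean, double var) {
    double diff = x - mean;
    return Math.exp(-(diff * diff) / (2.0 * var)) / Math.sqrt(var);
  }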
  /** Constructor.
   * @param dscr - the description of this Inducer.
   * @param instList - training data.
   */
  public NaiveBayesCat(String dscr, InstanceList instList) {
    super(instList.num_categories(), dscr, instList.get_schema());
    nominCounts = instList.counters();
    trainWeight = instList.total_weight();
    numAttributes = instList.num_attr();
    logOptions.LOG(3, "NBC . . numAttributes = "+numAttributes);
    useLaplace = defaultLaplaceCorrection;
    mEstimateFactor = defaultMEstimateFactor;
    unkIsVal = null;
    unknownIsValue = defaultUnknownIsValue;
    klThreshold = defaultKLThreshold;
    noMatchesFactor = defaultNoMatchesFactor;
    useEvidenceProjection = defaultUseEvidenceProjection;
    evidenceFactor = defaultEvidenceFactor;
    attrImportance = compute_importance(instList);
    continNorm = compute_contin_norm(instList);
  }

  /** Copy constructor.
   * @param source - the NaiveBayesCat to copy.
   */
  public NaiveBayesCat(NaiveBayesCat source) {
    super(source.num_categories(), source.description(), source.get_schema());
    nominCounts = new BagCounters(source.nominCounts);
    continNorm = source.copyContinNorm();
    attrImportance = source.copyAttrImportance();
    trainWeight = source.trainWeight;
    numAttributes = source.numAttributes;
    useLaplace = source.useLaplace;
    mEstimateFactor = source.mEstimateFactor;
    unkIsVal = null;
    unknownIsValue = source.unknownIsValue;
    klThreshold = source.klThreshold;
    noMatchesFactor = source.noMatchesFactor;
    useEvidenceProjection = source.useEvidenceProjection;
    evidenceFactor = source.evidenceFactor;
  }

  /** Categorizes a single instance based upon the training data.
   * @param instance - the instance to categorize.
   * @return the predicted category.
   */
  public AugCategory categorize(Instance instance) {
    CatDist cDist = score(instance);
    AugCategory cat = cDist.best_category();
    return cat;
  }

  /** Simple method to return an ID.
   * @return an int representing this Categorizer.
   * @deprecated CLASS_NB_CATEGORIZER has been deprecated
   */
  public int class_id() { return CLASS_NB_CATEGORIZER; }
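  // Usage sketch (hypothetical caller; "train" and "test" are assumed to be
  // an InstanceList and an Instance built elsewhere):
  //   NaiveBayesCat nb = new NaiveBayesCat("naive-bayes", train);
  //   AugCategory predicted = nb.categorize(test);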
  /** Returns a deep copy of this NaiveBayesCat.
   * @return the copy of this Categorizer.
   */
  public Object clone() {
    if (!(this instanceof NaiveBayesCat)) {
      Error.fatalErr("NaiveBayesCat.clone: invoked for improper class");
    }
    return new NaiveBayesCat(this);
  }

  /** Compute the norms of the continuous attributes.
   * @param instList - the instances to calculate.
   * @return the array[][] of NBNorms.
   */
  public static NBNorm[][] compute_contin_norm(InstanceList instList) {
    int contAttrCount = 0;
    int numCategories = instList.num_categories();
    Schema schema = instList.get_schema();
    int numAttributes = schema.num_attr();
   
    // Start labels at -1 for unknown, hence the extra label slot.
    NBNorm[][] normDens = new NBNorm[numAttributes][numCategories + 1]; // no initial value
    for (int m = 0; m < normDens.length; m++) {
      for (int n = 0; n < normDens[m].length; n++) {
        normDens[m][n] = new NBNorm();
        normDens[m][n].set_mean_and_var(0, 0);
      }
    }
      
    // Loop through each attribute, and process all instances for each
    // continuous one.
    for (int attrNum = 0; attrNum < numAttributes; attrNum++) {
      AttrInfo attrinfo = schema.attr_info(attrNum);
      if (attrinfo.can_cast_to_real()) {
        // This is a continuous attribute.
        contAttrCount++;

        // Read each occurrence in the list and feed the stats for this attribute.
        StatData[] continStats = new StatData[numCategories + 1];
        for (int j = 0; j < continStats.length; j++) {
          continStats[j] = new StatData();
        }
        // The original C++ iterated with an ILPix list pixel:
        //   for (ILPix pix(instList); pix; ++pix)
        // replaced here by an index loop.
        for (int i = 0; i < instList.num_instances(); i++) {
          Instance inst = new Instance((Instance)instList.instance_list().get(i));
          // Porting note: for some reason the label values for the instances
          // are one number higher than the actual value.
          int labelVal = schema.label_info().cast_to_nominal().get_nominal_val(inst.get_label());
          MLJ.ASSERT(labelVal < numCategories, " NaiveBayesCat.compute_contin_norm()");

          // Ignore unknowns.
          if (!attrinfo.is_unknown(inst.get_value(attrNum))) {
            double value = attrinfo.get_real_val(inst.get_value(attrNum));
            continStats[labelVal].insert(value);
          }
        }

        double mean;
        double var;
        // Extract Normal density parameters into the normDens table.
        for (int label = 0; label < numCategories; label++) {
          if (continStats[label].size() == 0) {
            mean = 0;
            var = defaultVariance;
          }
          else {
            mean = continStats[label].mean();
            if (continStats[label].size() == 1)
              var = defaultVariance;
            else if ((var = continStats[label].variance(0)) <= 0)  // var == 0
              var = epsilon;
          }
          normDens[attrNum][label].set_mean_and_var(mean, var);

          //@@ pass in a log option?
          //LOG(3, " Continuous Attribute # " << attrNum <<
          //", Label " << label << ": Mean = " << mean <<
          //", Variation = " << var << endl );
        }
      } // end of handling this continuous attribute
    }   // end of loop through all attributes

    if (contAttrCount == 0) {  // no continuous attributes found
      normDens = null;
    }
    return normDens;
  }

  /** Computes importance values for each nominal attribute using
   * the mutual_info (entropy).
   * Static function; used as a helper by train() below.
   * @param instList - the instances to use.
   * @return the array[] of importance values.
   */
  public static double[] compute_importance(InstanceList instList) {
    double[] attrImp = new double[instList.num_attr()];
    for (int i = 0; i < attrImp.length; i++) {
      attrImp[i] = 0;
    }
   
    double ent = Entropy.entropy(instList);
    if (ent == Globals.UNDEFINED_REAL) {
      Error.fatalErr("compute_importance: undefined entropy");
    }
    if(ent < 0 && -ent < MLJ.realEpsilon) {
      ent = 0;
    }
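    // For a nominal attribute A and label L, the importance computed below is
    // based on mutual information, I(A;L) = H(L) - H(L|A): how much knowing A
    // reduces uncertainty about the label.  Real-valued attributes are simply
    // assigned importance 0.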
    for (int i = 0; i < instList.num_attr(); i++) {
      if (instList.get_schema().attr_info(i).can_cast_to_real()) {
        attrImp[i] = 0;
      }
      else if (instList.get_schema().attr_info(i).can_cast_to_nominal()) {
        if (ent <= 0) {
