Libsvm is a simple, easy-to-use, and efficient software for SVM
classification and regression. It can solve C-SVM classification,
nu-SVM classification, one-class-SVM, epsilon-SVM regression,
and nu-SVM regression. This document explains the use of libsvm.

Libsvm is available at 
http://www.csie.ntu.edu.tw/~cjlin/libsvm
Please read the COPYRIGHT file before using libsvm.

Installation
============

On Unix systems, type `make' to build the `svm-train' and `svm-predict'
programs. Run them without arguments to show their usage.

On other systems, consult `Makefile' to build them or use the pre-built
binaries (Windows binaries are in the subdirectory `windows').

The format of training and testing data files is:

<label> <index1>:<value1> <index2>:<value2> ...
.
.
.

<label> is the target value of the training data. For classification,
it should be an integer which identifies a class (multi-class classification
is supported). For regression, it can be any real number. For one-class SVM,
it is not used, so it can be any number. <index> is an integer starting from 1,
and <value> is a real number. The labels in the testing data file are only used
to calculate accuracy or error. If they are unknown, just fill this column with
any number.
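For example, a made-up file with two training instances and five
attributes might look like this (zero-valued attributes are omitted):

+1 1:0.708 2:1 5:-0.320
-1 1:0.583 3:0.25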

There is a sample training data file for classification in this package:
heart_scale.

Type `svm-train heart_scale', and the program will read the training
data and output the model file `heart_scale.model'. Then you can
type `svm-predict heart_scale heart_scale.model output' to see the
classification accuracy on the training data. The `output' file
contains the predicted values from the model.

There are some other useful programs in this package.

svm-scale:

	This is a tool for scaling input data files.
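	For example (the exact options may differ by version; run
	`svm-scale' without arguments to see its usage), scaling every
	attribute to [-1,+1] could look like:

	> svm-scale -l -1 -u 1 train_file > train_file.scale

	Here -l and -u set the lower and upper bounds of the scaled range,
	and the scaled data is written to standard output.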

svm-toy:

	This is a simple graphical interface which shows how SVM
	separates data in a plane. You can click in the window to
	draw data points. Use the "change" button to choose class
	1 or 2, the "load" button to load data from a file, the "save"
	button to save data to a file, the "run" button to obtain an
	SVM model, and the "clear" button to clear the window.

	You can enter options at the bottom of the window; the syntax
	of the options is the same as for `svm-train'.

	Note that "load" and "save" consider data in the classification but
	not the regression case. Each data point has one label (the color)
	and two attributes (x-axis and y-axis values).

	Type `make' in the respective directories to build them.

	You need Qt library to build the Qt version.
	(You can download it from http://www.trolltech.com)

	You need GTK+ library to build the GTK version.
	(You can download it from http://www.gtk.org)
	
	We use Visual C++ to build the Windows version.
	The pre-built Windows binaries are in the windows subdirectory.

`svm-train' Usage
=================

Usage: svm-train [options] training_set_file [model_file]
options:
-s svm_type : set type of SVM (default 0)
	0 -- C-SVC
	1 -- nu-SVC
	2 -- one-class SVM
	3 -- epsilon-SVR
	4 -- nu-SVR
-t kernel_type : set type of kernel function (default 2)
	0 -- linear: u'*v
	1 -- polynomial: (gamma*u'*v + coef0)^degree
	2 -- radial basis function: exp(-gamma*|u-v|^2)
	3 -- sigmoid: tanh(gamma*u'*v + coef0)
-d degree : set degree in kernel function (default 3)
-g gamma : set gamma in kernel function (default 1/k)
-r coef0 : set coef0 in kernel function (default 0)
-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)
-m cachesize : set cache memory size in MB (default 40)
-e epsilon : set tolerance of termination criterion (default 0.001)
-h shrinking: whether to use the shrinking heuristics, 0 or 1 (default 1)
-wi weight: set the parameter C of class i to weight*C in C-SVC (default 1)
-v n: n-fold cross validation mode

The k in the -g option means the number of attributes in the input data.

The option -v randomly splits the data into n parts and calculates the
cross validation accuracy/mean squared error on them.

`svm-predict' Usage
===================

Usage: svm-predict test_file model_file output_file

model_file is the model file generated by svm-train.
test_file is the test data you want to predict.
svm-predict will produce output in the output_file.

No options are needed for svm-predict.

Tips on practical use
=====================

* Scale your data. For example, scale each attribute to [0,1] or [-1,+1].
* For C-SVC, try small and large C, e.g. from 1 to 1000, and decide which is
  better for your data by cross validation. For the better C's, try
  several gamma's (see the sketch after this list).
* nu in nu-SVC/one-class-SVM/nu-SVR approximates the fraction of training
  errors and support vectors.
* If the data for classification are unbalanced (e.g. many positive and
  few negative examples), try different penalty parameters C via -wi (see
  the examples below).
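
For example, one simple way to compare several values of C with 5-fold
cross validation (the file name `data_file' is just a placeholder):

> svm-train -c 1 -v 5 data_file
> svm-train -c 10 -v 5 data_file
> svm-train -c 100 -v 5 data_file
> svm-train -c 1000 -v 5 data_file

Compare the reported cross validation accuracies; for the better C's,
repeat the runs with several -g values.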

Examples
========

> svm-train -s 0 -c 1000 -t 1 -g 1 -r 1 -d 3 data_file

Train a classifier with polynomial kernel (u'v+1)^3 and C = 1000

> svm-train -s 1 -n 0.1 -t 2 -g 0.5 -e 0.00001 data_file

Train a classifier by nu-SVM (nu = 0.1) with RBF kernel
exp(-0.5|u-v|^2) and stopping tolerance 0.00001

> svm-train -s 3 -p 0.1 -t 0 -c 10 data_file

Solve SVM regression with linear kernel u'v and C=10, and epsilon = 0.1
in the loss function.

> svm-train -s 0 -c 10 -w1 1 -w-1 5 data_file

Train a classifier with penalty 10 for class 1 and penalty 50
for class -1.

> svm-train -s 0 -c 500 -g 0.1 -v 5 data_file

Do five-fold cross validation for the classifier using
the parameters C = 500 and gamma = 0.1

Library Usage
=============

These functions and structures are declared in the header file `svm.h'.
You need to #include "svm.h" in your C/C++ source files and link your
program with `svm.cpp'. See `svm-train.c' and `svm-predict.c' for
examples showing how to use them.

Before you classify test data, you need to construct an SVM model
(`svm_model') using training data. A model can also be saved in
a file for later use. Once an SVM model is available, you can use it
to classify new data.

- Function: struct svm_model *svm_train(const struct svm_problem *prob,
					const struct svm_parameter *param);

    This function constructs and returns an SVM model according to
    the given training data and parameters.

    struct svm_problem describes the problem:
	
	struct svm_problem
	{
		int l;
		double *y;
		struct svm_node **x;
	};
 
    where `l' is the number of training data, and `y' is an array containing
    their target values (integers in classification, real numbers in
    regression). `x' is an array of pointers, each of which points to a
    sparse representation (an array of svm_node) of one training vector.

    For example, if we have the following training data:

    LABEL	ATTR1	ATTR2	ATTR3	ATTR4	ATTR5
    -----	-----	-----	-----	-----	-----
      1		  0	  0.1	  0.2	  0	  0
      2		  0	  0.1	  0.3	 -1.2	  0
      1		  0.4	  0	  0	  0	  0
      2		  0	  0.1	  0	  1.4	  0.5
      3		 -0.1	 -0.2	  0.1	  1.1	  0.1

    then the components of svm_problem are:

    l = 5

    y -> 1 2 1 2 3

    x -> [ ] -> (2,0.1) (3,0.2) (-1,?)
	 [ ] -> (2,0.1) (3,0.3) (4,-1.2) (-1,?)
	 [ ] -> (1,0.4) (-1,?)
	 [ ] -> (2,0.1) (4,1.4) (5,0.5) (-1,?)
	 [ ] -> (1,-0.1) (2,-0.2) (3,0.1) (4,1.1) (5,0.1) (-1,?)

    where (index,value) is stored in the structure `svm_node':

	struct svm_node
	{
		int index;
		double value;
	};

    index = -1 indicates the end of one vector.
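    As a minimal sketch (not part of the package; the values below are just
    the first two rows of the example above), the structures could be
    filled like this:

	#include "svm.h"

	/* Sparse representation of the first two training vectors above;
	   each vector ends with a node whose index is -1. */
	struct svm_node row1[] = { {2,0.1}, {3,0.2}, {-1,0} };
	struct svm_node row2[] = { {2,0.1}, {3,0.3}, {4,-1.2}, {-1,0} };

	double labels[] = { 1, 2 };			/* target values */
	struct svm_node *rows[] = { row1, row2 };	/* pointers to vectors */

	/* l = 2 training vectors, y = labels, x = sparse vectors */
	struct svm_problem prob = { 2, labels, rows };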
 
    struct svm_parameter describes the parameters of an SVM model:

	struct svm_parameter
	{
		int svm_type;
		int kernel_type;
		double degree;	// for poly
		double gamma;	// for poly/rbf/sigmoid
		double coef0;	// for poly/sigmoid

		// these are for training only
		double cache_size; // in MB
		double eps;	// stopping criteria
		double C;	// for C_SVC, EPSILON_SVR, and NU_SVR
		int nr_weight;		// for C_SVC
		int *weight_label;	// for C_SVC
		double* weight;		// for C_SVC
		double nu;	// for NU_SVC, ONE_CLASS, and NU_SVR
		double p;	// for EPSILON_SVR
		int shrinking;	// use the shrinking heuristics
	};

    svm_type can be one of C_SVC, NU_SVC, ONE_CLASS, EPSILON_SVR, NU_SVR.

    C_SVC:		C-SVM classification
    NU_SVC:		nu-SVM classification
    ONE_CLASS:		one-class-SVM
    EPSILON_SVR:	epsilon-SVM regression
    NU_SVR:		nu-SVM regression

    kernel_type can be one of LINEAR, POLY, RBF, SIGMOID.

    LINEAR:	u'*v
    POLY:	(gamma*u'*v + coef0)^degree
    RBF:	exp(-gamma*|u-v|^2)
    SIGMOID:	tanh(gamma*u'*v + coef0)

    cache_size is the size of the kernel cache, specified in megabytes.
    C is the cost of constraint violation (we usually use 1 to 1000).
    eps is the stopping criterion (we usually use 0.00001 in nu-SVC and
    0.001 in the others). nu is the parameter in nu-SVM, nu-SVR, and
    one-class-SVM. p is the epsilon in the epsilon-insensitive loss
    function of epsilon-SVM regression. shrinking = 1 means shrinking
    is conducted; = 0 otherwise.

    nr_weight, weight_label, and weight are used to change the penalty
    for some classes (if the weight for a class is not changed, it is
    set to 1). This is useful for training a classifier on unbalanced
    input data or with asymmetric misclassification costs.

    nr_weight is the number of elements in the arrays weight_label and
    weight. Each weight[i] corresponds to weight_label[i], meaning that
    the penalty for class weight_label[i] is scaled by a factor of weight[i].
    
    If you do not want to change the penalty for any of the classes,
    just set nr_weight to 0.

    *NOTE* Because svm_model contains pointers to svm_problem, you cannot
    free the memory used by svm_problem if you are still using the
    svm_model produced by svm_train().
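
    Continuing the sketch above (the parameter values are only
    illustrative, not a recommendation for real data; the code belongs
    inside a function such as main):

	struct svm_parameter param;
	param.svm_type = C_SVC;
	param.kernel_type = RBF;
	param.degree = 3;
	param.gamma = 0.2;		/* e.g. 1/k for k = 5 attributes */
	param.coef0 = 0;
	param.cache_size = 40;		/* in MB */
	param.eps = 0.001;
	param.C = 1;
	param.nr_weight = 0;		/* keep all class penalties at C */
	param.weight_label = 0;		/* no per-class weights (null pointers) */
	param.weight = 0;
	param.nu = 0.5;
	param.p = 0.1;
	param.shrinking = 1;

	struct svm_model *model = svm_train(&prob, &param);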

- Function: double svm_predict(const struct svm_model *model,
                             const struct svm_node *x);

    This function does classification or regression on a test vector x
    given a model.

    For a classification model, the predicted class for x is returned.
    For a regression model, the function value of x calculated using
    the model is returned. For a one-class model, +1 or -1 is returned.
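
    For example, using the model from the sketch above and a hypothetical
    test vector with the same attributes:

	struct svm_node test[] = { {1,0.3}, {3,0.1}, {-1,0} };
	double predicted = svm_predict(model, test);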

- Function: int svm_save_model(const char *model_file_name,
			       const struct svm_model *model);

    This function saves a model to a file; returns 0 on success, or -1
    if an error occurs.

- Function: struct svm_model *svm_load_model(const char *model_file_name);

    This function returns a pointer to the model read from the file,
    or a null pointer if the model could not be loaded.


- Function: void svm_destroy_model(struct svm_model *model);

    This function frees the memory used by a model.
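
    Putting the last three functions together (the file name is only an
    example, and <stdio.h> is assumed for the error messages):

	if (svm_save_model("example.model", model) != 0)
		fprintf(stderr, "cannot save the model\n");

	struct svm_model *loaded = svm_load_model("example.model");
	if (loaded != NULL)
	{
		/* ... use `loaded' with svm_predict() ... */
		svm_destroy_model(loaded);
	}

	svm_destroy_model(model);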

Java version
============

The precompiled java class archive `libsvm.jar' and its source files are
in the java subdirectory. To run the programs, use

java -classpath libsvm.jar svm_train <arguments>
java -classpath libsvm.jar svm_predict <arguments>
java -classpath libsvm.jar svm_toy

We have tried IBM's and Sun's JDK.
You may need to add the Java runtime library (e.g. classes.zip) to the
classpath. You may also need to increase the maximum Java heap size.

The library usage is similar to the C version. These functions are available:

public class svm {
	public static svm_model svm_train(svm_problem prob, svm_parameter param);
	public static double svm_predict(svm_model model, svm_node[] x);
	public static void svm_save_model(String model_file_name, svm_model model) throws IOException
	public static svm_model svm_load_model(String model_file_name) throws IOException
}

Note that in the Java version, svm_node[] is not terminated with a node whose index = -1.

ADDITIONAL INFORMATION
======================

Chih-Chung Chang and Chih-Jen Lin
LIBSVM : a library for support vector machines.
http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.ps.gz

Acknowledgments:
This work was supported in part by the National Science 
Council of Taiwan via the grant NSC 89-2213-E-002-013.
The authors thank Chih-Wei Hsu and Jen-Hao Lee
for many helpful discussions and comments.
