Libsvm is a simple, easy-to-use, and efficient software for SVM
classification and regression. It can solve C-SVM classification,
nu-SVM classification, one-class-SVM, epsilon-SVM regression,
and nu-SVM regression. This document explains the use of libsvm.

Libsvm is available at 
http://www.csie.ntu.edu.tw/~cjlin/libsvm
Please read the COPYRIGHT file before using libsvm.

Installation
============

On Unix systems, type `make' to build the `svm-train' and `svm-predict'
programs. Run them without arguments to show their usage.

On other systems, consult `Makefile' to build them or use the pre-built
binaries (Windows binaries are in the subdirectory `windows').

The format of training and testing data files is:

<label> <index1>:<value1> <index2>:<value2> ...
.
.
.

<label> is the target value of the training data. For classification,
it should be an integer which identifies a class (multi-class
classification is supported). For regression, it can be any real number.
For one-class SVM, it is not used, so it can be any number. <index> is
an integer starting from 1, and <value> is a real number. The labels in
the testing data file are only used to calculate accuracy or error. If
they are unknown, just fill this column with any number.
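
For example, a training file for a two-class problem with three attributes
could contain lines like the following (an illustration only, not data
shipped with the package); attributes whose value is 0 may simply be
omitted:

1 1:0.5 3:-1.2
2 1:0.1 2:0.3 3:0.8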

A sample training data file for classification, heart_scale, is included
in this package.

Type `svm-train heart_scale', and the program will read the training
data and output the model file `heart_scale.model'. Then you can type
`svm-predict heart_scale heart_scale.model output' to see the prediction
accuracy on the training data. The `output' file contains the predicted
values of the model.

There are some other useful programs in this package.

svm-scale:

	This is a tool for scaling input data files.

svm-toy:

	This is a simple graphical interface which shows how an SVM
	separates data in a plane. You can click in the window to
	draw data points. Use the "change" button to choose class
	1 or 2, the "load" button to load data from a file, the "save"
	button to save data to a file, the "run" button to obtain an SVM
	model, and the "clear" button to clear the window.

	You can enter options at the bottom of the window; the syntax of
	the options is the same as for `svm-train'.

	Note that "load" and "save" consider data in the classification but
	not the regression case. Each data point has one label (the color)
	and two attributes (x-axis and y-axis values).

	Type `make' in the respective directories to build them.

	You need the Qt library to build the Qt version.
	(You can download it from http://www.trolltech.com)

	You need the GTK+ library to build the GTK version.
	(You can download it from http://www.gtk.org)
	
	We use Visual C++ to build the Windows version.
	The pre-built Windows binaries are in the windows subdirectory.

`svm-train' Usage
=================

Usage: svm-train [options] training_set_file [model_file]
options:
-s svm_type : set type of SVM (default 0)
	0 -- C-SVC
	1 -- nu-SVC
	2 -- one-class SVM
	3 -- epsilon-SVR
	4 -- nu-SVR
-t kernel_type : set type of kernel function (default 2)
	0 -- linear: u'*v
	1 -- polynomial: (gamma*u'*v + coef0)^degree
	2 -- radial basis function: exp(-gamma*|u-v|^2)
	3 -- sigmoid: tanh(gamma*u'*v + coef0)
-d degree : set degree in kernel function (default 3)
-g gamma : set gamma in kernel function (default 1/k)
-r coef0 : set coef0 in kernel function (default 0)
-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon : set the epsilon in loss function of epsilon-SVR (default 0.1)
-m cachesize : set cache memory size in MB (default 40)
-e epsilon : set tolerance of termination criterion (default 0.001)
-h shrinking: whether to use the shrinking heuristics, 0 or 1 (default 1)
-wi weight: set the parameter C of class i to weight*C in C-SVC (default 1)
-v n: n-fold cross validation mode

The k in the -g option means the number of attributes in the input data.

The -v option randomly splits the data into n parts and calculates the
cross-validation accuracy/mean squared error on them.

`svm-predict' Usage
===================

Usage: svm-predict test_file model_file output_file

model_file is the model file generated by svm-train.
test_file contains the test data you want to predict.
svm-predict writes the prediction results to output_file.

No options are needed for svm-predict.

Tips on practical use
=====================

* Scale your data. For example, scale each attribute to [0,1] or [-1,+1].
* For C-SVC, try both small and large values of C, e.g. from 1 to 1000,
  and decide which work better for your data by cross validation. For the
  better C values, try several values of gamma.
* nu in nu-SVC/one-class-SVM/nu-SVR approximates the fraction of training
  errors and support vectors.
* If the classification data are unbalanced (e.g. many positive and
  few negative examples), try different penalty parameters C for
  different classes with -wi (see the examples below).

Examples
========

> svm-train -s 0 -c 1000 -t 1 -g 1 -r 1 -d 3 data_file

Train a classifier with polynomial kernel (u'v+1)^3 and C = 1000

> svm-train -s 1 -n 0.1 -t 2 -g 0.5 -e 0.00001 data_file

Train a classifier by nu-SVM (nu = 0.1) with RBF kernel
exp(-0.5|u-v|^2) and stopping tolerance 0.00001

> svm-train -s 3 -p 0.1 -t 0 -c 10 data_file

Solve SVM regression with linear kernel u'v and C=10, and epsilon = 0.1
in the loss function.

> svm-train -s 0 -c 10 -w1 1 -w-1 5 data_file

Train a classifier with penalty 10 for class 1 and penalty 50
for class -1.

> svm-train -s 0 -c 500 -g 0.1 -v 5 data_file

Do five-fold cross validation for the classifier using
the parameters C = 500 and gamma = 0.1

Library Usage
=============

These functions and structures are declared in the header file `svm.h'.
You need to #include "svm.h" in your C/C++ source files and link your
program with `svm.cpp'. You can see `svm-train.c' and `svm-predict.c'
for examples showing how to use them.

Before you classify test data, you need to construct an SVM model
(`svm_model') using training data. A model can also be saved in
a file for later use. Once an SVM model is available, you can use it
to classify new data.

- Function: struct svm_model *svm_train(const struct svm_problem *prob,
					const struct svm_parameter *param);

    This function constructs and returns an SVM model according to
    the given training data and parameters.

    struct svm_problem describes the problem:
	
	struct svm_problem
	{
		int l;
		double *y;
		struct svm_node **x;
	};
 
    where `l' is the number of training data, and `y' is an array containing
    their target values (integers in classification, real numbers in
    regression). `x' is an array of pointers, each of which points to a sparse
    representation (an array of svm_node) of one training vector.

    For example, if we have the following training data:

    LABEL	ATTR1	ATTR2	ATTR3	ATTR4	ATTR5
    -----	-----	-----	-----	-----	-----
      1		  0	  0.1	  0.2	  0	  0
      2		  0	  0.1	  0.3	 -1.2	  0
      1		  0.4	  0	  0	  0	  0
      2		  0	  0.1	  0	  1.4	  0.5
      3		 -0.1	 -0.2	  0.1	  1.1	  0.1

    then the components of svm_problem are:

    l = 5

    y -> 1 2 1 2 3

    x -> [ ] -> (2,0.1) (3,0.2) (-1,?)
	 [ ] -> (2,0.1) (3,0.3) (4,-1.2) (-1,?)
	 [ ] -> (1,0.4) (-1,?)
	 [ ] -> (2,0.1) (4,1.4) (5,0.5) (-1,?)
	 [ ] -> (1,-0.1) (2,-0.2) (3,0.1) (4,1.1) (5,0.1) (-1,?)

    where (index,value) is stored in the structure `svm_node':

	struct svm_node
	{
		int index;
		double value;
	};

    index = -1 indicates the end of one vector.
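
    As a concrete illustration (a minimal sketch, not code shipped with
    libsvm; the function name is ours), the first training vector of the
    example above can be stored like this, using only the structures
    declared in svm.h:

	#include "svm.h"

	struct svm_problem make_problem_example(void)
	{
		/* sparse representation of the first row above:
		   (2,0.1) (3,0.2), terminated by a node with index = -1 */
		static struct svm_node row1[] = { {2, 0.1}, {3, 0.2}, {-1, 0.0} };

		static double y[] = { 1.0 };             /* target value of row1 */
		static struct svm_node *x[] = { row1 };  /* pointers to vectors  */

		struct svm_problem prob;
		prob.l = 1;	/* number of training vectors */
		prob.y = y;
		prob.x = x;
		return prob;	/* pass its address to svm_train() together
				   with a filled-in svm_parameter */
	}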
 
    struct svm_parameter describes the parameters of an SVM model:

	struct svm_parameter
	{
		int svm_type;
		int kernel_type;
		double degree;	// for poly
		double gamma;	// for poly/rbf/sigmoid
		double coef0;	// for poly/sigmoid

		// these are for training only
		double cache_size; // in MB
		double eps;	// stopping criteria
		double C;	// for C_SVC, EPSILON_SVR, and NU_SVR
		int nr_weight;		// for C_SVC
		int *weight_label;	// for C_SVC
		double* weight;		// for C_SVC
		double nu;	// for NU_SVC, ONE_CLASS, and NU_SVR
		double p;	// for EPSILON_SVR
		int shrinking;	// use the shrinking heuristics
	};

    svm_type can be one of C_SVC, NU_SVC, ONE_CLASS, EPSILON_SVR, NU_SVR.

    C_SVC:		C-SVM classification
    NU_SVC:		nu-SVM classification
    ONE_CLASS:		one-class-SVM
    EPSILON_SVR:	epsilon-SVM regression
    NU_SVR:		nu-SVM regression

    kernel_type can be one of LINEAR, POLY, RBF, SIGMOID.

    LINEAR:	u'*v
    POLY:	(gamma*u'*v + coef0)^degree
    RBF:	exp(-gamma*|u-v|^2)
    SIGMOID:	tanh(gamma*u'*v + coef0)

    cache_size is the size of the kernel cache, specified in megabytes.
    C is the cost of constraint violation (we usually use 1 to 1000).
    eps is the stopping criterion (we usually use 0.00001 in nu-SVC and
    0.001 in the others). nu is the parameter of nu-SVC, nu-SVR, and
    one-class SVM. p is the epsilon in the epsilon-insensitive loss function
    of epsilon-SVM regression. shrinking = 1 means the shrinking heuristics
    are used; = 0 otherwise.
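
    For example, a hedged sketch (function name ours) that fills in an
    svm_parameter for C-SVC with the RBF kernel, mirroring the command-line
    defaults listed earlier, might look like:

	#include <stddef.h>	/* for NULL */
	#include "svm.h"

	void default_c_svc_params(struct svm_parameter *param)
	{
		param->svm_type     = C_SVC;
		param->kernel_type  = RBF;
		param->degree       = 3;
		param->gamma        = 0.2;	/* 1/k for k = 5 attributes */
		param->coef0        = 0;
		param->cache_size   = 40;	/* MB */
		param->eps          = 0.001;
		param->C            = 1;
		param->nu           = 0.5;
		param->p            = 0.1;
		param->shrinking    = 1;
		param->nr_weight    = 0;	/* do not change any class penalty */
		param->weight_label = NULL;
		param->weight       = NULL;
	}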

    nr_weight, weight_label, and weight are used to change the penalty
    for some classes (if the weight for a class is not changed, it is
    set to 1). This is useful for training a classifier on unbalanced
    input data or with asymmetric misclassification costs.

    nr_weight is the number of elements in the arrays weight_label and
    weight. Each weight[i] corresponds to weight_label[i], meaning that
    the penalty of class weight_label[i] is scaled by a factor of weight[i].
    
    If you do not want to change the penalty for any class,
    just set nr_weight to 0.
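
    For example (a sketch only, mirroring the `-w1 1 -w-1 5' command-line
    example above; the function name is ours), penalizing errors on class -1
    five times more heavily than on class 1 could be set up as follows:

	#include "svm.h"

	static int    unbal_labels[]  = { 1, -1 };
	static double unbal_weights[] = { 1.0, 5.0 };

	void set_unbalanced_penalty(struct svm_parameter *param)
	{
		param->nr_weight    = 2;		/* two (label, weight) pairs   */
		param->weight_label = unbal_labels;	/* C of class  1 stays  C*1.0  */
		param->weight       = unbal_weights;	/* C of class -1 becomes C*5.0 */
	}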

    *NOTE* Because svm_model contains pointers to svm_problem, you cannot
    free the memory used by svm_problem if you are still using the
    svm_model produced by svm_train().
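
    A short sketch of the resulting call sequence (names illustrative;
    `prob' and `param' are assumed to be filled in as described above):

	#include "svm.h"

	void train_use_release(struct svm_problem *prob, struct svm_parameter *param)
	{
		struct svm_model *model = svm_train(prob, param);
		/* ... call svm_predict(model, x) as needed ... */
		svm_destroy_model(model);
		/* only now is it safe to free the memory behind prob->x and prob->y */
	}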

- Function: double svm_predict(const struct svm_model *model,
                             const struct svm_node *x);

    This function does classification or regression on a test vector x
    given a model.

    For a classification model, the predicted class for x is returned.
    For a regression model, the function value of x calculated using
    the model is returned. For a one-class model, +1 or -1 is returned.
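
    For instance (a minimal sketch; `model' is assumed to come from
    svm_train() or svm_load_model(), and the function name is ours), a test
    vector with x2 = 0.1 and x3 = 0.2 can be evaluated as follows:

	#include "svm.h"

	double predict_example(const struct svm_model *model)
	{
		/* sparse test vector, terminated by index = -1 */
		struct svm_node x[] = { {2, 0.1}, {3, 0.2}, {-1, 0.0} };
		return svm_predict(model, x);	/* class label or regression value */
	}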

- Function: int svm_save_model(const char *model_file_name,
			       const struct svm_model *model);

    This function saves a model to a file; returns 0 on success, or -1
    if an error occurs.

- Function: struct svm_model *svm_load_model(const char *model_file_name);

    This function returns a pointer to the model read from the file,
    or a null pointer if the model could not be loaded.


- Function: void svm_destroy_model(struct svm_model *model);

    This function frees the memory used by a model.
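
    A hedged sketch of the save/load/destroy cycle using the three functions
    above (the file name and function name are illustrative):

	#include "svm.h"

	void save_and_reload(const struct svm_model *trained)
	{
		struct svm_model *loaded;

		if (svm_save_model("example.model", trained) != 0)
			return;			/* could not write the file */

		loaded = svm_load_model("example.model");
		if (loaded != NULL)
		{
			/* ... call svm_predict(loaded, x) as needed ... */
			svm_destroy_model(loaded);	/* free the loaded model */
		}
	}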

Java version
============

The precompiled Java class archive `libsvm.jar' and its source files are
in the java subdirectory. To run the programs, use

java -classpath libsvm.jar svm_train <arguments>
java -classpath libsvm.jar svm_predict <arguments>
java -classpath libsvm.jar svm_toy

We have tried IBM's and Sun's JDK.
You may need to add the Java runtime library (such as classes.zip) to the
classpath. You may need to increase the maximum Java heap size.

Library usage is similar to the C version. These functions are available:

public class svm {
	public static svm_model svm_train(svm_problem prob, svm_parameter param);
	public static double svm_predict(svm_model model, svm_node[] x);
	public static void svm_save_model(String model_file_name, svm_model model) throws IOException;
	public static svm_model svm_load_model(String model_file_name) throws IOException;
}

Note that in the Java version, svm_node[] does not end with a node whose index = -1.

Additional Information
======================

Chih-Chung Chang and Chih-Jen Lin
LIBSVM : a library for support vector machines.
http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.ps.gz

Acknowledgments:
This work was supported in part by the National Science 
Council of Taiwan via the grant NSC 89-2213-E-002-013.
The authors thank Chih-Wei Hsu and Jen-Hao Lee
for many helpful discussions and comments.
