

readme

SVM is a commonly used machine learning algorithm for pattern classification.
Page 1 of 2
Libsvm is a simple, easy-to-use, and efficient software for SVM
classification and regression. It solves C-SVM classification, nu-SVM
classification, one-class SVM, epsilon-SVM regression, and nu-SVM
regression. It also provides an automatic model selection tool for
C-SVM classification. This document explains the use of libsvm.

Libsvm is available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
Please read the COPYRIGHT file before using libsvm.

Table of Contents
=================

- Quick Start
- Installation
- `svm-train' Usage
- `svm-predict' Usage
- Tips on practical use
- Examples
- Precomputed Kernels
- Library Usage
- Java Version
- Building Windows Binaries
- Additional Tools: Model Selection, Sub-sampling, etc.
- Python Interface
- Additional Information

Quick Start
===========

If you are new to SVM and the data is not large, please go to the
`tools' directory and use easy.py after installation. It does
everything automatically -- from data scaling to parameter selection.

Usage: easy.py training_file [testing_file]

More information about parameter selection can be found in
tools/README.

Installation
============

On Unix systems, type `make' to build the `svm-train' and `svm-predict'
programs. Run them without arguments to show their usage.

On other systems, consult `Makefile' to build them (e.g., see
'Building Windows binaries' in this file) or use the pre-built
binaries (Windows binaries are in the directory `windows').

The format of training and testing data files is:

<label> <index1>:<value1> <index2>:<value2> ...

For classification, <label> is an integer indicating the class label
(multi-class is supported). For regression, <label> is the target
value, which can be any real number. For one-class SVM, it is not
used, so it can be any number. Except when using precomputed kernels
(explained in another section), <index>:<value> gives a feature
(attribute) value: <index> is an integer starting from 1 and <value>
is a real number. Indices must be in ASCENDING order. Labels in the
testing file are only used to calculate accuracy or errors. If they
are unknown, just fill the first column with any numbers.
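For illustration only, a small two-line classification file in this
format might look as follows (the numbers are made up for this
example; attributes whose indices are omitted are treated as zero):

    +1 1:0.708 3:1 5:-0.25
    -1 1:0.583 2:-1 4:0.1 5:1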
A sample classification data set included in this package is
`heart_scale'. Type `svm-train heart_scale', and the program will read
the training data and output the model file `heart_scale.model'. If
you have a test set called heart_scale.t, then type `svm-predict
heart_scale.t heart_scale.model output' to see the prediction
accuracy. The `output' file contains the predicted class labels.

There are some other useful programs in this package.

svm-scale:
	This is a tool for scaling input data files.

svm-toy:
	This is a simple graphical interface which shows how SVM
	separates data in a plane. You can click in the window to
	draw data points. Use the "change" button to choose class
	1, 2 or 3 (i.e., up to three classes are supported), the
	"load" button to load data from a file, the "save" button to
	save data to a file, the "run" button to obtain an SVM model,
	and the "clear" button to clear the window.

	You can enter options at the bottom of the window; the syntax
	of options is the same as for `svm-train'.

	Note that "load" and "save" consider data in the
	classification but not the regression case. Each data point
	has one label (the color), which must be 1, 2, or 3, and two
	attributes (x-axis and y-axis values) in [0,1].

	Type `make' in the respective directories to build them.

	You need the Qt library to build the Qt version.
	(available from http://www.trolltech.com)

	You need the GTK+ library to build the GTK version.
	(available from http://www.gtk.org)

	We use Visual C++ to build the Windows version. The pre-built
	Windows binaries are in the `windows' directory.

`svm-train' Usage
=================

Usage: svm-train [options] training_set_file [model_file]
options:
-s svm_type : set type of SVM (default 0)
	0 -- C-SVC
	1 -- nu-SVC
	2 -- one-class SVM
	3 -- epsilon-SVR
	4 -- nu-SVR
-t kernel_type : set type of kernel function (default 2)
	0 -- linear: u'*v
	1 -- polynomial: (gamma*u'*v + coef0)^degree
	2 -- radial basis function: exp(-gamma*|u-v|^2)
	3 -- sigmoid: tanh(gamma*u'*v + coef0)
	4 -- precomputed kernel (kernel values in training_set_file)
-d degree : set degree in kernel function (default 3)
-g gamma : set gamma in kernel function (default 1/k)
-r coef0 : set coef0 in kernel function (default 0)
-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon : set the epsilon in the loss function of epsilon-SVR (default 0.1)
-m cachesize : set cache memory size in MB (default 100)
-e epsilon : set tolerance of termination criterion (default 0.001)
-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)
-b probability_estimates : whether to train an SVC or SVR model for probability estimates, 0 or 1 (default 0)
-wi weight : set the parameter C of class i to weight*C in C-SVC (default 1)
-v n : n-fold cross validation mode

The k in the -g option means the number of attributes in the input
data. The option -v randomly splits the data into n parts and
calculates cross-validation accuracy/mean squared error on them.

`svm-predict' Usage
===================

Usage: svm-predict [options] test_file model_file output_file
options:
-b probability_estimates : whether to predict probability estimates, 0 or 1 (default 0); for one-class SVM only 0 is supported

model_file is the model file generated by svm-train.
test_file is the test data you want to predict.
svm-predict will produce output in the output_file.
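As a concrete end-to-end example, the heart_scale walk-through from
the Installation section corresponds to the following two commands
(the `output' file name is arbitrary):

> svm-train heart_scale
> svm-predict heart_scale.t heart_scale.model output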
Tips on Practical Use
=====================

* Scale your data. For example, scale each attribute to [0,1] or [-1,+1].
* For C-SVC, consider using the model selection tool in the tools directory.
* nu in nu-SVC/one-class-SVM/nu-SVR approximates the fraction of training
  errors and support vectors.
* If data for classification are unbalanced (e.g. many positive and
  few negative), try different penalty parameters C by -wi (see
  examples below).
* Specify a larger cache size (i.e., a larger -m) for huge problems.

Examples
========

> svm-scale -l -1 -u 1 -s range train > train.scale
> svm-scale -r range test > test.scale

Scale each feature of the training data to be in [-1,1]. Scaling
factors are stored in the file range and then used for scaling the
test data.

> svm-train -s 0 -c 1000 -t 2 -g 0.5 -e 0.00001 data_file

Train a classifier with RBF kernel exp(-0.5|u-v|^2) and stopping
tolerance 0.00001.

> svm-train -s 3 -p 0.1 -t 0 -c 10 data_file

Solve SVM regression with linear kernel u'v and C=10, and epsilon = 0.1
in the loss function.

> svm-train -s 0 -c 10 -w1 1 -w-1 5 data_file

Train a classifier with penalty 10 for class 1 and penalty 50
for class -1.

> svm-train -s 0 -c 500 -g 0.1 -v 5 data_file

Do five-fold cross validation for the classifier using
the parameters C = 500 and gamma = 0.1.

> svm-train -s 0 -b 1 data_file
> svm-predict -b 1 test_file data_file.model output_file

Obtain a model with probability information and predict test data with
probability estimates.

Precomputed Kernels
===================

Users may precompute kernel values and input them as training and
testing files. Then libsvm does not need the original
training/testing sets.

Assume there are L training instances x1, ..., xL. Let K(x, y) be the
kernel value of two instances x and y. The input formats are:

New training instance for xi:

	<label> 0:i 1:K(xi,x1) ... L:K(xi,xL)

New testing instance for any x:

	<label> 0:? 1:K(x,x1) ... L:K(x,xL)

That is, in the training file the first column must be the "ID" of
xi. In testing, ? can be any value.

All kernel values including ZEROs must be explicitly provided. Any
permutation or random subsets of the training/testing files are also
valid (see examples below).

Note: the format is slightly different from the precomputed kernel
package released in libsvmtools earlier.

Examples:

	Assume the original training data has three four-feature
	instances and the testing data has one instance:

	1  1:1 2:1 3:1 4:1
	3      2:3     4:3
	2          3:1

	1  1:1     3:1

	If the linear kernel is used, we have the following new
	training/testing sets:

	1  0:1 1:4 2:6  3:1
	3  0:2 1:6 2:18 3:0
	2  0:3 1:1 2:0  3:1

	1  0:? 1:2 2:0  3:1

	? can be any value.

	Any subset of the above training file is also valid. For example,

	2  0:3 1:1 2:0  3:1
	3  0:2 1:6 2:18 3:0

	implies that the kernel matrix is

		[K(x3,x3) K(x3,x2)] = [1  0]
		[K(x2,x3) K(x2,x2)]   [0 18]

Library Usage
=============

These functions and structures are declared in the header file
`svm.h'. You need to #include "svm.h" in your C/C++ source files and
link your program with `svm.cpp'. You can see `svm-train.c' and
`svm-predict.c' for examples showing how to use them.

Before you classify test data, you need to construct an SVM model
(`svm_model') using training data. A model can also be saved in a
file for later use. Once an SVM model is available, you can use it to
classify new data.

- Function: struct svm_model *svm_train(const struct svm_problem *prob,
					const struct svm_parameter *param);

    This function constructs and returns an SVM model according to
    the given training data and parameters.

    struct svm_problem describes the problem:

	struct svm_problem
	{
		int l;
		double *y;
		struct svm_node **x;
	};

    where `l' is the number of training data, and `y' is an array
    containing their target values (integers in classification, real
    numbers in regression). `x' is an array of pointers, each of which
    points to a sparse representation (an array of svm_node) of one
    training vector.

    For example, if we have the following training data:

    LABEL	ATTR1	ATTR2	ATTR3	ATTR4	ATTR5
    -----	-----	-----	-----	-----	-----
      1		  0	  0.1	  0.2	  0	  0
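As an independent, minimal sketch of the interface described above,
the following program trains a C-SVC with the RBF kernel on two toy
points and classifies a third. The toy data and parameter values are
placeholders chosen only for this example; the struct fields and
function names follow the declarations in `svm.h'.

    /* minimal libsvm usage sketch: build an svm_problem, train, predict */
    #include <stdio.h>
    #include "svm.h"

    int main(void)
    {
        /* Two training vectors with two attributes each, in sparse
         * (index:value) form. An index of -1 terminates each vector. */
        struct svm_node x1[] = { {1, 0.0}, {2, 0.0}, {-1, 0.0} };
        struct svm_node x2[] = { {1, 1.0}, {2, 1.0}, {-1, 0.0} };
        struct svm_node *x[] = { x1, x2 };
        double y[] = { -1.0, +1.0 };

        struct svm_problem prob;
        prob.l = 2;          /* number of training vectors */
        prob.y = y;          /* their labels */
        prob.x = x;          /* their sparse representations */

        /* Parameter values mirror the command-line defaults above. */
        struct svm_parameter param;
        param.svm_type = C_SVC;
        param.kernel_type = RBF;
        param.degree = 3;
        param.gamma = 0.5;   /* 1/k with k = 2 attributes */
        param.coef0 = 0;
        param.nu = 0.5;
        param.cache_size = 100;
        param.C = 1;
        param.eps = 1e-3;
        param.p = 0.1;
        param.shrinking = 1;
        param.probability = 0;
        param.nr_weight = 0;
        param.weight_label = NULL;
        param.weight = NULL;

        const char *err = svm_check_parameter(&prob, &param);
        if (err) { fprintf(stderr, "parameter error: %s\n", err); return 1; }

        struct svm_model *model = svm_train(&prob, &param);

        /* Classify a new point. */
        struct svm_node test[] = { {1, 0.9}, {2, 0.8}, {-1, 0.0} };
        printf("predicted label: %g\n", svm_predict(model, test));

        /* Release the model; newer releases use
         * svm_free_and_destroy_model(&model) instead. */
        svm_destroy_model(model);
        return 0;
    }

Compile and link it together with the library source, for example with
something like `g++ -o example example.c svm.cpp'.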

亚洲男同性视频| 在线亚洲一区观看| 日韩欧美国产电影| 亚洲天堂成人在线观看| 日韩黄色免费网站| 欧美日韩免费不卡视频一区二区三区| 国产精品成人一区二区艾草 | 久久丝袜美腿综合| 麻豆一区二区99久久久久| 欧美日韩国产123区| 一区二区三区四区不卡视频| 色偷偷久久人人79超碰人人澡| 亚洲欧美怡红院| 在线观看视频91| 午夜电影一区二区三区| 日韩亚洲欧美一区| 国产一区二区三区在线观看免费 | 精品国产sm最大网站| 秋霞午夜鲁丝一区二区老狼| 欧美色综合天天久久综合精品| 国产欧美精品日韩区二区麻豆天美| 高清在线观看日韩| 国产精品进线69影院| 色菇凉天天综合网| 亚洲电影中文字幕在线观看| 91精品国产综合久久蜜臀| 久久电影国产免费久久电影| 国产夜色精品一区二区av| 懂色av噜噜一区二区三区av| 亚洲区小说区图片区qvod| 欧美日韩一二三区| 激情五月播播久久久精品| 亚洲国产精品ⅴa在线观看| 色综合一区二区三区| 日韩在线一二三区| 国产三级一区二区| 色噜噜偷拍精品综合在线| 性做久久久久久久免费看| 精品国产免费一区二区三区香蕉| 成人a区在线观看| 亚洲国产一二三| 久久午夜羞羞影院免费观看| 色欧美乱欧美15图片| 美国十次了思思久久精品导航| 国产欧美精品一区二区色综合朱莉 | 99久久综合狠狠综合久久| 亚洲美女在线国产| 精品少妇一区二区| 色呦呦一区二区三区| 久久99国产精品成人| 国产精品家庭影院| 欧美精选一区二区| 福利电影一区二区三区| 日产欧产美韩系列久久99| 国产精品乱码久久久久久| 88在线观看91蜜桃国自产| 国产成a人无v码亚洲福利| 亚洲国产精品影院| 中文字幕不卡在线观看| 在线观看日韩精品| 不卡一区二区中文字幕| 久久99久久99| 午夜激情一区二区| 亚洲欧美视频在线观看视频| 日韩美女视频一区二区在线观看| 91蝌蚪porny| 成人夜色视频网站在线观看| 日本一道高清亚洲日美韩| 亚洲视频小说图片| 国产日本亚洲高清| 日韩一二三区不卡| 欧美日韩视频在线第一区| 成+人+亚洲+综合天堂| 国产精品自拍三区| 亚洲成人在线免费| 国产视频不卡一区| 26uuu色噜噜精品一区二区| 欧美精品视频www在线观看| 在线视频一区二区三区| 国产mv日韩mv欧美| 国产麻豆精品久久一二三| 另类欧美日韩国产在线| 天天操天天综合网| 三级影片在线观看欧美日韩一区二区 | 日韩av一级电影| 午夜电影一区二区| 亚洲va天堂va国产va久| 一区二区三区在线观看动漫| 亚洲三级免费电影| 精品国产乱码久久久久久图片| 欧美一区二区三区在| 欧美欧美午夜aⅴ在线观看| 欧美在线视频不卡| 欧美视频第二页| 91麻豆精品国产| 欧美一区二区三区四区视频| 91精品国产91热久久久做人人| 一本大道久久a久久综合| 97se亚洲国产综合自在线| 99精品欧美一区二区三区小说 | 欧美日韩www| 欧美嫩在线观看| 日韩美一区二区三区| 精品国产乱码久久久久久浪潮| 制服丝袜亚洲色图| 欧美精品久久久久久久久老牛影院| 在线观看91精品国产麻豆| 91精品一区二区三区在线观看| 日韩欧美一区二区免费| 在线综合视频播放| 精品少妇一区二区三区| 日本一区二区三区电影| 亚洲三级在线免费观看| 亚洲chinese男男1069| 蜜臀国产一区二区三区在线播放| 日韩高清中文字幕一区| 精品中文字幕一区二区小辣椒| 国产精品一二三四| 色综合色综合色综合| 91黄色激情网站| 在线播放日韩导航| 欧美日韩国产bt| 国产亚洲女人久久久久毛片| 综合欧美亚洲日本| 日韩二区三区在线观看| 国产成人在线免费| 成人激情小说乱人伦| 欧美日韩精品欧美日韩精品一| 精品国产sm最大网站免费看| 亚洲欧美另类久久久精品| 亚洲在线中文字幕| 国产一区二区女| 欧美视频一二三区| 国产亚洲欧洲997久久综合| 夜夜操天天操亚洲| 国产露脸91国语对白| a4yy欧美一区二区三区| 色综合色综合色综合| 91啪在线观看| 中文欧美字幕免费| 国产在线不卡一区| 欧美一区在线视频| 亚洲欧美成人一区二区三区| 国产精品资源网| 日韩女优视频免费观看| 亚洲高清视频在线| 欧洲一区二区三区在线| 亚洲日本免费电影| 国产一区二区精品在线观看| 欧美一区三区二区| 亚洲一区二区不卡免费| 91在线观看视频| 国产精品久久午夜夜伦鲁鲁| 国产高清视频一区| 国产午夜亚洲精品理论片色戒| 精品一区二区三区免费视频| 在线综合视频播放| 日韩精彩视频在线观看| 91.xcao| 日韩va亚洲va欧美va久久| 欧美日韩一级大片网址| 亚洲国产成人av网| 欧美日韩一卡二卡三卡 | 欧美午夜影院一区| 一区二区三区资源| 在线中文字幕一区| 亚洲超碰精品一区二区| 欧美久久久久久久久中文字幕| 午夜免费欧美电影| 6080日韩午夜伦伦午夜伦| 日韩精品亚洲一区二区三区免费| 欧美日韩综合一区| 天天免费综合色| 精品裸体舞一区二区三区| 精品一区二区三区在线观看 | 94-欧美-setu| 一区二区三区欧美在线观看| 欧美日韩免费观看一区二区三区| 亚洲成a人v欧美综合天堂| 日韩一区二区精品在线观看| 捆绑调教一区二区三区| 欧美精品一区二区三区蜜桃| 成人午夜视频免费看| 日韩美女视频一区二区| 精品视频免费在线| 久久se这里有精品| 中文字幕成人网| 在线观看免费成人| 天天色图综合网| 国产欧美精品一区aⅴ影院 | 在线免费视频一区二区| 三级亚洲高清视频| 久久久精品一品道一区| 色狠狠av一区二区三区| 秋霞国产午夜精品免费视频| 国产精品欧美一级免费| 欧美日韩国产高清一区二区三区| 久久99久久99小草精品免视看| 国产精品理论在线观看| 91.成人天堂一区|