readme

SVM is a commonly used machine-learning algorithm for pattern classification.

Libsvm is a simple, easy-to-use, and efficient software for SVM
classification and regression. It solves C-SVM classification, nu-SVM
classification, one-class-SVM, epsilon-SVM regression, and nu-SVM
regression. It also provides an automatic model selection tool for
C-SVM classification. This document explains the use of libsvm.

Libsvm is available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
Please read the COPYRIGHT file before using libsvm.

Table of Contents
=================

- Quick Start
- Installation
- `svm-train' Usage
- `svm-predict' Usage
- Tips on Practical Use
- Examples
- Precomputed Kernels
- Library Usage
- Java Version
- Building Windows Binaries
- Additional Tools: Model Selection, Sub-sampling, etc.
- Python Interface
- Additional Information

Quick Start
===========

If you are new to SVM and if the data is not large, please go to the
`tools' directory and use easy.py after installation. It does
everything automatically -- from data scaling to parameter selection.

Usage: easy.py training_file [testing_file]

More information about parameter selection can be found in
tools/README.

Installation
============

On Unix systems, type `make' to build the `svm-train' and `svm-predict'
programs. Run them without arguments to show their usage.

On other systems, consult `Makefile' to build them (e.g., see
'Building Windows binaries' in this file) or use the pre-built
binaries (Windows binaries are in the directory `windows').

The format of a training or testing data file is:

<label> <index1>:<value1> <index2>:<value2> ...

For classification, <label> is an integer indicating the class label
(multi-class is supported). For regression, <label> is the target
value, which can be any real number. For one-class SVM, it is not used
and can be any number. Except when using precomputed kernels
(explained in another section), <index>:<value> gives a feature
(attribute) value: <index> is an integer starting from 1 and <value>
is a real number. Indices must be in ASCENDING order. Labels in the
testing file are only used to calculate accuracy or errors. If they
are unknown, just fill the first column with any numbers.
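For illustration, a small hypothetical training file with two classes and
three features could look like this (the format is sparse, so features
whose value is zero may simply be omitted; indices must stay ascending):

        +1 1:0.708 2:1
        -1 1:0.583 3:-0.333
        +1 2:0.166 3:1
        -1 1:-0.25 2:0.875

A testing file uses exactly the same layout; as noted above, its labels are
only used to compute accuracy or errors.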
A sample classification data set included in this package is
`heart_scale'. Type `svm-train heart_scale', and the program will read
the training data and output the model file `heart_scale.model'. If
you have a test set called heart_scale.t, then type `svm-predict
heart_scale.t heart_scale.model output' to see the prediction
accuracy. The `output' file contains the predicted class labels.

There are some other useful programs in this package.

svm-scale:
        This is a tool for scaling input data files.

svm-toy:
        This is a simple graphical interface which shows how SVM
        separates data in a plane. You can click in the window to
        draw data points. Use the "change" button to choose class
        1, 2 or 3 (i.e., up to three classes are supported), the "load"
        button to load data from a file, the "save" button to save data
        to a file, the "run" button to obtain an SVM model, and the
        "clear" button to clear the window.

        You can enter options at the bottom of the window; the syntax
        of the options is the same as for `svm-train'.

        Note that "load" and "save" consider data in the
        classification but not the regression case. Each data point
        has one label (the color), which must be 1, 2, or 3, and two
        attributes (x-axis and y-axis values) in [0,1].

        Type `make' in the respective directories to build them.

        You need the Qt library to build the Qt version.
        (available from http://www.trolltech.com)

        You need the GTK+ library to build the GTK version.
        (available from http://www.gtk.org)

        We use Visual C++ to build the Windows version.
        The pre-built Windows binaries are in the `windows' directory.

`svm-train' Usage
=================

Usage: svm-train [options] training_set_file [model_file]
options:
-s svm_type : set type of SVM (default 0)
        0 -- C-SVC
        1 -- nu-SVC
        2 -- one-class SVM
        3 -- epsilon-SVR
        4 -- nu-SVR
-t kernel_type : set type of kernel function (default 2)
        0 -- linear: u'*v
        1 -- polynomial: (gamma*u'*v + coef0)^degree
        2 -- radial basis function: exp(-gamma*|u-v|^2)
        3 -- sigmoid: tanh(gamma*u'*v + coef0)
        4 -- precomputed kernel (kernel values in training_set_file)
-d degree : set degree in kernel function (default 3)
-g gamma : set gamma in kernel function (default 1/k)
-r coef0 : set coef0 in kernel function (default 0)
-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon : set the epsilon in the loss function of epsilon-SVR (default 0.1)
-m cachesize : set cache memory size in MB (default 100)
-e epsilon : set tolerance of termination criterion (default 0.001)
-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)
-b probability_estimates : whether to train an SVC or SVR model for probability estimates, 0 or 1 (default 0)
-wi weight : set the parameter C of class i to weight*C in C-SVC (default 1)
-v n : n-fold cross validation mode

The k in the -g option means the number of attributes in the input data.

Option -v randomly splits the data into n parts and calculates cross
validation accuracy/mean squared error on them.

`svm-predict' Usage
===================

Usage: svm-predict [options] test_file model_file output_file
options:
-b probability_estimates : whether to predict probability estimates, 0 or 1 (default 0); for one-class SVM only 0 is supported

model_file is the model file generated by svm-train.
test_file is the test data you want to predict.
svm-predict will produce output in the output_file.

Tips on Practical Use
=====================

* Scale your data. For example, scale each attribute to [0,1] or [-1,+1].
* For C-SVC, consider using the model selection tool in the tools directory.
* nu in nu-SVC/one-class-SVM/nu-SVR approximates the fraction of training
  errors and support vectors.
* If the data for classification are unbalanced (e.g. many positive and
  few negative), try different penalty parameters C via -wi (see the
  examples below).
* Specify a larger cache size (i.e., larger -m) for huge problems.

Examples
========

> svm-scale -l -1 -u 1 -s range train > train.scale
> svm-scale -r range test > test.scale

Scale each feature of the training data to be in [-1,1]. The scaling
factors are stored in the file range and then used for scaling the
test data.

> svm-train -s 0 -c 1000 -t 2 -g 0.5 -e 0.00001 data_file

Train a classifier with RBF kernel exp(-0.5|u-v|^2) and stopping
tolerance 0.00001.

> svm-train -s 3 -p 0.1 -t 0 -c 10 data_file

Solve SVM regression with linear kernel u'v and C=10, and epsilon = 0.1
in the loss function.

> svm-train -s 0 -c 10 -w1 1 -w-1 5 data_file

Train a classifier with penalty 10 for class 1 and penalty 50
for class -1.

> svm-train -s 0 -c 500 -g 0.1 -v 5 data_file

Do five-fold cross validation for the classifier using
the parameters C = 500 and gamma = 0.1.

> svm-train -s 0 -b 1 data_file
> svm-predict -b 1 test_file data_file.model output_file

Obtain a model with probability information and predict test data with
probability estimates.

Precomputed Kernels
===================

Users may precompute kernel values and input them as the training and
testing files. Then libsvm does not need the original training/testing
sets.

Assume there are L training instances x1, ..., xL. Let K(x, y) be the
kernel value of two instances x and y. The input formats are:

New training instance for xi:

        <label> 0:i 1:K(xi,x1) ... L:K(xi,xL)

New testing instance for any x:

        <label> 0:? 1:K(x,x1) ... L:K(x,xL)

That is, in the training file the first column must be the "ID" of
xi. In testing, ? can be any value.

All kernel values, including ZEROs, must be explicitly provided. Any
permutation or random subset of the training/testing files is also
valid (see the examples below).

Note: the format is slightly different from the precomputed kernel
package released in libsvmtools earlier.

Examples:

        Assume the original training data has three four-feature
        instances and the testing data has one instance:

        1  1:1 2:1 3:1 4:1
        3      2:3     4:3
        2          3:1

        1  1:1     3:1

        If the linear kernel is used, we have the following new
        training/testing sets:

        1  0:1 1:4 2:6  3:1
        3  0:2 1:6 2:18 3:0
        2  0:3 1:1 2:0  3:1

        1  0:? 1:2 2:0  3:1

        ? can be any value.

        Any subset of the above training file is also valid. For example,

        2  0:3 1:1 2:0  3:1
        3  0:2 1:6 2:18 3:0

        implies that the kernel matrix is

                [K(x3,x3) K(x3,x2)]   [1  0]
                [K(x2,x3) K(x2,x2)] = [0 18]
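The kernel values in the example above can be reproduced with a short
program. The sketch below is an illustration only and is not part of libsvm:
it hard-codes the three dense training vectors and the testing vector from
the example, evaluates the linear kernel K(x,y) = x'y, and prints rows in
the precomputed-kernel format; the "0:?" entry of the testing row is written
as 0:0 here, since ? can be any value.

/* gen_precomputed.c -- illustrative only, not part of libsvm */
#include <stdio.h>

#define NTRAIN 3
#define NFEAT  4

/* linear kernel K(a,b) = a'b on dense vectors */
static double dot(const double *a, const double *b)
{
        double s = 0;
        int i;
        for (i = 0; i < NFEAT; i++)
                s += a[i] * b[i];
        return s;
}

int main(void)
{
        /* dense versions of the sparse instances in the example above */
        double xtrain[NTRAIN][NFEAT] = {
                {1, 1, 1, 1},   /* label 1 */
                {0, 3, 0, 3},   /* label 3 */
                {0, 0, 1, 0},   /* label 2 */
        };
        int ytrain[NTRAIN] = {1, 3, 2};
        double xtest[NFEAT] = {1, 0, 1, 0};     /* label 1 */
        int i, j;

        /* training rows: <label> 0:<ID> 1:K(xi,x1) ... L:K(xi,xL) */
        for (i = 0; i < NTRAIN; i++) {
                printf("%d 0:%d", ytrain[i], i + 1);
                for (j = 0; j < NTRAIN; j++)
                        printf(" %d:%g", j + 1, dot(xtrain[i], xtrain[j]));
                printf("\n");
        }

        /* testing row: <label> 0:? 1:K(x,x1) ... L:K(x,xL) */
        printf("1 0:0");
        for (j = 0; j < NTRAIN; j++)
                printf(" %d:%g", j + 1, dot(xtest, xtrain[j]));
        printf("\n");
        return 0;
}

Running it prints the four rows listed above (with 0:0 in place of 0:?).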
Library Usage
=============

These functions and structures are declared in the header file
`svm.h'. You need to #include "svm.h" in your C/C++ source files and
link your program with `svm.cpp'. You can see `svm-train.c' and
`svm-predict.c' for examples showing how to use them.

Before you classify test data, you need to construct an SVM model
(`svm_model') using training data. A model can also be saved in
a file for later use. Once an SVM model is available, you can use it
to classify new data.

- Function: struct svm_model *svm_train(const struct svm_problem *prob,
                                        const struct svm_parameter *param);

    This function constructs and returns an SVM model according to
    the given training data and parameters.

    struct svm_problem describes the problem:

        struct svm_problem
        {
                int l;
                double *y;
                struct svm_node **x;
        };

    where `l' is the number of training data, and `y' is an array
    containing their target values (integers in classification, real
    numbers in regression). `x' is an array of pointers, each of which
    points to a sparse representation (an array of svm_node) of one
    training vector.

    For example, if we have the following training data:

    LABEL    ATTR1    ATTR2    ATTR3    ATTR4    ATTR5
    -----    -----    -----    -----    -----    -----
      1        0       0.1      0.2      0        0
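As a minimal sketch of how a small problem can be set up and trained through
this interface, the program below builds a toy two-point, two-attribute
C-SVC problem, trains it with an RBF kernel, and classifies one new point.
The svm_node termination convention (each sparse vector ends with an entry
whose index is -1) and the svm_parameter fields filled in below come from
`svm.h' and the bundled `svm-train.c', not from the text above, so treat
them as assumptions to be checked against your libsvm version.

/* demo.cpp -- illustrative only; compile together with svm.cpp, e.g.
 *
 *     g++ demo.cpp svm.cpp -o demo
 */
#include <stdio.h>
#include "svm.h"

int main()
{
        /* two sparse training vectors with two attributes each;
           each vector is terminated by an entry with index = -1 */
        struct svm_node x0[] = { {1, 0.0}, {2, 0.1}, {-1, 0.0} };
        struct svm_node x1[] = { {1, 1.0}, {2, 0.9}, {-1, 0.0} };
        struct svm_node *x[] = { x0, x1 };
        double y[] = { +1, -1 };        /* class labels */

        struct svm_problem prob;
        prob.l = 2;     /* number of training data */
        prob.y = y;     /* target values */
        prob.x = x;     /* sparse training vectors */

        /* parameters roughly matching svm-train's defaults */
        struct svm_parameter param;
        param.svm_type = C_SVC;
        param.kernel_type = RBF;
        param.degree = 3;
        param.gamma = 0.5;      /* 1/k with k = 2 attributes */
        param.coef0 = 0;
        param.cache_size = 100; /* MB */
        param.eps = 1e-3;
        param.C = 1;
        param.nu = 0.5;
        param.p = 0.1;
        param.shrinking = 1;
        param.probability = 0;
        param.nr_weight = 0;    /* no per-class C weights (-wi) */
        param.weight_label = NULL;
        param.weight = NULL;

        const char *err = svm_check_parameter(&prob, &param);
        if (err) {
                fprintf(stderr, "parameter error: %s\n", err);
                return 1;
        }

        struct svm_model *model = svm_train(&prob, &param);

        /* classify a point close to the first training vector */
        struct svm_node query[] = { {1, 0.1}, {2, 0.2}, {-1, 0.0} };
        printf("predicted label: %g\n", svm_predict(model, query));
        return 0;
}

svm_train prints its optimization progress to standard output; error
handling and freeing of the model are omitted to keep the sketch short.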
