<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML//EN">
<html>
  <head>
    <title>libsvm faq</title>
  </head>
<body text="#000000" bgcolor="#FFEFD5" link="#FF0000" vlink="#0000FF">
    <h1><a href=http://www.csie.ntu.edu.tw/~cjlin/libsvm>LIBSVM</a> FAQ</h1>
<!-- Created: Wed Apr 18 19:26:54 CST 2001 -->
<!-- hhmts start -->
Last modified: Wed Mar 31 10:49:17 CST 2004
<!-- hhmts end -->
<hr>
<h3>Some courses which have used libsvm as a tool</h3>
<ul>
<li><a href=http://lmb.informatik.uni-freiburg.de/lectures/svm_seminar/>Institute for Computer Science, Faculty of Applied Science, University of Freiburg, Germany</a>
<li><a href=http://www.cs.vu.nl/~elena/ml.html>Division of Mathematics and Computer Science, Faculteit der Exacte Wetenschappen, Vrije Universiteit, The Netherlands</a>
<li><a href=http://www.cae.wisc.edu/~ece539/matlab/>Electrical and Computer Engineering Department, University of Wisconsin-Madison</a>
<li><a href=http://www.hpl.hp.com/personal/Carl_Staelin/cs236601/project.html>Technion (Israel Institute of Technology), Israel</a>
<li><a href=http://www.cise.ufl.edu/~fu/learn.html>Computer and Information Sciences Dept., University of Florida</a>
<li><a href=http://www.uonbi.ac.ke/acad_depts/ics/course_material/machine_learning/ML_and_DM_Resources.html>The Institute of Computer Science, University of Nairobi, Kenya</a>
<li><a href=http://cerium.raunvis.hi.is/~tpr/courseware/svm/hugbunadur.html>Applied Mathematics and Computer Science, University of Iceland</a>
</ul>
<hr>
<h3>Installation and running the program</h3>
<ul>
<p><li> Where can I find documents of libsvm ?
<p>In the package there is a README file which details all options, data format, and library calls. The model selection tool and the python interface have a separate README under the directory python. The guide <A HREF="http://www.csie.ntu.edu.tw/~cjlin/papers/guide/guide.pdf">A practical guide to support vector classification</A> shows beginners how to train/test their data. The paper <a href="http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf">LIBSVM: a library for support vector machines</a> discusses the implementation of libsvm in detail.
<p><li> What are the changes in previous versions ?
<p>See <a href="http://www.csie.ntu.edu.tw/~cjlin/libsvm/log">the change log</a>.
<p><li> I would like to cite libsvm. Which paper should I cite ?
<p>Please cite the following document:
<p>Chih-Chung Chang and Chih-Jen Lin, LIBSVM: a library for support vector machines, 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm
<p>The bibtex format is as follows:
<pre>
@Manual{CC01a,
  author =	 {Chih-Chung Chang and Chih-Jen Lin},
  title =	 {{LIBSVM}: a library for support vector machines},
  year =	 {2001},
  note =	 {Software available at {\tt http://www.csie.ntu.edu.tw/\verb"~"cjlin/libsvm}},
}
</pre>
<p><li> I would like to use libsvm in my software. Is there any license problem ?
<p>The libsvm license ("the modified BSD license") is compatible with many free software licenses such as the GPL. Hence, it is very easy to use libsvm in your software. It can also be used in commercial products.
<p><li> Is there a repository of additional tools based on libsvm ?
<p>Yes, see <a href="http://www.csie.ntu.edu.tw/~cjlin/libsvmtools">libsvm tools</a>.
<p><li> On unix machines, I got "error in loading shared libraries" or "cannot open shared object file." What happened ?
<p>This usually happens if you compile the code on one machine and run it on another machine which has incompatible libraries. Try to recompile the program on that machine, or use static linking.
<p><li> I have modified the source and would like to build the graphic interface "svm-toy" on MS windows. How should I do it ?
<p>Build it as a project by choosing "Win32 Application." On the other hand, for "svm-train" and "svm-predict" you want to choose "Win32 Console Application." After libsvm 2.5, you can also use the file Makefile.win. See details in README.
<p><li> I am an MS windows user, so why does only one (SVM_toy) of the precompiled .exe files actually run ?
<p>You need to open a command window and type svmtrain.exe to see all options. Some examples are in the README file.
</ul>
<hr>
<h3>Data preparation</h3>
<ul>
<p><li> Why do not all attributes of a data instance appear in the training/model files ?
<p>libsvm uses the so-called "sparse" format, where zero values do not need to be stored. Hence a data instance with attributes
<pre>1 0 2 0</pre>
is represented as
<pre>1:1 3:2</pre>
(A small sketch of this conversion appears at the end of this section.)
<p><li> What if my data are non-numerical ?
<p>Currently libsvm supports only numerical data. You may have to change non-numerical data to numerical. For example, you can use several binary attributes to represent a categorical attribute.
<p><li> Why do you consider the sparse format ? Will training on dense data be much slower ?
<p>This is a controversial issue. The kernel evaluation (i.e. inner product) of sparse vectors is slower, so the total training time can be at least two or three times that of the dense format. However, we cannot support only the dense format, as then we CANNOT handle extremely sparse cases. Simplicity of the code is another concern. For now we have decided to support the sparse format only.
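<p>To make the two items above concrete, here is a small sketch in plain Python (the helper name is assumed for illustration and is not part of libsvm) of how a dense row with one categorical attribute could be turned into a line of the sparse "index:value" format; zero values are simply skipped.
<pre>
# Sketch only: "to_libsvm_line" is an assumed name, not a libsvm function.
def to_libsvm_line(label, numeric, category, categories):
    # expand the categorical attribute into one binary attribute per value
    indicators = [1.0 if category == c else 0.0 for c in categories]
    features = numeric + indicators
    # keep only non-zero values, with 1-based attribute indices
    pairs = ["%d:%g" % (i + 1, v) for i, v in enumerate(features) if v != 0]
    return " ".join([str(label)] + pairs)

# numeric attributes (1, 0, 2, 0) and category "red" out of {red, green, blue}
print(to_libsvm_line(1, [1, 0, 2, 0], "red", ["red", "green", "blue"]))
# prints: 1 1:1 3:2 5:1
</pre>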
</ul>
<hr>
<h3>Training and prediction</h3>
<ul>
<p><li> The output of training C-SVM is like the following. What do they mean ?
<br>optimization finished, #iter = 219
<br>nu = 0.431030
<br>obj = -100.877286, rho = 0.424632
<br>nSV = 132, nBSV = 107
<br>Total nSV = 132
<p>obj is the optimal objective value of the dual SVM problem. rho is the bias term in the decision function sgn(w^T x - rho). nSV and nBSV are the number of support vectors and bounded support vectors (i.e., alpha_i = C). nu-SVM is a somewhat equivalent form of C-SVM where C is replaced by nu; nu simply shows the corresponding parameter. More details are in the <a href="http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.pdf">libsvm document</a>.
<p><li> Can you explain more about the model file ?
<p>After the parameters, each line represents a support vector. Support vectors are listed in the order of the "labels" listed earlier (i.e., those from the first class in the "labels" list are grouped first, and so on). If k is the total number of classes, in front of each support vector there are k-1 coefficients y*alpha, where alpha are the dual solutions of the following two-class problems:
<br>1 vs j, 2 vs j, ..., j-1 vs j, j vs j+1, j vs j+2, ..., j vs k
<br>and y=1 in the first j-1 coefficients, y=-1 in the remaining k-j coefficients. For example, if there are 4 classes, the file looks like:
<pre>
+-+-+-+--------------------+
|1|1|1|                    |
|v|v|v|  SVs from class 1  |
|2|3|4|                    |
+-+-+-+--------------------+
|1|2|2|                    |
|v|v|v|  SVs from class 2  |
|2|3|4|                    |
+-+-+-+--------------------+
|1|2|3|                    |
|v|v|v|  SVs from class 3  |
|3|3|4|                    |
+-+-+-+--------------------+
|1|2|3|                    |
|v|v|v|  SVs from class 4  |
|4|4|4|                    |
+-+-+-+--------------------+
</pre>
<p><li> Should I use float or double to store numbers in the cache ?
<p>We use float as the default so that you can store more numbers in the cache. In general this is good enough, but for a few difficult cases (e.g. very, very large C) where the solutions are huge numbers, the numerical precision of float may not be enough.
<p><li> How do I choose the kernel ?
<p>In general we suggest you try the RBF kernel first. A recent result by Keerthi and Lin (<a href=http://www.csie.ntu.edu.tw/~cjlin/papers/limit.ps.gz>download paper here</a>) shows that if RBF is used with model selection, then there is no need to consider the linear kernel. The kernel matrix using sigmoid may not be positive definite, and in general its accuracy is not better than RBF (see the paper by Lin and Lin, <a href=http://www.csie.ntu.edu.tw/~cjlin/papers/tanh.pdf>download paper here</a>). Polynomial kernels are ok, but if a high degree is used, numerical difficulties tend to happen (think about the dth power of a number: values less than 1 go to 0 and values greater than 1 go to infinity).
<p><li> Does libsvm have special treatments for linear SVM ?
<p>No, at this point libsvm solves linear and nonlinear SVMs in the same way. Note that there are some possible tricks to save training/testing time if the linear kernel is used. Hence libsvm is <b>NOT</b> particularly efficient for linear SVM, especially for problems whose number of data is much larger than the number of attributes. If you plan to solve this type of problem, you may want to check <a href=http://www.csie.ntu.edu.tw/~cjlin/bsvm>bsvm</a>, which includes an efficient implementation for linear SVMs. More details can be found in the following study: K.-M. Chung, W.-C. Kao, T. Sun, and C.-J. Lin, <A HREF="http://www.csie.ntu.edu.tw/~cjlin/papers/linear.pdf">Decomposition Methods for Linear Support Vector Machines</a>.
<p>On the other hand, you do not really need to solve linear SVMs. See the previous question about choosing kernels for details.
<p><li> The number of free support vectors is large. What should I do ?
<p>This usually happens when the data are overfitted. If the attributes of your data are in large ranges, try to scale them. Then the region of appropriate parameters may be larger. Note that there is a scaling program (svm-scale) in libsvm.
<p><li> Should I scale training and testing data in a similar way ?
<p>Yes, you can do the following (a small sketch of the idea follows this item):
<br> svm-scale -s scaling_parameters train_data > scaled_train_data
<br> svm-scale -r scaling_parameters test_data > scaled_test_data
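<p>The point of the two commands is that the scaling parameters are computed from the training data only and then re-used, unchanged, on the test data. Below is a minimal sketch of that idea in plain Python; it is only an illustration and does not reproduce svm-scale's parameter-file format.
<pre>
# Sketch only: scaling parameters come from the training set and are
# re-used for the test set, as "svm-scale -s ... / -r ..." does.
def fit_scaling(rows, lower=-1.0, upper=1.0):
    mins = [min(col) for col in zip(*rows)]
    maxs = [max(col) for col in zip(*rows)]
    return mins, maxs, lower, upper

def apply_scaling(rows, params):
    mins, maxs, lower, upper = params
    return [[lower + (upper - lower) * (x - m) / (M - m) if M > m else lower
             for x, m, M in zip(row, mins, maxs)]
            for row in rows]

train = [[1.0, 200.0], [3.0, 600.0], [2.0, 400.0]]
test = [[2.5, 500.0]]
params = fit_scaling(train)                 # like: svm-scale -s scaling_parameters train_data
scaled_train = apply_scaling(train, params)
scaled_test = apply_scaling(test, params)   # like: svm-scale -r scaling_parameters test_data
</pre>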
<p><li> Does it make a big difference if I scale each attribute to [0,1] instead of [-1,1] ?
<p>For the linear scaling method, if the RBF kernel is used and parameter selection is conducted, there is no difference. Assume Mi and mi are respectively the maximal and minimal values of the ith attribute. Scaling to [0,1] means
<pre>                x'=(x-mi)/(Mi-mi)</pre>
For [-1,1],
<pre>                x''=2(x-mi)/(Mi-mi)-1.</pre>
In the RBF kernel,
<pre>                x'-y'=(x-y)/(Mi-mi), x''-y''=2(x-y)/(Mi-mi).</pre>
Hence, using (C,g) on the [0,1]-scaled data is the same as (C,g/2) on the [-1,1]-scaled data.
<p>Though the performance is the same, the computational time may be different. For data with many zero entries, [0,1]-scaling keeps the sparsity of the input data and hence may save time.
<p><li> The prediction rate is low. How could I improve it ?
<p>Try to use the model selection tool grid.py in the python directory to find good parameters (a generic sketch of such a grid search appears at the end of this section). To see the importance of model selection, please see my talk: <A HREF="http://www.csie.ntu.edu.tw/~cjlin/talks/tuebingen.ps.gz">Can support vector machines become a major classification method ?</A>
<p><li> My data are unbalanced. Could libsvm handle such problems ?
<p>Yes, there is a -wi option. For example, if you use
<p> svm-train -s 0 -c 10 -w1 1 -w-1 5 data_file
<p>the penalty for class "-1" is larger. Note that this -w option is for C-SVC only.
<p><li> What is the difference between nu-SVC and C-SVC ?
<p>Basically they are the same thing but with different parameters. The range of C is from zero to infinity, but nu is always between [0,1]. A nice property of nu is that it is related to the ratio of support vectors and the ratio of the training error.
<p><li> The program keeps running without showing any output. What should I do ?
<p>You may want to check your data. Each training/testing instance must be on one line; it cannot be separated. In addition, you have to remove empty lines.
<p><li> The program keeps running (with output, i.e. many dots). What should I do ?
<p>In theory libsvm guarantees to converge if the kernel matrix is positive semidefinite. After version 2.4 it can also handle non-PSD kernels such as the sigmoid (tanh). If it keeps running, this means you are handling an ill-conditioned situation (e.g. too large/small parameters), so numerical difficulties occur.
<p><li> The training time is too long. What should I do ?
<p>This may happen for some difficult cases (e.g. -c is large). You can try to use a looser stopping tolerance with -e.
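<p>As a rough illustration of what grid.py automates (mentioned in the prediction-rate item above), the sketch below loops over an exponentially spaced grid of (C, gamma) pairs and keeps the best one. The "evaluate" callable is an assumed stand-in for running cross-validation with libsvm (for example via "svm-train -v 5") and returning the accuracy; it is not part of libsvm's API, and the grid ranges are only a common starting point.
<pre>
# Sketch only: a coarse (C, gamma) grid search; "evaluate" is an assumed
# callback that returns cross-validation accuracy for given parameters.
import itertools

def coarse_grid_search(evaluate):
    best_c, best_g, best_acc = None, None, -1.0
    # exponentially spaced values are a common starting grid
    for log2c, log2g in itertools.product(range(-5, 16, 2), range(-15, 4, 2)):
        c, g = 2.0 ** log2c, 2.0 ** log2g
        acc = evaluate(c, g)
        if acc > best_acc:
            best_c, best_g, best_acc = c, g, acc
    return best_c, best_g, best_acc
</pre>
<p>After the coarse search, a finer grid around the best (C, gamma) can be searched in the same way.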
