optimtips.m

New and old users of optimization in MATLAB will find useful tips and tricks in this document.
%{
Regression and optimization tips & tricks of the trade - using
matlab and the optimization toolbox in ways that you may never have
considered. Think of this text as a Matlab work book. I've included
the tricks I see asked about many times on the comp.soft-sys.matlab
newsgroup. If I've left out your favorite trick, just drop me a line
and I'll try to add it in. This article is not intended as a FAQ,
however.

Much of what I'll say in here has to do with curve-fitting and
parameter estimation. It is, after all, a fertile source of
optimization problems. I'll do my best to look at as many of
the different optimizers as possible. This may help a novice
optimizer to use the solutions found in this text as a set of
templates for their problems.

Is the order of the topics I've chosen important? I've tried to make
it all flow as well as possible, but I won't always succeed. Feel
free to skip to the section that holds your own interest.

Apologies are due to my readers who live with a release of Matlab
older than release 14. Almost all of the examples here are written
using anonymous functions. Luckily, most of these examples will
still be quite readable, so all may not be lost. If I hear enough
requests I may be willing to expand these examples, including
versions which are accessible to users of older releases.

Using this workbook

This text is written with matlab's cell mode in mind. Each section
has blocks of executable code that are surrounded with a %% before
and after. This allows you to execute that block directly in matlab.
Anything that I want to say inside a block will be in the form of
a matlab comment. Users of matlab releases that do not support
cell mode can always use copy and paste to execute these blocks
of matlab commands.

I've also used the block comment form, %{ and %}, which became
an option in release 14 of matlab. In effect, every line of this
article is either an executable line of matlab code, or a valid
matlab comment. Users of older releases will just have to forgive
me once more.

If you do have release 14 or beyond, then try "Publish to HTML".
It's an option on the editor's file menu, or a button on the editor
task bar. Give Matlab a minute or two to run through all of my
examples, then matlab produces a nice document that can be read
through at your leisure.
%}

%% 1. Linear regression basics in matlab
%{
I'll start with some linear regression basics. While polyfit does
a lot, a basic understanding of the process is useful.

Let's assume that you have some data in the form y = f(x) + noise.
We'll make some up and plot it.
%}

%%
% Execute this cell.
x = sort(rand(20,1));
y = 2 + 3*x + randn(size(x));
plot(x,y,'o')
title 'A linear relationship with added noise'

%%
% We'd like to estimate the coefficients of this model from the data.
% Many books show a solution using the normal equations.
M = [ones(length(x),1),x];
% These are the normal equations.
coef = inv(M'*M)*M'*y
% coef contains regression estimates of the parameters
yhat = M*coef;
plot(x,y,'o',x,yhat,'-')
title 'Linear regression model'

%%
% A better solution uses \. Why? Because \ is more numerically
% stable than is inv. It's something you will appreciate one day
% when your data is nasty. In this case, the different methods
% will be indistinguishable. Use \ anyway.
disp 'Use of \'
coef2 = M\y
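%%
% An illustrative aside (a quick sketch, not one of the original
% examples): one way to see why \ is preferred over explicitly
% forming inv(M'*M) is to compare condition numbers. Building M'*M
% roughly squares the conditioning of the problem, so the normal
% equations are the more fragile route when the data is badly scaled.
disp 'Condition numbers of M and M''*M'
cond(M)
cond(M'*M)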
%%
% Pinv is also an option. It too is numerically stable, but it
% will yield subtly different results when your matrix is singular
% or nearly so. Is pinv better? There are arguments for both \ and
% pinv. The difference really lies in what happens on singular or
% nearly singular matrices. See the sidebar below.
% Pinv will not work on sparse problems, and since pinv relies on
% the singular value decomposition, it may be slower for large
% problems.
disp 'Use of pinv'
coef3 = pinv(M)*y

% Large-scale problems where M is sparse may sometimes benefit
% from a sparse iterative solution. An iterative solver is overkill
% on this small problem, but ...
disp 'Use of lsqr'
coef4 = lsqr(M,y,1.e-13,10)

% There is another option, lscov. lscov is designed to handle problems
% where the data covariance matrix is known. It can also solve a
% weighted regression problem (see section 3).
disp 'Use of lscov'
coef5 = lscov(M,y)

%%
% Directly related to the \ solution is one based on the QR
% factorization. If our over-determined system of equations to
% solve is M*coef = y, then a quick look at the normal equations,
%
%   coef = inv(M'*M)*M'*y
%
% combined with the qr factorization of M,
%
%   M = Q*R
%
% yields
%
%   coef = inv(R'*Q'*Q*R)*R'*Q'*y
%
% Of course, we know that Q is an orthogonal matrix, so Q'*Q is
% an identity matrix.
%
%   coef = inv(R'*R)*R'*Q'*y
%
% If R is non-singular, then inv(R'*R) = inv(R)*inv(R'), so
% we can further reduce to
%
%   coef = inv(R)*Q'*y
%
% Finally, recognize that this is best written in matlab
% (especially for upper triangular R) as
%
%   coef = R\(Q'*y)
%
% Why show this solution at all? Because later on, when we discuss
% confidence intervals on the parameters, this will prove useful.
disp 'Use of an explicit qr factorization'
[Q,R] = qr(M,0);
coef6 = R\(Q'*y)

%%
% Note that when we generated our data above, we added random noise
% using the function randn. Randn generates uncorrelated Gaussian
% (normally distributed) noise. In fact, the model that we chose
% was the correct model for our data. In some cases the choice of
% model will be only a guess.
x2 = sort(rand(50,1));
y2 = 1 + 2*x2 - 4*x2.^2 + randn(size(x2))/10;
% Let's fit this data with our same linear model.
M2 = [ones(length(x2),1),x2];
coef = M2\y2
% and plot the results
yhat2 = M2*coef;
plot(x2,y2,'o',x2,yhat2,'-')
title 'Linear model through quadratic data'

%%
% Plotting the residuals shows the clear lack of fit in our model.
% I'll leave any more discussion of basic regression analysis to a
% good text on the subject. Draper and Smith, "Applied Regression
% Analysis", was always a favorite of mine.
res2 = y2 - yhat2;
plot(x2,res2,'o')
title 'Residuals for a linear model through quadratic data'

%%
% Sidebar: Pinv uses a singular value decomposition, whereas \
% uses a qr factorization for non-square matrices. The difference?
% Let's try out the alternative solutions on a singular problem,
% with no noise in the data.
M = rand(10,2);
M = M(:,[1 2 2]);
y = sum(M,2);
disp 'Singular matrix: pinv'
coef1 = pinv(M)*y
disp 'Singular matrix: \'
coef2 = M\y
disp 'Singular matrix: lsqr'
coef3 = lsqr(M,y)
disp 'Singular matrix: lscov'
coef4 = lscov(M,y)
% Lsqr produces a solution with pinv-like characteristics, while
% lscov is clearly similar to \.

%%
% Note that \ gave a warning of rank deficiency, and that since
% the second and third columns of M were replicates, the two
% solutions are really equivalent. Except that \ resulted in a
% zero coefficient for the third column. Pinv has the property
% that in the case of singularity, it will produce the minimum
% norm solution.
[norm(coef1),norm(coef2)]
% Either solution, [1 1 1]' or [1 2 0]', was equally valid, but the
% pinv solution had a lower norm.

%% 2. Polynomial regression models
%{
Arguably the most common linear regression model is the polynomial
model. In a simple case, we may wish to estimate the coefficients
(a and b) of the model

  y = a*x + b

As I showed in the previous section, this is easily done using \,
or any of a variety of other tools in matlab. We could also have
done the regression estimation using polyfit.

Note that polyfit returns its polynomial with terms in order
from the highest power down.

You can also build more general polynomial models, with your choice
of terms, or in multiple dimensions, using polyfitn. It's here on
the file exchange:

http://www.mathworks.com/matlabcentral/fileexchange/loadFile.do?objectId=10065&objectType=FILE
%}

%%
% A linear model estimated using polyfit
x = sort(rand(20,1));
y = 3*x + 2 + randn(size(x));
p = polyfit(x,y,1)
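%%
% A brief added sketch (reusing x, y and p from the cell above) of
% the ordering note: polyfit returns p = [slope, constant], highest
% power first, while the \ solution built from the columns [1, x]
% returns [constant; slope]. polyval expects polyfit's ordering.
coefBS = [ones(size(x)),x]\y    % [constant; slope] via backslash
[fliplr(p)', coefBS]            % same estimates (to roundoff), two orderings
yhatP = polyval(p,x);           % evaluate the polyfit model
plot(x,y,'o',x,yhatP,'-')
title 'polyfit model evaluated with polyval'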
%% 3. Weighted regression models
%{
What do you do when you have weights? How should we interpret
regression weights anyway?

Suppose we knew one particular data point had much lower error
than the rest. We might just choose to replicate that data point
multiple times. That replicated point will drag the sums of
squares of errors around. Let's try it out.
%}

%%
x = sort(rand(20,1));
y = 2 + 3*x + randn(size(x));
% replicating the point (1,5) 20 times
nreps = 20;
x2 = [x;ones(nreps,1)];
y2 = [y;repmat(5,nreps,1)];
% and solve
M2 = [ones(size(x2)),x2];
coef = M2\y2
yhat = M2*coef;
close
plot(x2,y2,'o',x2,yhat,'-')
title 'A weighted regression, weighting by replication'
% note that the error in the replicated point was probably pretty
% small, much lower than the rest of the data.

%%
% We can emulate this point replication using a weighted regression.
% Note the sqrt(nreps) in this approach to the weighted regression.
x3 = [x;1];
y3 = [y;5];
nreps = 20;
weights = [ones(size(y));sqrt(nreps)];
M3 = [ones(size(x3)),x3];
% Just multiply each row of M and the corresponding y by its weight.
coef = (repmat(weights,1,2).*M3)\(weights.*y3)

%%
% Weighted regression is one of the abilities of lscov.
weights = [ones(size(y));nreps];
coef = lscov(M3,y3,weights)

%%
% Are regression weights really used as a description of the known
% variance of your data? Clearly the examples above show that
% weights are interpreted as a relative replication factor. Thus
% a weight of k for a point is equivalent to having replicated the
% given point k times.
% Does this mean that a weighted regression with all its weights
% equal to some large value will yield a different result?
n = 20;
x = sort(rand(n,1));
y = 2 + 3*x + randn(size(x));
M = [ones(n,1),x];
% The unweighted result
coef0 = lscov(M,y)

%%
% With weights all equal to 10
w = 10;
weights = w*ones(n,1);
coef1 = lscov(M,y,weights)
% The estimates are identical to the unweighted result. Likewise,
% any confidence limits derived for the model will also be unchanged.
% Thus weights in this context are PURELY relative weights. Doubling
% all of the weights will not reflect any overall belief on your
% part that the data is more accurate.
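%%
% A small added check of the claim above (a sketch, reusing M, y and
% weights from the previous cell): rescaling every weight by the same
% factor, here doubling them, leaves the lscov estimates unchanged.
coefDoubled = lscov(M,y,2*weights)
% coefDoubled matches coef1 to roundoff; only the relative sizes of
% the weights matter.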
%% 4. Robust estimation
%{
Outliers are the source of many difficulties for estimation problems.
Least squares estimation, linear or nonlinear, will be dragged around
by points with large residuals. If these large residual points do not
arise because of the expected normal distribution, but actually arise
from some other distribution mixed in, then the least squares estimates
may well be wildly off.

In this case, some sort of trimming or iterative re-weighting scheme
may be appropriate. Iterative re-weighting simply means to compute
a regression model, then generate weights which are somehow inversely
related to the residual magnitude. Very often this relationship will
be highly nonlinear; perhaps the 5% of the points with the largest
residuals will be assigned a zero weight, and the rest of the points
their normal weight. Then redo the regression model as a weighted
regression.
%}

%%
n = 50;
m = 5;
x = rand(n,1);
y = 2 + 3*x + randn(size(x))/10;
% Now mix in a few outliers. Since the data is in random order,
% just add some noise to the first few data points.
y(1:m) = y(1:m) + exp(rand(m,1)*3);
close
plot(x,y,'o')
title 'Outliers in the data'

%%
% Fit with a simple first order linear model
M = [ones(n,1),x];
coef0 = M\y

%%
% Compute the residuals for the current model, then make up some
% weights based on the residuals, then fit. Iterate a few times.
for i = 1:3
  res = M*coef0 - y;
  weights = exp(-3*abs(res)/max(abs(res)));
  % compute the weighted estimate using these weights
  coef1 = lscov(M,y,weights)
  coef0 = coef1;
end

%{
This final estimate of coef1 will usually be closer to the known
coefficients than the first (unweighted) estimate.

I chose a fairly arbitrary weight transformation. For those
who are interested, potentially better choices may be found in
one of these texts:

"Robust Statistics", P.J. Huber

"Data Analysis and Regression: A Second Course in Statistics",
F. Mosteller, J.W. Tukey

"Understanding Robust and Exploratory Data Analysis", D.C. Hoaglin
%}
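%%
% A sketch of one of the better-studied alternatives (added for
% illustration; not the weighting used above): Huber-style weights
% leave small residuals at full weight and downweight large ones as
% k/|r|, where k is a tuning constant times a robust scale estimate.
coef0 = M\y;                  % restart from the plain least squares fit
for i = 1:3
  res = M*coef0 - y;
  s = 1.4826*median(abs(res - median(res)));   % robust scale via the MAD
  w = min(1, 1.345*s./max(abs(res),eps));      % Huber-type weights in (0,1]
  coef1 = lscov(M,y,w)
  coef0 = coef1;
end
% As with the exponential weights above, the iterated estimates should
% settle near the known coefficients [2;3] despite the outliers.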

?? 快捷鍵說(shuō)明

復(fù)制代碼 Ctrl + C
搜索代碼 Ctrl + F
全屏模式 F11
切換主題 Ctrl + Shift + D
顯示快捷鍵 ?
增大字號(hào) Ctrl + =
減小字號(hào) Ctrl + -
亚洲欧美第一页_禁久久精品乱码_粉嫩av一区二区三区免费野_久草精品视频
精品精品欲导航| 欧美国产日韩a欧美在线观看| 久久精品国产成人一区二区三区| 久久精品在线观看| 欧美精品777| 99综合电影在线视频| 免费成人结看片| 夜夜操天天操亚洲| 国产偷国产偷精品高清尤物| 欧美日韩一区精品| av成人动漫在线观看| 久久精品国产在热久久| 亚洲国产日韩a在线播放| 国产欧美精品在线观看| 91精品国产福利| 在线免费一区三区| 成人动漫视频在线| 国产精品一区在线观看你懂的| 亚洲.国产.中文慕字在线| 中文一区一区三区高中清不卡| 日韩限制级电影在线观看| 欧美日韩你懂得| 91免费看`日韩一区二区| 成人亚洲一区二区一| 九色综合国产一区二区三区| 日韩精品一二区| 亚洲成人av中文| 亚洲一区二区三区在线看| 亚洲欧洲色图综合| 国产精品麻豆视频| 欧美国产禁国产网站cc| 久久久久国产精品人| 精品国产污污免费网站入口| 911精品国产一区二区在线| 在线观看视频欧美| 91精品1区2区| 91福利在线免费观看| 色网站国产精品| 色偷偷一区二区三区| 91小视频在线免费看| 不卡av电影在线播放| 成人午夜免费视频| 成年人午夜久久久| av不卡免费电影| 91免费版pro下载短视频| av亚洲精华国产精华精华| www.66久久| 色美美综合视频| 欧洲一区在线电影| 欧美日韩一区成人| 91精品综合久久久久久| 欧美一级夜夜爽| 久久亚洲一区二区三区四区| 国产亚洲欧美一级| 中文字幕一区二区三区在线播放| 欧美国产乱子伦| 亚洲天堂精品在线观看| 亚洲午夜在线电影| 日韩国产在线一| 国产美女娇喘av呻吟久久| 国产999精品久久| 色综合久久中文字幕| 在线观看网站黄不卡| 91精品国产品国语在线不卡| 精品国产伦一区二区三区观看方式 | 亚洲欧洲av色图| 一区二区在线观看视频在线观看| 亚洲6080在线| 国产一级精品在线| 91在线观看高清| 欧美日本国产一区| 欧美精品一区二区久久婷婷| 中文av字幕一区| 亚洲18女电影在线观看| 激情综合色播激情啊| 99久久精品费精品国产一区二区| 欧美一a一片一级一片| 欧美成人在线直播| 1024亚洲合集| 欧美aaaaaa午夜精品| 成人综合激情网| 欧美人狂配大交3d怪物一区| xvideos.蜜桃一区二区| 亚洲免费av观看| 精品一区二区三区免费毛片爱| 97精品国产97久久久久久久久久久久| 欧美日韩国产中文| 欧美极品xxx| 日韩av网站免费在线| 成人av午夜影院| 91精品国模一区二区三区| 日本一区二区三区dvd视频在线| 亚洲久草在线视频| 激情欧美一区二区三区在线观看| 一本大道久久a久久精品综合| 日韩一区二区麻豆国产| 亚洲男人天堂av网| 国产一区欧美一区| 精品视频在线看| 国产精品美女www爽爽爽| 老司机精品视频在线| 在线免费观看日本欧美| 久久精品在线免费观看| 五月婷婷久久综合| 一本色道**综合亚洲精品蜜桃冫| 久久男人中文字幕资源站| 香蕉成人啪国产精品视频综合网| 顶级嫩模精品视频在线看| 日韩片之四级片| 午夜不卡在线视频| 一本久久精品一区二区| 国产精品入口麻豆九色| 国产综合色视频| 欧美精品v国产精品v日韩精品| 亚洲婷婷综合久久一本伊一区| 国产精品2024| 日韩精品一区二区三区三区免费| 亚洲国产欧美另类丝袜| 91婷婷韩国欧美一区二区| 日本一区二区三区高清不卡| 国模冰冰炮一区二区| 日韩欧美国产精品一区| 三级精品在线观看| 欧美最猛性xxxxx直播| 国产精品国产三级国产aⅴ无密码 国产精品国产三级国产aⅴ原创 | 国产91丝袜在线播放九色| 日韩欧美一级二级三级| 石原莉奈一区二区三区在线观看| 91色|porny| 国产麻豆精品在线观看| 日韩一区二区三区在线| 三级亚洲高清视频| 欧美电影影音先锋| 亚洲高清视频在线| 欧美日韩黄色影视| 亚洲男人的天堂av| 色综合久久88色综合天天免费| 亚洲欧洲色图综合| 91丨九色丨国产丨porny| 中文字幕亚洲区| 97se狠狠狠综合亚洲狠狠| 亚洲视频在线一区二区| 一本色道久久综合亚洲aⅴ蜜桃 | 91精品国产综合久久久久久久久久| 亚洲综合999| 欧美日韩色一区| 亚洲成人你懂的| 91精品国产综合久久久蜜臀图片 | 极品美女销魂一区二区三区免费 | 日韩欧美国产一区在线观看| 免费观看一级特黄欧美大片| 日韩女优毛片在线| 老汉av免费一区二区三区| 久久综合一区二区| 国产精品99久久久久久有的能看| 国产肉丝袜一区二区| 99久久亚洲一区二区三区青草| 亚洲免费av观看| 欧美精品久久天天躁| 精品一区二区三区免费播放| 久久久精品国产免费观看同学| 国产aⅴ综合色| 亚洲人成影院在线观看| 欧美日韩在线观看一区二区 | 精品国产伦理网| 成人精品免费网站| 亚洲欧美日本韩国| 91精品国产91久久综合桃花| 激情欧美日韩一区二区| 亚洲日本电影在线| 51精品秘密在线观看| 国产福利一区二区| 亚洲欧美综合另类在线卡通| 欧美精品高清视频| 国产精品一区二区视频| 一区二区成人在线| 日韩精品资源二区在线| 成人白浆超碰人人人人| 亚洲成人免费电影| 久久久久久久久97黄色工厂| 色哟哟国产精品免费观看| 久久99久久精品| 一区二区三区欧美激情| 精品少妇一区二区三区视频免付费 | 国产精品久久久久久久第一福利 | 日韩电影在线观看一区| 国产午夜亚洲精品理论片色戒 | 首页国产丝袜综合| 国产日韩精品一区| 欧美人狂配大交3d怪物一区| 成人午夜av影视| 麻豆成人久久精品二区三区小说| 欧美激情一区二区三区不卡| 在线综合亚洲欧美在线视频| 成人精品国产免费网站| 蜜臀av一区二区在线免费观看| 一色桃子久久精品亚洲| 精品福利一二区| 欧美日本国产视频| 91理论电影在线观看|