optimtips.m

New and old users of optimization in MATLAB will find useful tips and tricks in this document.
%{
With no error in the data at all, our regression estimates will
be perfect. Let's suppose there is some noise on the data points
x(i). Look at the highest point in x. The extreme points will
have the highest leverage on your estimate of the slope. If the
noise on this point is positive, moving x higher, then this point
will have MORE leverage. It will also tend to bias the slope
estimate towards zero. High values of x with noise which decreases
their value will see their leverage decreased. A decrease in these
values would tend to bias the slope away from zero. But remember
that it's the points with high leverage that affect the slope
the most. The same effects happen in reverse at the bottom end
of our data.

The net effect is that errors in x will tend to result in slope
estimates that are biased towards zero. Can we back this up with
an experiment?
%}
%%
x = linspace(-1,1,201)';
y = 1 + x;
coef0 = [ones(size(x)) , x]\y
% as you would expect, the estimated parameters are exact (to
% within double precision noise.)
%%
% add some noise to x
u = x + randn(size(x))/10;
% while y is still known with no error
coef1 = [ones(size(u)) , u]\y
% as predicted, the second coefficient in this model was less than 1.
% (Note that I am willing to make this prediction on random data.)
%%
% Better would have been to form the model in its inverse form.
% The constant term will be different, but the slope should still
% be nominally 1.0.
coef2 = [ones(size(y)) , y]\u
%%
% We can even try it several more times, just in case you don't
% believe me.
for i = 1:10
  u = x + randn(size(x))/10;
  coef1 = [ones(size(u)) , u]\y;
  coef2 = [ones(size(y)) , y]\u;
  disp(['Errors in x: ',num2str(coef1(2)),', ',num2str(coef2(2))])
end
% Note that on every pass through this loop, the first slope
% estimate tends to be uniformly less than 1, whereas the second
% was fairly randomly above and below 1.0.
%%
% The errors in variables problem is also known as Total Least
% Squares. Here we wish to minimize the squared deviations of
% each point from the regression line. We'll now assume that
% both x and y have variability that we need to deal with.
x = linspace(-1,1,101)';
y = 1 + 2*x;
% add in some noise; the variance is the same for each variable.
x = x + randn(size(x))/10;
y = y + randn(size(x))/10;
% If we use the basic \ estimator, then the same errors-in-x
% problem as before rears its ugly head. The slope is biased
% towards zero.
coef0 = [ones(size(x)) , x]\y
%%
% The trick is to use principal components. In this case we can
% do so with a singular value decomposition.
M = [x-mean(x), y-mean(y)];
[u,s,v] = svd(M,0);
% The model comes from the (right) singular vectors.
v1 = v(:,1);
disp(['(x - ',num2str(mean(x)),')*',num2str(v1(2)), ...
  ' - (y - ',num2str(mean(y)),')*',num2str(v1(1)),' = 0'])
% Only a little algebra will be needed to convince you that
% this model is indeed approximately y = 1 + 2*x.
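%%
% A short sketch, not part of the original m-file: converting the
% singular-vector form of the TLS line into slope/intercept form.
% (The names tls_slope and tls_intercept are illustrative, not the
% author's.) The sign of v1 is arbitrary, but the ratio below is
% sign-invariant.
tls_slope = v1(2)/v1(1);
tls_intercept = mean(y) - tls_slope*mean(x);
disp(['TLS fit: y = ',num2str(tls_intercept),' + ', ...
  num2str(tls_slope),'*x'])
% The slope should land near 2 and the intercept near 1, with far
% less attenuation bias than the backslash estimate coef0 above.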
%% 14. Passing extra information/variables into an optimization
%{
Many optimizations involve extraneous variables to the objective
function. With respect to the optimizer they are fixed, but the
user may need to change these variables at will. A simple example
comes from nonlinear root-finding. My example will use erf, a
function for which an explicit inverse already exists. I'll give
several solutions to passing in these extra variables. (One I will
not recommend is the use of global variables. While they will work
for this problem, I rarely like to use them when any other solution
exists. There are many reasons for disliking globals; my favorite
is the confusion they can sometimes cause in debugging your code.)

The problem we will solve is computation of the inverse of

  y = erf(x)

when y is known, and subject to change.
%}
%%
% 1. We can embed the parameters inside an anonymous function.
y = 0.5;
fun = @(x) erf(x) - y;
% solve using fzero
start = 0;
x = fzero(fun,start,optimset('disp','iter'))
%%
% To convince ourselves that fzero was successful, make a direct
% call to erfinv.
erfinv(y)
%%
% The value of y has been embedded in fun. If we choose to change y,
% we must redefine the anonymous function.
y = -.5;
fun = @(x) erf(x) - y;
x = fzero(fun,start,optimset('disp','off'))
%%
% 2. An alternative is to pass in the value as an argument, passing
% it through fzero. The optimizers in the optimization toolbox allow
% the user to do so by appending the extra arguments after the
% options argument. This is an undocumented feature, because The
% MathWorks would prefer that we use anonymous functions.
% Here fun is defined as a function of two variables; y is no
% longer fixed at its value when the function is created.
fun = @(x,y) erf(x) - y;
y = 0.5;
start = 0;
x = fzero(fun,start,optimset('disp','off'),y)
x = fzero(fun,start,optimset('disp','off'),-0.5)
%%
% 3. For those individuals who prefer inline functions over anonymous
% functions, or who do not use release 14 or above, this solution
% looks just like the one above.
fun = inline('erf(x) - y','x','y');
y = 0.5;
start = 0;
x = fzero(fun,start,optimset('disp','off'),y)
x = fzero(fun,start,optimset('disp','off'),-0.5)
%%
%{
4. Using a nested function. Nested functions can only exist inside
other functions (and only in release 14 and above), so I'll define
a function that will enclose the nested function. It's naturally
called testnestfun. This function is already saved as an m-file
for your convenience. (Note that nestfun takes only x as an
argument; yi is shared with the enclosing workspace, which is the
whole point of nesting it.)

function x = testnestfun(y)
  function res = nestfun(x)
    res = erf(x) - yi;
  end
x = zeros(size(y));
start = 0;
for i = 1:prod(size(y))
  yi = y(i);
  x(i) = fzero(@nestfun,start,optimset('disp','off'));
end
end % testnestfun terminator
%}
% Solve a series of inverse problems.
x = [-.9:.1:.9]';
disp([x,testnestfun(x)])
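%%
% A brief sketch, not part of the original m-file: the anonymous
% function approach of solution 1 also handles a whole vector of
% targets by rebinding y inside a loop, much as testnestfun does.
% (yvec and xinv are illustrative names.)
yvec = [-.9:.1:.9]';
xinv = zeros(size(yvec));
for k = 1:prod(size(yvec))
  fk = @(x) erf(x) - yvec(k);  % yvec(k) is captured at creation time
  xinv(k) = fzero(fk,0,optimset('disp','off'));
end
% compare against the direct inverse; this should be eps-level
disp(max(abs(xinv - erfinv(yvec))))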
%% 15. Minimizing the sum of absolute deviations
%{
Minimizing the sums of squares of errors is appropriate when
the noise in your model is normally distributed, and it's not
uncommon to expect a normal error structure. But sometimes
we choose instead to minimize the sum of absolute errors.
How do we do this? It's a linear programming trick this time.
For each data point, we add a pair of unknowns called slack
variables. Thus

  y(i) = a + b*x(i) + u(i) - v(i)

Here the scalars a and b, and the vectors u and v, are all unknowns.
We will constrain both u(i) and v(i) to be non-negative. Solve
the linear programming system with equality constraints as
above, and the objective will be to minimize sum(u) + sum(v).
The total number of unknowns will be 2+2*n, where n is the
number of data points in our "regression" problem.
%}
%%
x = sort(rand(100,1));
y = 1 + 2*x + rand(size(x)) - .5;
close
plot(x,y,'o')
title 'Linear data with noise'
xlabel 'x'
ylabel 'y'
%%
% formulate the linear programming problem.
n = length(x);
% Our objective sums both u and v, and ignores the regression
% coefficients themselves.
f = [0 0 ones(1,2*n)]';
% a and b are unconstrained; the u and v vectors must be non-negative.
LB = [-inf -inf , zeros(1,2*n)];
% no upper bounds at all.
UB = [];
% Build the regression problem as EQUALITY constraints, when
% the slack variables are included in the problem.
Aeq = [ones(n,1), x, eye(n,n), -eye(n,n)];
beq = y;
% estimation using linprog
params = linprog(f,[],[],Aeq,beq,LB,UB);
% we can now drop the slack variables
coef = params(1:2)
% and plot the fit
plot(x,y,'o',x,coef(1) + coef(2)*x,'-')
title 'Linprog solves the sum of absolute deviations problem (1 norm)'
xlabel 'x'
ylabel 'y'
%% 16. Minimize the maximum absolute deviation
%{
We can take a similar approach to this problem as we did for the
sum of absolute deviations, although here we only need a single
pair of slack variables to formulate this as a linear programming
problem. The slack variables will correspond to the maximally
positive deviation and the maximally negative deviation. (As long
as a constant term is present in the model, only one slack variable
is truly needed. I'll develop this for the general case.)

Suppose we want to solve the linear "least squares" problem

  M*coef = y

in a mini-max sense. We really don't care what the other errors
do as long as the maximum absolute error is minimized. So we
simply formulate the linear programming problem (for positive
scalars u and v)

  min (u+v)
  M*coef - y <= u
  M*coef - y >= -v

If the coefficient vector (coef) has length p, then there are
2+p parameters to estimate in total.
%}
%%
% As usual, let's make up some data.
x = sort(rand(100,1));
y = pi - 3*x + rand(size(x)) - .5;
close
plot(x,y,'o')
title 'Linear data with noise'
xlabel 'x'
ylabel 'y'
%%
% Build the regression matrix for a model y = a + b*x + noise
n = length(x);
M = [ones(n,1),x];
% Our objective here is to minimize u+v
f = [0 0 1 1]';
% The slack variables have non-negativity constraints
LB = [-inf -inf 0 0];
UB = [];
% Augment the design matrix to include the slack variables;
% the result will be a set of general INEQUALITY constraints.
A = [[M,-ones(n,1),zeros(n,1)];[-M,zeros(n,1),-ones(n,1)]];
b = [y;-y];
% estimation using linprog
params = linprog(f,A,b,[],[],LB,UB);
% strip off the slack variables
coef = params(1:2)
%%
% The maximum positive residual
params(3)
%%
% And the most negative residual
params(4)
%%
% plot the result
plot(x,y,'o',x,coef(1) + coef(2)*x,'-')
title 'Linprog solves the infinity norm (minimax) problem'
xlabel 'x'
ylabel 'y'
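%%
% A quick check, not part of the original m-file: the minimax fit
% should have a smaller worst-case residual than ordinary least
% squares, even though its sum of squares is larger.
coef_ls = M\y;
disp(['Max abs residual, minimax fit: ', ...
  num2str(norm(y - M*coef,inf))])
disp(['Max abs residual, LS fit:      ', ...
  num2str(norm(y - M*coef_ls,inf))])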
%% 17. Batching small problems into large problems
%{
Suppose we wanted to solve many simple nonlinear optimization
problems, all of which are related. To pick one such example,
I'll arbitrarily decide to invert a zeroth order Bessel function
at a large set of points. I'll choose to know only that the root
lies in the interval [0,4].
%}
%%
% Solve for x(i), given that y(i) = bessel(0,x(i)).
n = 1000;
y = rand(n,1);
fun = @(x,y_i) bessel(0,x) - y_i;
% first, in a loop
tic
x = zeros(n,1);
for i = 1:n
  x(i) = fzero(fun,[0 4],optimset('disp','off'),y(i));
end
toc
% as a test, compare the min and max residuals
yhat = bessel(0,x);
disp(['Min & max residuals: ',num2str([min(y-yhat),max(y-yhat)])])
% tic and toc reported that this took roughly 9 seconds to run on
% my computer.
%%
% Can we do better? Suppose we considered this as a multivariable
% optimization problem, with hundreds of unknowns. We could batch
% many small problems into one large one, solving all our problems
% simultaneously. With the optimization toolbox this is possible,
% at least if we use the LargeScale solver in conjunction with the
% JacobPattern option.
% I'll use lsqnonlin because I chose to bound my solutions in the
% interval [0,4]. Fsolve does not accept bound constraints.
% define a batched objective function
batchfun = @(x,y) bessel(0,x) - y
options = optimset('lsqnonlin');
options.LargeScale = 'on';
options.TolX = 1.e-13;
options.TolFun = 1.e-13;
% I'll just put the first p = 10 problems in a batch
p = 10;
start = ones(p,1);
LB = repmat(0,p,1);
UB = repmat(4,p,1);
tic
xb = lsqnonlin(batchfun,start,LB,UB,options,y(1:p));
toc
% This took .1 seconds on my computer, roughly the same amount
% of time per problem as did the loop, so no gain was achieved.
%%
% Why was the call to lsqnonlin so slow? Because I did not tell
% lsqnonlin to expect that the Jacobian matrix would be sparse.
% How sparse is it? Recall that each problem is really independent
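%%
% The excerpt breaks off above, mid-argument. A sketch of where it
% is headed, assuming only the JacobPattern option already named:
% each sub-problem is independent of the others, so the Jacobian is
% diagonal. Declaring that sparsity pattern lets the large-scale
% solver finite-difference only the diagonal entries.
options.JacobPattern = speye(p,p);
tic
xb = lsqnonlin(batchfun,start,LB,UB,options,y(1:p));
toc
% With the sparsity pattern supplied, the per-problem cost of the
% batched solve should drop well below that of the fzero loop.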
