<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html>
<head><title>Bayesian inference in dynamic models -- an overview</title></head>
<body>
<h1><center>Bayesian inference in dynamic models -- an overview</center></h1>
<h3><center>by <a href="/~minka/">Tom Minka</a></center></h3>
<p>
The following algorithms all try to infer the hidden state of a dynamic
model from measurements.  The input is a dynamic model and a measurement
sequence, and the output is an approximate posterior distribution over the
hidden state at one or many times.  Only discrete-time models are discussed
here.
<p>
Inferring only the most recent hidden state is known as <b>filtering</b>;
inferring past states is known as <b>smoothing</b>.
Most filtering methods are <b>on-line</b>, which means they process each
measurement exactly once, after which it can be discarded.
This allows them to operate with a fixed amount of memory.
The opposite of <b>on-line</b> is <b>off-line</b> or <b>batch</b>.
There are standard ways to turn an on-line filtering algorithm into a batch
filtering or smoothing algorithm.
Therefore, the overview is divided into two parts: on-line filtering and
batch filtering/smoothing.
<p>
Some of these algorithms are general algorithms for approximate Bayesian
inference and others are specialized for dynamic models.  The description
of each algorithm comes with a partial list of references.  I've included
more references for algorithms which are less well-known.
<p>
Some related pages on the web:
<ul>
<li><a href="http://www.cs.berkeley.edu/~murphyk/Bayes/kalman.html">Kevin Murphy's Kalman filter toolbox</a>
</ul>
<hr>
<h2>On-line filtering algorithms</h2>
The algorithms are grouped according to how they represent the posterior
distribution over the hidden state (their <b>belief</b>).
<hr>
<h2>Gaussian belief</h2>
The following algorithms use a multivariate Gaussian for their belief.  In
fact, most of them are more general than this---they could use any
exponential family as their belief.
<p>
<dl>
<dt><b>Kalman filter</b>
<dd>The Kalman filter only applies to models with Gaussian noise, linear
state equations, and linear measurement equations, i.e.
<pre>
s_t = A s_(t-1) + noise
x_t = C s_t + noise
</pre>
For these models the state posterior really is Gaussian, and the Kalman
filter is exact.  A small sketch of one filter step follows the references.
<ul>
<li><a href="http://www.cs.unc.edu/~welch/kalman/">The Kalman filter</a>
<li><a href="http://vismod.media.mit.edu/tech-reports/TR-531-ABSTRACT.html">"From Hidden Markov Models to Linear Dynamical Systems"</a>, T. Minka, 1998
</ul>
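<p>
Here is a minimal sketch of one Kalman filter predict/update step for the
equations above.  <kbd>A</kbd> and <kbd>C</kbd> are the matrices from the
state and measurement equations; the noise covariances <kbd>Q</kbd> and
<kbd>R</kbd> and the belief names <kbd>m</kbd>, <kbd>V</kbd> are
illustrative choices, not notation from the references.
<pre>
import numpy as np

def kalman_step(m, V, x, A, C, Q, R):
    """One on-line filtering step for s_t = A s_(t-1) + noise, x_t = C s_t + noise.

    m, V : mean and covariance of the Gaussian belief on s_(t-1)
    x    : measurement x_t
    A, Q : state matrix and process-noise covariance
    C, R : measurement matrix and measurement-noise covariance
    """
    # Predict: push the belief through the linear dynamics.
    m_pred = A @ m
    V_pred = A @ V @ A.T + Q
    # Update: condition the predicted belief on the measurement x_t.
    S = C @ V_pred @ C.T + R              # predicted measurement covariance
    K = V_pred @ C.T @ np.linalg.inv(S)   # Kalman gain
    m_new = m_pred + K @ (x - C @ m_pred)
    V_new = V_pred - K @ C @ V_pred
    return m_new, V_new
</pre>
Because the model is linear-Gaussian, this step returns the exact posterior;
the filters below handle models where that is no longer true.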
</dd>
<p>
<dt><b>Extended Kalman filter</b>
<dd>The Extended Kalman filter applies to models with Gaussian noise.
The state and measurement equations are linearized by a Taylor expansion
about the current state estimate.  The noise variance in the equations is
not changed, i.e. the additional error due to linearization is not modeled.
After linearization, the Kalman filter is applied.
<ul>
<li>"Stochastic models, estimation and control", Peter S. Maybeck, Volume 2, Chapter 12, 1982.
<li>"A linear approximation method for probabilistic inference", Ross Shachter, UAI'1990.
</ul>
</dd>
<p>
<dt><b>Bottleneck filter</b>
<dd>This algorithm applies to any type of measurement equation.
The measurement equation is rewritten in terms of an intermediate
bottleneck variable <kbd>b_t</kbd>, such that <kbd>p(x_t|b_t)</kbd> is
simple while <kbd>p(b_t|s_t)</kbd> may be complicated.
At each time step, the Gaussian belief on <kbd>s_t</kbd> is converted into
a Gaussian belief on <kbd>b_t</kbd> (usually involving approximations),
<kbd>b_t</kbd> is updated exactly from <kbd>x_t</kbd> (since
<kbd>p(x_t|b_t)</kbd> is simple), and the new Gaussian belief on
<kbd>b_t</kbd> is converted back to a Gaussian belief on <kbd>s_t</kbd>.
("Bottleneck" is my own terminology.  In the West paper below, they used
Gamma distributions.)
<ul>
<li>"Dynamic Generalized Linear Models and Bayesian Forecasting", M. West, P. J. Harrison, &amp; H. S. Migon, J Am Stat Assoc 80:73-97, 1985.
</ul>
</dd>
<p>
<dt><b>Linear-update filter</b>, a.k.a. linear-regression filter or "statistical linearization" filter
<dd>This algorithm applies to any type of measurement equation.
The measurement equation is converted into a linear-Gaussian equation by
regressing the observation onto the state.
The result is a Kalman filter whose Kalman gain is
<kbd>cov(state,measurement)/var(measurement)</kbd>.
Note that the regression is done without reference to the actual measurement.
I call it "linear-update" because the update to the state is always a linear
function of the measurement.
A variety of approximations have been proposed when
<kbd>cov(state,measurement)</kbd> is not available analytically.  The
<b>unscented filter</b>, <b>central difference filter</b>, and
<b>divided difference filter</b> are filters of this type.
A Monte Carlo sketch of this kind of update appears after this list.
<ul>
<li>"Stochastic models, estimation and control", Peter S. Maybeck, Volume 2, Chapter 12, 1982.
<li><a href="http://citeseer.ist.psu.edu/457152.html">"Kalman Filters for nonlinear systems: a comparison of performance"</a>, Tine Lefebvre, Herman Bruyninckx, Joris De Schutter.
<li><a href="http://cslu.ece.ogi.edu/nsel/research/ukf.html">"The Unscented Kalman Filter for Nonlinear Estimation"</a>, Eric A. Wan and Rudolph van der Merwe, 2000.
</ul>
</dd>
<p>
<dt><b>Assumed-density filter</b>, a.k.a. moment matching
<dd>This algorithm chooses the Gaussian belief which is "closest", in some
sense, to the exact state posterior given previous beliefs.
Usually this amounts to matching the moments of the exact posterior.
This is the most general approximation technique proposed to date---it
utilizes not only the form of the measurement equation but also the
measurement itself.  The assumed-density filter requires analytic or
numerical solution of integrals over the measurement distribution.
For this, one could use Monte Carlo, quadrature, or Laplace's method.
<ul>
<li><a href="ep/">"Expectation Propagation for approximate Bayesian inference"</a>, T. Minka, Uncertainty in AI'2001.
<li>"Stochastic models, estimation and control", Peter S. Maybeck, Volume 2, Chapter 12.7, 1982.
<li><a href="http://www.unc.edu/depts/statistics/faculty/amarjit/pdffile/Amarjit7.ps">"A nonlinear filtering algorithm based on an approximation of the conditional distribution"</a>, H. J. Kushner and A. S. Budhiraja, IEEE Trans Automatic Control 45(3):580-585, 2000.
<li><a href="http://citeseer.ist.psu.edu/ito99gaussian.html">"Gaussian filters for nonlinear filtering problems"</a>, K. Ito and K. Q. Xiong, IEEE Trans Automatic Control 45(5):910-927, 2000.
<li><a href="http://uai.sis.pitt.edu/displayArticleDetails.jsp?mmnu=2&smnu=2&author_id=311&article_id=212">"Approximate Learning in Complex Dynamic Bayesian Networks"</a>, Settimi, Smith, &amp; Gargoum, UAI'1999.
<li><a href="http://www.ucl.ac.uk/Stats/research/Resrprts/psfiles/135.zip">"A comparison of sequential learning methods for incomplete data"</a>, R. G. Cowell, A. P. Dawid, &amp; P. Sebastiani, Bayesian Statistics 5, 1996.
<li><a href="http://www.google.co.uk/search?q=%22A+Bayesian+Approach+to+On%2Dline+Learning%22">"A Bayesian Approach to On-line Learning"</a>, Manfred Opper, On-Line Learning in Neural Networks, 1999.
</ul>
</dd>
<p>
<!-- Laplace ADF -->
</dl>
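<p>
The sketch below illustrates the linear-update filter when
<kbd>cov(state,measurement)</kbd> must be estimated.  It uses plain Monte
Carlo samples from the current belief; the unscented and difference filters
cited above replace the sampling with deterministic point sets.  The
measurement function <kbd>h</kbd>, the noise covariance <kbd>R</kbd>, and
the other names are illustrative assumptions, not notation from the papers.
<pre>
import numpy as np

def linear_update_step(m, V, x, h, R, n_samples=10000, seed=0):
    """One linear-update (statistical linearization) filtering step.

    m, V : mean and covariance of the Gaussian belief on the state
    x    : measurement
    h    : measurement function, state vector -> predicted measurement vector
    R    : measurement-noise covariance
    """
    rng = np.random.default_rng(seed)
    # Sample states from the current belief and push them through h.
    s = rng.multivariate_normal(m, V, size=n_samples)
    z = np.array([h(si) for si in s])
    z_mean = z.mean(axis=0)
    # Moments for the regression of the measurement onto the state;
    # note that the actual measurement x is not used here.
    P_sz = (s - m).T @ (z - z_mean) / n_samples           # cov(state, measurement)
    P_zz = (z - z_mean).T @ (z - z_mean) / n_samples + R  # var(measurement)
    # Generalized Kalman gain and the usual linear update.
    K = P_sz @ np.linalg.inv(P_zz)
    m_new = m + K @ (x - z_mean)
    V_new = V - K @ P_zz @ K.T
    return m_new, V_new
</pre>
The update of the state is a linear function of <kbd>x</kbd>, which is what
gives the filter its name.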
<h2>Mixture of Gaussian belief</h2>
A natural choice in moving beyond a single Gaussian is to use a mixture of
Gaussians as the belief.  Unfortunately, an algorithm for general dynamic
models has proven elusive.  Instead, existing algorithms assume that the
dynamic model is a mixture of linear-Gaussian models, i.e. it switches
randomly between different linear-Gaussian state/measurement equations.
This can be understood as having discrete state variables in addition to
the continuous ones.
For these models, the true state posterior is a mixture of Gaussians, but a
very big one.  The algorithms focus on reducing the size of this mixture,
in an on-line way.
Most of them generalize beyond Gaussian to any exponential family.
<p>
<dl>
<dt><b>Assumed-density filter</b>, a.k.a. "pseudo-Bayes"
<dd>Same as assumed-density filtering for a single Gaussian, but now the
belief representation is categorical for the discrete state component and
conditional Gaussian for the continuous state component, producing a
mixture of Gaussians as the marginal for the continuous state component.
For each setting of the discrete state component, this algorithm chooses
the Gaussian which is "closest" to the exact state posterior given previous
beliefs.  A drawback of this algorithm is that the size of the mixture is
predetermined by the cardinality of the discrete state component.  However,
it does allow the state/measurement equations, conditional on the discrete
state, to be non-Gaussian.
<ul>
<li><a href="http://www.cs.berkeley.edu/~murphyk/Papers/skf.ps.gz">"Switching Kalman filters"</a>, Kevin Murphy, 1998.
<li>"Bayesian forecasting", P. J. Harrison and C. F. Stevens, J Royal Stat Soc B 38:205--247, 1976.
<li><a href="http://www.cs.ru.nl/~tomh/publications.html">"Expectation propagation for approximate inference in dynamic Bayesian networks"</a>, Tom Heskes and Onno Zoeter, Uncertainty in AI'2002.
</ul>
</dd>
<p>
<dt><b>Gaussian-sum filter</b>
<dd>This is a family of algorithms which propagate the exact posterior for
one step, giving a large Gaussian mixture, and then reduce the mixture as
needed.  Methods for reducing the mixture include:
<dl>
<dt>Drop mixture components with low weight
<dd>
<ul>
<li><a href="http://www.ucl.ac.uk/Stats/research/Resrprts/psfiles/135.zip">"A comparison of sequential learning methods for incomplete data"</a>, R. G. Cowell, A. P. Dawid, &amp; P. Sebastiani, Bayesian Statistics 5, 1996.
<li>Tugnait, Automatica 18:607--615, 1982.
<li>Smith &amp; Winter, IEEE Conf Decision &amp; Control, 1979.
</ul>
<dt>Sample mixture components according to weight
<dd>Used in Rao-Blackwellised particle filters (a small resampling sketch
follows these references):
<ul>
<li><a href="http://citeseer.ist.psu.edu/chen00mixture.html">"Mixture Kalman Filters"</a>, Chen and Liu.
<li><a href="http://www.stats.ox.ac.uk/pub/clifford/Particle_Filters/jj.Abstract.html">"Building Robust Simulation-based Filters for Evolving Data Sets"</a>, J. Carpenter, P. Clifford, &amp; P. Fearnhead.
</ul>
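<p>
The sketch below shows this second reduction strategy: resampling mixture
components in proportion to their weights so that a fixed-size mixture is
carried forward.  The function name, the fixed target size, and the use of
NumPy are illustrative assumptions, not code from the papers above.
<pre>
import numpy as np

def resample_mixture(weights, means, covs, n_keep, seed=None):
    """Reduce a Gaussian mixture by sampling components according to weight.

    weights : (K,) mixture weights
    means   : (K, d) component means
    covs    : (K, d, d) component covariances
    n_keep  : number of components to carry forward
    """
    rng = np.random.default_rng(seed)
    p = np.asarray(weights, dtype=float)
    p = p / p.sum()                                  # normalize the weights
    idx = rng.choice(len(p), size=n_keep, p=p)       # sample component indices
    new_weights = np.full(n_keep, 1.0 / n_keep)      # resampled components share equal weight
    return new_weights, means[idx], covs[idx]
</pre>
Dropping low-weight components instead would keep the <kbd>n_keep</kbd>
largest entries of <kbd>weights</kbd> and renormalize.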
