<dt>Repeatedly merge the most similar pair of mixture components
<dd><ul>
<li>"The two-filter formula for smoothing and an implementation of the Gaussian-sum smoother", G. Kitagawa, Annals of the Institute of Statistical Mathematics 46(4):605-623, 1994
<li><a href="http://www.google.com/search?q=%22Bayesian+Fault+Detection+and+Diagnosis+in+Dynamic+Systems%22">"Bayesian Fault Detection and Diagnosis in Dynamic Systems"</a>, U. Lerner, R. Parr, D. Koller, & G. Biswas
</ul>
<!-- <dt>Merge mixture components by EM<dd>-->
<dt>Minimize divergence between the large and small mixture by nonlinear optimization
<dd><ul>
<li>"Sequential methods for mixture models", M. Stephens, chapter 5 in <a href="http://www.stat.washington.edu/stephens/papers/tabstract.html">"Bayesian Methods for Mixtures of Normal Distributions"</a>, 1997
</ul></dl></dd><p></dl>
<h2>Nonparametric belief</h2>
<dl>
<dt><b>Histogram filter</b>
<dd>Quantize the state space and represent the belief by a histogram. The filtering equations then match a hidden Markov model.
<ul>
<li>"A general filter for measurements with any probability distribution", Y. Rosenberg and M. Werman, CVPR'97, 654--659.
<li>"Non-Gaussian state-space modeling of nonstationary time series", G. Kitagawa, J Am Stat Assoc 82:1032--1063, 1987.
<li>"Recursive Bayesian estimation using piecewise constant approximations", Kramer and Sorenson, Automatica 24(6):789--801, 1988.
</ul></dd><p>
<dt><b>Particle filter</b>
<dd>Represent the state posterior by a large set of samples drawn from the distribution.
The particles are updated on-line to always represent the current state posterior. The most common approach is to pick a particle for the previous state, sample a particle for the current state using the state equation, and weight the particle by the probability of the measurement. Resampling the particles according to their weights then yields a set of unweighted particles.
<ul>
<li><a href="http://citeseer.ist.psu.edu/42189.html">"Sequential Monte Carlo methods for Dynamic Systems"</a>, Liu and Chen, JASA 93, 1998.
<li><a href="http://www-sigproc.eng.cam.ac.uk/smc/">Sequential Monte Carlo methods homepage</a>
<li><a href="http://citeseer.ist.psu.edu/chen00mixture.html">"Mixture Kalman Filters"</a>
</ul></dd><p>
<dt><b>Particle filter with interpolation</b>
<dd>A particle filter where you increase diversity by fitting a density to the particles and resampling from that density.
<ul>
<li>"A hybrid bootstrap filter for target tracking in clutter", N. Gordon, IEEE Trans Aerospace Electronic Systems 33:353-358, 1997.
<li><a href="http://citeseer.ist.psu.edu/504843.html">"A Tutorial on Particle Filters for On-line Non-linear/Non-Gaussian Bayesian Tracking"</a> (page 11)
<li><a href="http://www.google.com/search?q=%22Using+learning+for+approximation+in+stochastic+processes%22">"Using learning for approximation in stochastic processes"</a>
<li><a href="http://citeseer.ist.psu.edu/peshkin02factored.html">"Factored Particles for Scalable Monitoring"</a>
</ul></dd><p>
<dt><b>Particle filter with MCMC steps</b>
<dd>A particle filter where you increase diversity by including MCMC steps.
<ul>
<li>"Following a moving target---Monte Carlo inference for dynamic Bayesian models", W. Gilks and C. Berzuini, J Royal Stat Soc B 63(1):127--146, 2001.
<li><a href="http://citeseer.ist.psu.edu/doucet99particle.html">"Particle filters for state estimation of jump Markov linear systems"</a>
<li>"Sequential importance sampling for nonparametric Bayes models: The next generation", S. N. MacEachern, M. Clyde, J. S.
Liu, Canadian J of Statistics 27(2):251-267, 1999.
</ul></dd><p>
<dt><b>MCMC filtering</b>
<dd>MCMC can be specially designed to provide efficient, bounded-memory filtering, for example via randomized Gibbs sampling.
<ul>
<li><a href="http://citeseer.ist.psu.edu/marthi02decayed.html">"Decayed MCMC Filtering"</a>
</ul></dd><p>
</dl>
<hr>
<h2>Batch filtering/smoothing algorithms</h2>
<dl>
<dt><b>Kalman smoothing</b>
<dd>Used with the Kalman filter, or any filter which linearizes the state equations, e.g. EKF, UF, LUF, ADF.
<ul>
<li><a href="http://www.cs.unc.edu/~welch/kalman/">The Kalman filter</a>
<li><a href="http://vismod.media.mit.edu/tech-reports/TR-531-ABSTRACT.html">"From Hidden Markov Models to Linear Dynamical Systems"</a>, T. Minka, 1998
</ul></dd><p>
<dt><b>Expectation Propagation</b>
<dd>Provides batch filtering and smoothing. Can be used with any method for linearizing the measurement equations, e.g. EKF, UF, LUF, ADF. Unlike Kalman smoothing, the measurement equations are re-linearized until a globally-stable solution is reached, giving better results.
<ul>
<li><a href="ep/">"Expectation Propagation for approximate Bayesian inference"</a>, T. Minka, Uncertainty in AI'2001.
<li><a href="http://www.cs.ru.nl/~tomh/publications.html">"Expectation propagation for approximate inference in dynamic Bayesian networks"</a>, Tom Heskes and Onno Zoeter, Uncertainty in AI'2002.
</ul></dd><p>
<dt><b>Variational lower bounds</b>
<dd>Provides batch filtering and smoothing. Meshes well with parameter estimation.
<ul>
<li><a href="http://citeseer.ist.psu.edu/beal01variational.html">"The Variational Kalman Smoother"</a>
<li><a href="http://citeseer.ist.psu.edu/543841.html">"Adaptive classification by variational Kalman filtering"</a>
</ul></dd><p>
<dt><b>Two-filter smoothing</b>
<dd>Run filtering forward and independently in reverse, then combine the results. Useful for Gaussian-sum filters.
<ul>
<li>"The two-filter formula for smoothing and an implementation of the Gaussian-sum smoother", G.
Kitagawa, Annals of the Institute of Statistical Mathematics 46(4):605-623, 1994
</ul></dd><p>
<dt><b>Particle smoothing by sampling</b>
<dd>Reweight and resample the particles at time <var>t</var>, based on a sampled particle from time <var>t+1</var>.
<ul>
<li><a href="http://citeseer.ist.psu.edu/436555.html">"Monte Carlo smoothing with application to audio signal enhancement"</a>, W. Fong, S. Godsill, A. Doucet, & M. West, IEEE Trans. on Signal Processing, to appear, 2001.
<li>"Monte Carlo filter and smoother for non-Gaussian nonlinear state space models", G. Kitagawa, J. of Computational and Graphical Statistics 5:1-25, 1996
</ul></dd><p>
<dt><b>Particle smoothing by interpolation</b>
<dd>Reweight and resample the particles at time <var>t</var>, based on a density fitted to the particles at time <var>t+1</var>.
<ul>
<li><a href="http://www.google.com/search?q=%22A+General+Algorithm+for+Approximate+Inference+and+its+Application+to+Hybrid+Bayes+Nets%22">"A General Algorithm for Approximate Inference and its Application to Hybrid Bayes Nets"</a>
<li><a href="http://www.cs.cmu.edu/~thrun/papers/thrun.mchmm.html">"Monte Carlo Hidden Markov Models"</a>
</ul></dd><p>
<dt><b>MCMC</b>
<dd>Gibbs sampling or Metropolis-Hastings.
<ul>
<li><a href="http://citeseer.ist.psu.edu/godsill00monte.html">"Monte Carlo smoothing for non-linear time series"</a>
<li><a href="ftp://ftp.stat.duke.edu/pub/WorkingPapers/95-22.ps">"Bayesian forecasting of multinomial time series through conditionally Gaussian dynamic models"</a>, C. Cargnoni and P. Mueller and M. West, J.
Amer Stat Assoc 92:587--606, 1997.
<li><a href="http://ht.econ.kobe-u.ac.jp/~tanizaki/cv/cv-e.htm">"Bayesian Estimation of State-Space Model Using the Metropolis-Hastings Algorithm within Gibbs Sampling"</a>, Geweke and Tanizaki, Computational Statistics and Data Analysis 37(2):151-170, 2001.
</ul></dd><p>
<dt><b>Markov Random Field algorithms</b>
<dd>Relaxation, etc.</dd><p>
</dl>
<hr>
<!-- hhmts start -->Last modified: Thu Mar 22 14:48:18 GMT 2007<!-- hhmts end -->
</body> </html>
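<p>The histogram filter entry above notes that, once the state space is quantized, the filtering equations match a hidden Markov model. A minimal sketch of that forward predict/update recursion; the 1-D grid, Gaussian random-walk dynamics, and noise levels are illustrative assumptions, not taken from the papers cited:</p>

```python
import numpy as np

def histogram_filter(prior, transition, likelihoods):
    """HMM-style forward filtering on a quantized state space.

    prior: (K,) initial belief histogram over the K grid cells
    transition: (K, K) matrix, transition[j, i] = p(x_t = cell j | x_{t-1} = cell i)
    likelihoods: iterable of (K,) arrays giving p(y_t | x_t = cell k)
    """
    belief = np.asarray(prior, dtype=float)
    belief = belief / belief.sum()
    history = []
    for lik in likelihoods:
        belief = transition @ belief      # predict: sum over previous cells
        belief = belief * lik             # update: weight by measurement likelihood
        belief = belief / belief.sum()    # renormalize the histogram
        history.append(belief.copy())
    return history

# Illustrative (hypothetical) model: 1-D grid, Gaussian random-walk dynamics,
# Gaussian measurement likelihood centered on each observation.
K = 101
bins = np.linspace(-5.0, 5.0, K)
trans = np.exp(-0.5 * (bins[:, None] - bins[None, :]) ** 2 / 0.1)
trans = trans / trans.sum(axis=0, keepdims=True)   # each column is a distribution
observations = [0.5, 0.7, 1.0]
liks = [np.exp(-0.5 * (bins - y) ** 2 / 0.5) for y in observations]
beliefs = histogram_filter(np.ones(K), trans, liks)
```

<p>The cost is quadratic in the number of cells per step, which is why the histogram filter is usually confined to low-dimensional state spaces.</p>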
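<p>The particle-filter recursion described above (sample the next state from the state equation, weight by the measurement likelihood, resample to an unweighted set) is often called the bootstrap or SIR filter. A minimal sketch, assuming a scalar random-walk state model and Gaussian observation noise chosen purely for illustration:</p>

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(observations, n_particles=1000,
                              process_std=0.3, obs_std=0.5):
    """Bootstrap (SIR) filter for the illustrative model
    x_t = x_{t-1} + N(0, process_std^2),  y_t = x_t + N(0, obs_std^2)."""
    particles = rng.normal(0.0, 1.0, n_particles)  # samples from the initial prior
    posterior_means = []
    for y in observations:
        # Sample a particle for the current state using the state equation
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight each particle by the probability of the measurement
        weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
        weights = weights / weights.sum()
        posterior_means.append(float(np.sum(weights * particles)))
        # Resample according to weight to get an unweighted particle set
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return posterior_means

means = bootstrap_particle_filter([0.5, 0.6, 0.55, 0.5])
```

<p>Repeated resampling collapses diversity over time, which is exactly the problem the interpolation and MCMC variants listed above are designed to mitigate.</p>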
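<p>The Kalman smoothing entry above refers to fixed-interval (Rauch-Tung-Striebel) smoothing: a forward Kalman filter pass followed by a backward correction pass that propagates later evidence to earlier states. A minimal scalar sketch with hypothetical model parameters, not tied to any cited paper:</p>

```python
import numpy as np

def kalman_filter_smoother(ys, A=1.0, C=1.0, Q=0.1, R=0.5, m0=0.0, P0=1.0):
    """Scalar Kalman filter followed by an RTS (fixed-interval) smoothing pass
    for x_t = A x_{t-1} + N(0, Q),  y_t = C x_t + N(0, R)."""
    n = len(ys)
    m_f = np.zeros(n); P_f = np.zeros(n)   # filtered means / variances
    m_p = np.zeros(n); P_p = np.zeros(n)   # one-step predicted means / variances
    m, P = m0, P0
    for t, y in enumerate(ys):
        # Predict through the state equation
        m_pred = A * m
        P_pred = A * P * A + Q
        m_p[t], P_p[t] = m_pred, P_pred
        # Update with the measurement
        K = P_pred * C / (C * P_pred * C + R)
        m = m_pred + K * (y - C * m_pred)
        P = (1.0 - K * C) * P_pred
        m_f[t], P_f[t] = m, P
    # Backward (RTS) pass, reusing the filtered and predicted moments
    m_s = m_f.copy(); P_s = P_f.copy()
    for t in range(n - 2, -1, -1):
        G = P_f[t] * A / P_p[t + 1]
        m_s[t] = m_f[t] + G * (m_s[t + 1] - m_p[t + 1])
        P_s[t] = P_f[t] + G * (P_s[t + 1] - P_p[t + 1]) * G
    return m_f, m_s

m_f, m_s = kalman_filter_smoother([1.0, 1.1, 0.9, 1.0])
```

<p>Because the backward pass folds in all later measurements, the smoothed estimate of the first state is pulled toward the observations, unlike the filtered estimate which has seen only one measurement.</p>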