
Kalman Filter tutorial?

Started by February 12, 2010 02:41 PM
1 comment, last by Emergent 14 years, 9 months ago
I'm not sure if this lies in the purview of game development, but I was wondering if anyone knows of any good tutorials on Kalman filters/histogram filters. I'm currently using "Probabilistic Robotics" by Thrun, Burgard, and Fox, and it is almost incomprehensible; I thought my math background was pretty strong, but this book has really put me in my place. Any help you <persons>* can offer would be appreciated.

*English needs gender-neutral pronouns

Edit: Yes, I've checked Wikipedia, but I'm looking for more concrete examples: explanations of how to come up with the various matrices, what the matrices represent, etc.
"Think you Disco Duck, think!" Professor Farnsworth
Unfortunately, I don't have an awesome reference to recommend. What helped me though was to view the Kalman filter as a special case of Bayesian estimation. I might recommend that you figure out Bayesian estimation for finite problems, and then see the connection to the Kalman filter.
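To make "Bayesian estimation for finite problems" concrete, here's a minimal Python sketch (my own, not from any particular reference); the state names and sensor numbers are made up purely for illustration:

def bayes_update(prior, likelihood):
    # Posterior is proportional to prior * likelihood, renormalized
    # over the finite set of states.
    unnormalized = {s: prior[s] * likelihood[s] for s in prior}
    total = sum(unnormalized.values())
    return {s: p / total for s, p in unnormalized.items()}

# Uniform (uninformative) prior over three possible positions.
prior = {"left": 1/3, "middle": 1/3, "right": 1/3}

# Hypothetical sensor model: P(measurement | state) for each state.
likelihood = {"left": 0.1, "middle": 0.7, "right": 0.2}

print(bayes_update(prior, likelihood))  # probability mass shifts toward "middle"

That's really all a histogram filter's measurement update is doing; the prediction step just pushes the same finite distribution through the motion model.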

The other thing (related) that helped me was to realize that the covariance matrices aren't "really" what the Kalman filter is "working on"; they're just a computational representation. What's ACTUALLY going on is that you have a probability density function (pdf) that's evolving in time. It just so happens that this pdf is completely described by its mean and covariance, since it's Gaussian, and it also happens, because of the assumptions we've made (linear systems, Gaussian noise), that the prior stays Gaussian after a Bayes update, so we can just keep track of its mean and covariance to describe it. But fundamentally what's really evolving in time is a probability density function.
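Here's the same idea in the scalar Gaussian case (again my own sketch, with made-up numbers): because everything is linear and Gaussian, the Bayes update of the whole pdf reduces to updating just a mean and a variance:

def gaussian_bayes_update(mu, P, z, R):
    # Prior is N(mu, P); measurement model is z = x + v with v ~ N(0, R).
    # The posterior is Gaussian again, so two numbers describe it fully.
    K = P / (P + R)              # Kalman gain for this scalar case
    mu_post = mu + K * (z - mu)  # posterior mean
    P_post = (1 - K) * P         # posterior variance
    return mu_post, P_post

# Made-up numbers: vague prior, fairly accurate sensor.
print(gaussian_bayes_update(mu=0.0, P=100.0, z=3.2, R=1.0))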
Really simple examples help also, I think.

Like, say you've got the system

x[k+1] = 1 x[k] + 0 w[k]
z[k] = 1 x[k] + v[k]

which is really just a complicated way to say "you have a bunch of samples of a single number (the initial condition, x[0]) corrupted by iid noise." The thing "everybody knows" is that, if you have an uninformative prior, you should just take the average of the values you see. Well, it's easy to compute an average recursively:

x_hat[k+1] = (k x_hat[k] + z[k])/(k+1).

You can put this into the form of the Kalman filter, and see how it's related...
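If it helps, here's a rough Python sketch of that comparison (mine, nothing official): R is an assumed measurement-noise variance, and a huge initial variance P0 stands in for the uninformative prior. The scalar Kalman filter for the system above ends up giving essentially the same answer as the recursive average:

import random

def kalman_constant(zs, R=1.0, P0=1e12):
    # Scalar Kalman filter for x[k+1] = x[k] (no process noise),
    # z[k] = x[k] + v[k].  P0 is a huge initial variance standing in
    # for an uninformative prior.
    x_hat, P = 0.0, P0
    for z in zs:
        # The predict step is trivial here: identity dynamics, no process
        # noise, so x_hat and P carry over unchanged.
        K = P / (P + R)                  # measurement update
        x_hat = x_hat + K * (z - x_hat)
        P = (1 - K) * P
    return x_hat

def recursive_average(zs):
    # The recursion from above: x_hat[k+1] = (k x_hat[k] + z[k])/(k+1)
    x_hat = 0.0
    for k, z in enumerate(zs):
        x_hat = (k * x_hat + z) / (k + 1)
    return x_hat

zs = [5.0 + random.gauss(0, 1) for _ in range(50)]  # noisy samples of x[0] = 5
print(kalman_constant(zs), recursive_average(zs))   # nearly the same number

Working out why the Kalman gain shrinks like 1/k in this case is a nice way to see the connection for yourself.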
