From: Tim Wescott on 26 Apr 2010 22:53

HardySpicer wrote:
> On Apr 27, 4:40 am, Tim Wescott <t...(a)seemywebsite.now> wrote:
>> Cagdas Ozgenc wrote:
>>> Hello,
>>>
>>> In Kalman filtering, does the process noise have to be Gaussian, or
>>> would any uncorrelated, covariance-stationary noise satisfy the
>>> requirements?
>>>
>>> When I follow the derivations of the filter I haven't encountered any
>>> requirement of a Gaussian distribution, but in many sources the
>>> Gaussian tag seems to go along with it.
>>
>> The Kalman filter is only guaranteed to be optimal when:
>>
>> * The modeled system is linear.
>> * Any time-varying behavior of the system is known.
>> * The noise (process and measurement) is Gaussian.
>> * The noise's time-dependent behavior is known
>>   (note that this means the noise doesn't have to be stationary --
>>   just that its time-dependent behavior is known).
>> * The model exactly matches reality.
>>
>> None of these requirements can be met in reality, but the math is at
>> its most tractable when you assume them. Often the Gaussian noise
>> assumption comes the closest to being true -- but not always.
>>
>> If your system matches all of the above assumptions _except_ the
>> Gaussian noise assumption, then the Kalman filter that you design will
>> have the lowest error variance of any possible _linear_ filter, but
>> there may be nonlinear filters with better (perhaps significantly
>> better) performance.
>
> Don't think so. You can design an H-infinity linear Kalman filter,
> which is only a slight modification, and you don't even need to know
> what the covariance matrices are at all.
> H-infinity will give you the minimum of the maximum error.

But strictly speaking the H-infinity filter isn't a Kalman filter. It's
certainly not what Rudi Kalman cooked up. It is a state-space state
estimator, however, and is one of the broader family of "Kalmanesque"
filters.

And the H-infinity filter won't minimize the error variance -- it
minimizes the min-max error, by definition.

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com
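For readers who haven't seen it written out, the discrete-time recursion
Tim is describing is just a predict/update pair. Below is a minimal
Python/NumPy sketch; the model matrices, noise covariances, and the
constant-velocity tracking example are illustrative assumptions, not
taken from anyone's post. Note that the recursion touches the noise only
through its covariances Q and R.

    import numpy as np

    def kalman_step(x, P, y, A, C, Q, R):
        # Predict: propagate the state estimate and covariance through the model.
        x_pred = A @ x
        P_pred = A @ P @ A.T + Q
        # Update: blend the prediction with the new measurement via the Kalman gain.
        S = C @ P_pred @ C.T + R                # innovation covariance
        K = P_pred @ C.T @ np.linalg.inv(S)     # Kalman gain
        x_new = x_pred + K @ (y - C @ x_pred)
        P_new = (np.eye(len(x)) - K @ C) @ P_pred
        return x_new, P_new

    # Illustrative constant-velocity tracker (all numbers assumed, for demo only).
    dt = 0.1
    A = np.array([[1.0, dt],
                  [0.0, 1.0]])     # state transition: position, velocity
    C = np.array([[1.0, 0.0]])     # we measure position only
    Q = 0.01 * np.eye(2)           # assumed process-noise covariance
    R = np.array([[0.5]])          # assumed measurement-noise covariance

    rng = np.random.default_rng(0)
    x_est, P = np.zeros(2), np.eye(2)
    x_true = np.zeros(2)
    for k in range(100):
        x_true = A @ x_true + rng.multivariate_normal(np.zeros(2), Q)
        y = C @ x_true + rng.multivariate_normal(np.zeros(1), R)
        x_est, P = kalman_step(x_est, P, y, A, C, Q, R)

    print("final estimate:", x_est, " truth:", x_true)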
From: Tim Wescott on 26 Apr 2010 22:54

Peter K. wrote:
> On 26 Apr, 21:52, HardySpicer <gyansor...(a)gmail.com> wrote:
>
>> Don't think so. You can design an H-infinity linear Kalman filter,
>> which is only a slight modification, and you don't even need to know
>> what the covariance matrices are at all.
>> H-infinity will give you the minimum of the maximum error.
>
> As Tim says, the Kalman filter is the optimal linear filter for
> minimizing the average estimation error. Reformulations using
> H-infinity techniques do not give an optimal linear filter in this
> sense.
>
> As you say, though, H-nifty (sic) gives the optimum in terms of
> minimizing the worst-case estimation error... which may or may not
> give "better" results than the Kalman approach.
>
> Depending on the application, neither "optimal" approach may give
> exactly what the user is after... their idea of "optimal" may be
> different from what the mathematical formulations give.

Indeed, the first step in applying someone's "optimal" formulation is
deciding if their "optimal" comes within the bounds of your "good
enough".

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com
From: Cagdas Ozgenc on 27 Apr 2010 06:34

On Apr 27, 6:54 am, Tim Wescott <t...(a)seemywebsite.now> wrote:
> Peter K. wrote:
> > On 26 Apr, 21:52, HardySpicer <gyansor...(a)gmail.com> wrote:
>
> >> Don't think so. You can design an H-infinity linear Kalman filter,
> >> which is only a slight modification, and you don't even need to know
> >> what the covariance matrices are at all.
> >> H-infinity will give you the minimum of the maximum error.
>
> > As Tim says, the Kalman filter is the optimal linear filter for
> > minimizing the average estimation error. Reformulations using
> > H-infinity techniques do not give an optimal linear filter in this
> > sense.
>
> > As you say, though, H-nifty (sic) gives the optimum in terms of
> > minimizing the worst-case estimation error... which may or may not
> > give "better" results than the Kalman approach.
>
> > Depending on the application, neither "optimal" approach may give
> > exactly what the user is after... their idea of "optimal" may be
> > different from what the mathematical formulations give.
>
> Indeed, the first step in applying someone's "optimal" formulation is
> deciding if their "optimal" comes within the bounds of your "good
> enough".
>
> --
> Tim Wescott
> Control system and signal processing consulting
> www.wescottdesign.com

Bottom line: without the Gaussian assumption, only the claim of overall
optimality fails to hold. The filter is still the best linear estimator,
just not necessarily the best estimator overall. Right?
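That is the usual reading: the Kalman recursion only ever uses the
second moments Q and R, so dropping the Gaussian assumption leaves the
same gains and the minimum-variance-among-linear-filters property;
Gaussianity is what promotes "best linear" to "best overall". A rough,
self-contained sketch of the first half of that statement follows,
driving one and the same scalar filter with Gaussian and with
variance-matched Laplace measurement noise (all numbers are assumed, for
illustration only):

    import numpy as np

    # Scalar random-walk model; values assumed for illustration.
    A, C = 1.0, 1.0
    Q, R = 0.01, 0.25          # process / measurement noise variances

    def run_filter(draw_noise, steps=5000, seed=1):
        # Same Kalman recursion regardless of the noise distribution:
        # the gain K only ever sees the variances Q and R.
        rng = np.random.default_rng(seed)
        x_true, x_est, P = 0.0, 0.0, 1.0
        sq_err = 0.0
        for _ in range(steps):
            x_true = A * x_true + rng.normal(0.0, np.sqrt(Q))
            y = C * x_true + draw_noise(rng)
            x_pred = A * x_est
            P_pred = A * P * A + Q
            K = P_pred * C / (C * P_pred * C + R)
            x_est = x_pred + K * (y - C * x_pred)
            P = (1.0 - K * C) * P_pred
            sq_err += (x_est - x_true) ** 2
        return np.sqrt(sq_err / steps)

    gaussian = lambda rng: rng.normal(0.0, np.sqrt(R))
    # Laplace noise scaled to the same variance R (variance = 2*b**2, so b = sqrt(R/2)).
    laplace = lambda rng: rng.laplace(0.0, np.sqrt(R / 2.0))

    print("RMS error, Gaussian measurement noise:", run_filter(gaussian))
    print("RMS error, Laplace  measurement noise:", run_filter(laplace))

The two RMS errors come out comparable because the filter uses only
second-moment information. Under the Laplace noise it is still the best
linear filter, but a nonlinear estimator could in principle do better,
which is exactly the distinction Tim drew above.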
From: Frnak McKenney on 27 Apr 2010 08:04

On Mon, 26 Apr 2010 19:54:35 -0700, Tim Wescott <tim(a)seemywebsite.now> wrote:
> Indeed, the first step in applying someone's "optimal" formulation is
> deciding if their "optimal" comes within the bounds of your "good
> enough".

Tim,

A phrase that deserves repeating in a variety of contexts. Mind if I
steal it?

Frank "TaglinesRUs" McKenney
--
 It does not do to leave a live dragon out of your calculations.
                                            -- J. R. R. Tolkien
--
Frank McKenney, McKenney Associates
Richmond, Virginia / (804) 320-4887
Munged E-mail: frank uscore mckenney ayut mined spring dawt cahm (y'all)
From: Rune Allnor on 27 Apr 2010 08:29
On 27 apr, 04:54, Tim Wescott <t...(a)seemywebsite.now> wrote:
> Indeed, the first step in applying someone's "optimal" formulation is
> deciding if their "optimal" comes within the bounds of your "good
> enough".

Ehh... I would rate that as the *second* step. The first item on my
list would be to find out in what sense an 'optimal' filter is optimal:

- Error magnitude?
- Operational robustness?
- Computational efficiency?
- Ease of implementation?
- Economy?
- Balancing all of the above?

Rune