From: Tharindu Patikirikorala on 10 May 2010 22:13

I am using the arx command to model a MIMO system. I also wrote my own least-squares regression (LSR) algorithm to identify the model. Let's say Ua and Ub are the inputs. These two inputs are correlated: Ua = 20 - Ub, where Ub = {4,5,6,7,...,16}. So the input signal generated for MIMO identification has correlated inputs, which means the regression (covariance) matrix in LSR will be singular. The code I wrote fails to construct a model because of this. However, the arx command does come up with a model, with good prediction accuracy. What could be the reason for this?

Also, if I set na = 0 and nb >= 1, does that mean I am getting an FIR (finite impulse response) model?
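The singularity described above can be reproduced with a small NumPy sketch (the output y and the static regressor are hypothetical, chosen only to illustrate the rank deficiency; they are not the poster's actual model):

```python
import numpy as np

# Reproduce the setup from the post: Ub = 4..16 and Ua = 20 - Ub,
# so the two inputs are perfectly correlated.
Ub = np.arange(4.0, 17.0)
Ua = 20.0 - Ub

# Simplest static regressor (current input values plus an offset column).
Phi = np.column_stack([Ua, Ub, np.ones_like(Ub)])
y = 0.5 * Ua + 0.3 * Ub            # hypothetical output, for illustration only

# Because Ua + Ub - 20*1 = 0, the normal-equation matrix Phi'Phi is singular.
G = Phi.T @ Phi
print(np.linalg.matrix_rank(G))    # 2, not 3: rank-deficient

# A naive solve of the normal equations fails (or is numerically meaningless):
try:
    np.linalg.solve(G, Phi.T @ y)
except np.linalg.LinAlgError:
    print("normal equations: singular matrix")

# An SVD-based least-squares solve still returns the minimum-norm solution,
# which is one way an estimator can cope with rank-deficient regressors.
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(np.allclose(Phi @ theta, y)) # the fit is exact; theta just isn't unique
```

This illustrates why an implementation that inverts Phi'Phi directly can fail on correlated inputs while a pseudoinverse-based solver still produces a model with good prediction accuracy.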