Conditioning and robustness of subspace estimates

Subspace identification is a powerful tool for building linear models from measured data. Although subspace algorithms are considerably more robust than classical prediction error methods, which tend to get stuck in local minima, they are not free of conditioning problems.
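To fix the setting, the following sketch illustrates the algebraic core that subspace methods build on (a generic Ho-Kalman-style illustration with a hypothetical second-order system, not the specific algorithm discussed here): the block Hankel matrix of a linear system's impulse response has rank equal to the system order, which an SVD recovers.

```python
import numpy as np

# Hypothetical stable second-order system (eigenvalues inside the unit circle)
A = np.array([[1.5, -0.7], [1.0, 0.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[1.0, 0.0]])

# Markov parameters h[k] = C A^k B, k = 0..N-1
N = 20
h = [(C @ np.linalg.matrix_power(A, k) @ B).item() for k in range(N)]

# Block Hankel matrix built from the impulse response
rows, cols = 5, N - 5
H = np.array([[h[i + j] for j in range(cols)] for i in range(rows)])

# Its numerical rank equals the system order (2 here)
s = np.linalg.svd(H, compute_uv=False)
order = int(np.sum(s > 1e-8 * s[0]))
print(order)  # → 2
```

In practice the Hankel matrices are filled with noisy input/output data rather than exact Markov parameters, which is where the conditioning issues below enter.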

One such problem arises when certain row spaces of the Hankel matrices used in subspace identification are nearly parallel: any implementation that relies on oblique projections may then produce very poor estimates. We are trying to obtain theoretical insight into how severe this degradation can actually become.
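The mechanism can be made concrete with principal angles (an illustrative sketch with synthetic matrices, not taken from the analysis above): as the smallest principal angle between two row spaces shrinks, the factor 1/sin(angle) that bounds the norm of the oblique projector blows up.

```python
import numpy as np

def smallest_angle(X, Y):
    """Smallest principal angle (radians) between the row spaces of X and Y."""
    Qx = np.linalg.qr(X.T)[0]          # orthonormal basis of row space of X
    Qy = np.linalg.qr(Y.T)[0]          # orthonormal basis of row space of Y
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return np.arccos(min(s.max(), 1.0))  # clip to guard against rounding

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 50))
angles = []
for eps in (1e-1, 1e-3, 1e-6):
    # Y is a small perturbation of X: the row spaces are nearly parallel
    Y = X + eps * rng.standard_normal((3, 50))
    theta = smallest_angle(X, Y)
    angles.append(theta)
    # 1/sin(theta) grows without bound as the spaces align -- exactly the
    # regime in which oblique-projection-based estimates become unreliable
    print(f"eps={eps:g}  angle={theta:.2e}  1/sin={1 / np.sin(theta):.2e}")
```

The printed amplification factor grows as eps shrinks, mirroring how the estimate quality degrades when the relevant row spaces align.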

An even bigger problem is the so-called positive-realness problem: subspace algorithms cannot guarantee that a positive-real, i.e. passive, model is obtained, even if the original system was passive. In the same category, subspace algorithms cannot guarantee stable models, even if the original system is known to be stable. A paper suggesting a solution to the positive-realness problem can be found here.
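The stability half of this issue is easy to demonstrate (a generic illustration with a made-up estimate, not the method of the cited paper): an identified state matrix can leave the unit circle, and a crude ad hoc repair is to rescale it back inside.

```python
import numpy as np

def spectral_radius(A):
    """Largest eigenvalue magnitude; < 1 means the discrete-time model is stable."""
    return max(abs(np.linalg.eigvals(A)))

# Hypothetical state matrix returned by an identification run
A_est = np.array([[1.02, 0.3],
                  [0.0, 0.8]])

rho = spectral_radius(A_est)
print(f"spectral radius = {rho:.3f}")  # > 1: the estimated model is unstable

# Naive fix for illustration only: shrink all eigenvalues uniformly.
# Principled approaches instead modify the estimation step itself.
A_stable = A_est * (0.99 / rho) if rho >= 1.0 else A_est
print(f"after rescaling = {spectral_radius(A_stable):.3f}")
```

Enforcing positive-realness is harder than this, since passivity constrains the whole (A, B, C, D) quadruple rather than the eigenvalues of A alone.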

All in all, one would like a good estimate of the variance of the model parameters once the subspace results are obtained. We are working on practical algorithms to compute such estimates by differentiating, as matrix functions, all the QR and SVD steps of a subspace algorithm, yielding a first-order perturbation analysis.
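A small numerical sanity check shows why a first-order analysis of the SVD step is well posed (a generic illustration using Weyl's inequality, not the group's algorithm): singular values move by at most the spectral norm of the data perturbation, so they are perfectly conditioned in absolute terms.

```python
import numpy as np

rng = np.random.default_rng(2)
H = rng.standard_normal((8, 6))          # stand-in for a data matrix
E = 1e-6 * rng.standard_normal((8, 6))   # small perturbation (noise)

s0 = np.linalg.svd(H, compute_uv=False)
s1 = np.linalg.svd(H + E, compute_uv=False)

# Weyl's inequality: |sigma_i(H + E) - sigma_i(H)| <= ||E||_2 for every i
shift = np.max(np.abs(s1 - s0))
bound = np.linalg.norm(E, 2)
print(shift <= bound)  # → True
```

Singular *vectors* are another matter: their sensitivity scales with the inverse gap between neighbouring singular values, which is one reason a full perturbation analysis of all QR and SVD steps is needed rather than a bound on singular values alone.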


Researcher(s):  Ivan Goethals