This paper derives fixed‐order recursive Least‐Squares (LS) algorithms that can be used in system identification and adaptive filtering applications such as spectral estimation, and speech analysis and synthesis. These algorithms solve the sliding‐window and growing‐memory covariance LS estimation problems, and require less computation than both unnormalized and normalized versions of the computationally efficient order‐recursive (lattice) covariance algorithms previously presented. The geometric or Hilbert space approach, originally introduced by Lee and Morf to solve the prewindowed LS problem, is used to systematically generate least‐squares recursions. We show that combining subsets of these recursions results in prewindowed LS lattice and fixed‐order (transversal) algorithms, and in sliding‐window and growing‐memory covariance lattice and transversal algorithms. The paper discusses both least‐squares prediction and joint‐process estimation.
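The sliding-window covariance LS problem the abstract refers to can be illustrated with a small sketch. This is not the paper's fast fixed-order recursion; it is a plain windowed least-squares update (an assumption for illustration) in which the normal equations are updated by adding the newest regressor, downdating the oldest, and re-solving. The function name `sliding_window_ls` and all parameters are hypothetical.

```python
import numpy as np

def sliding_window_ls(x, d, order, window):
    """Illustrative sliding-window least squares (NOT the paper's fast
    transversal algorithm): maintain the windowed normal equations by
    rank-one update/downdate, then solve directly at each step."""
    N = len(x)
    R = np.zeros((order, order))   # windowed autocorrelation matrix
    p = np.zeros(order)            # windowed cross-correlation vector
    pairs = []                     # (regressor, desired) pairs in the window
    w_hist = []                    # filter estimate at each full-window step
    for n in range(order - 1, N):
        u = x[n - order + 1:n + 1][::-1]   # regressor, newest sample first
        R += np.outer(u, u)                # update with newest pair
        p += d[n] * u
        pairs.append((u, d[n]))
        if len(pairs) > window:            # downdate: drop oldest pair
            u_old, d_old = pairs.pop(0)
            R -= np.outer(u_old, u_old)
            p -= d_old * u_old
        if len(pairs) == window:
            w_hist.append(np.linalg.solve(R, p))
    return np.array(w_hist)
```

On noiseless data generated by a known FIR filter, each windowed solve recovers the filter taps exactly (up to rounding), which is the behavior the fixed-order recursions in the paper obtain with far less arithmetic per step.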
Original language: English (US)
Number of pages: 32
Journal: Bell System Technical Journal
State: Published - Dec 1983