ELG5377 Adaptive Signal Processing Lecture 15: Recursive Least Squares (RLS) Algorithm
Introduction • The method of least squares (MLS) states that the optimum tap-weight vector satisfies the normal equations \Phi(n)\hat{w}(n) = z(n), where \Phi(n) is the time-averaged autocorrelation matrix of the input and z(n) is the time-averaged cross-correlation between the input and the desired response. • We would like to compute \Phi(n) and z(n) recursively. • To account for any time variance, we also incorporate a "forgetting" factor so that more weight is given to current inputs than to previous ones. • To do this, we modify the cost function to be minimized.
Cost Function • The exponentially weighted cost function is \mathcal{E}(n) = \sum_{i=1}^{n} \lambda^{n-i} |e(i)|^2, where 0 < \lambda \le 1 is the forgetting factor and e(i) = d(i) - \hat{w}^H(n)u(i). • We can show that the minimizing tap-weight vector satisfies \Phi(n)\hat{w}(n) = z(n), where \Phi(n) = \sum_{i=1}^{n} \lambda^{n-i} u(i)u^H(i) and z(n) = \sum_{i=1}^{n} \lambda^{n-i} u(i)d^*(i).
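One intermediate step, left implicit on the slide, connects the cost function to the normal equations: setting the complex gradient of \mathcal{E}(n) with respect to the tap-weight vector to zero gives

\sum_{i=1}^{n} \lambda^{n-i}\, u(i)\left[d^*(i) - u^H(i)\hat{w}(n)\right] = 0,

which rearranges directly to \Phi(n)\hat{w}(n) = z(n) with \Phi(n) and z(n) as defined above.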
Reformulation of Normal Equations • From the previous slide, we can reformulate the time-averaged autocorrelation matrix as \Phi(n) = \lambda\Phi(n-1) + u(n)u^H(n). • And the time-averaged cross-correlation becomes z(n) = \lambda z(n-1) + u(n)d^*(n). • Derivation done on blackboard.
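As a quick numerical sanity check (not from the slides; a minimal NumPy sketch, where the values of M, n, and \lambda and the random data are illustrative assumptions), the recursive updates reproduce the direct exponentially weighted sums:

```python
import numpy as np

rng = np.random.default_rng(0)
M, n, lam = 4, 50, 0.98          # filter order, sample count, forgetting factor (illustrative)

U = rng.standard_normal((n, M))  # row i holds the input vector u(i)
d = rng.standard_normal(n)       # desired response d(i)

# Recursive updates: Phi(n) = lam*Phi(n-1) + u(n)u^H(n),  z(n) = lam*z(n-1) + u(n)d*(n)
Phi = np.zeros((M, M))
z = np.zeros(M)
for i in range(n):
    Phi = lam * Phi + np.outer(U[i], U[i])
    z = lam * z + U[i] * d[i]

# Direct exponentially weighted sums for comparison
w_exp = lam ** np.arange(n - 1, -1, -1)
Phi_direct = (U * w_exp[:, None]).T @ U
z_direct = (U * w_exp[:, None]).T @ d

print(np.allclose(Phi, Phi_direct), np.allclose(z, z_direct))  # True True
```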
Result • By simply updating \Phi(n) and z(n), we can compute \hat{w}(n) = \Phi^{-1}(n)z(n). • However, this needs a matrix inversion at each iteration. • Higher computational complexity (O(M^3) per update). • Update \Phi^{-1}(n) at each iteration instead!
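For contrast, a minimal sketch of the approach this slide warns about, re-solving the normal equations from scratch at every iteration; the regularized initialization \Phi(0) = \delta I, the toy data, and all parameter values are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
M, n, lam, delta = 4, 200, 0.98, 1e-2   # illustrative values; delta regularizes early iterations

w_true = rng.standard_normal(M)
U = rng.standard_normal((n, M))
d = U @ w_true + 0.01 * rng.standard_normal(n)

Phi = delta * np.eye(M)                 # assumed initialization Phi(0) = delta * I
z = np.zeros(M)
for i in range(n):
    Phi = lam * Phi + np.outer(U[i], U[i])
    z = lam * z + U[i] * d[i]
    w_hat = np.linalg.solve(Phi, z)     # O(M^3) solve at every iteration -- the cost RLS avoids

print(np.round(w_hat - w_true, 3))      # near zero
```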
Matrix Inversion Lemma • Let A and B be two positive definite M-by-M matrices related by: • A = B^{-1} + C D^{-1} C^H, • where D is a positive definite N-by-N matrix and C is an M-by-N matrix. • Then A^{-1} is given by: • A^{-1} = B - BC(D + C^H B C)^{-1} C^H B.
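The lemma is easy to verify numerically. A minimal sketch (the matrix sizes and random positive definite construction are arbitrary choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 5, 3

# Random positive definite B (M x M) and D (N x N), arbitrary C (M x N)
X = rng.standard_normal((M, M)); B = X @ X.T + M * np.eye(M)
Y = rng.standard_normal((N, N)); D = Y @ Y.T + N * np.eye(N)
C = rng.standard_normal((M, N))

# A = B^{-1} + C D^{-1} C^H, and the lemma's expression for A^{-1}
A = np.linalg.inv(B) + C @ np.linalg.inv(D) @ C.T
A_inv_lemma = B - B @ C @ np.linalg.inv(D + C.T @ B @ C) @ C.T @ B

print(np.allclose(np.linalg.inv(A), A_inv_lemma))  # True
```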
Applying the Matrix Inversion Lemma to Finding \Phi^{-1}(n) from \Phi^{-1}(n-1) • Since \Phi(n) = \lambda\Phi(n-1) + u(n)u^H(n), we identify A = \Phi(n), B^{-1} = \lambda\Phi(n-1), C = u(n), and D = 1. • Applying the lemma gives \Phi^{-1}(n) = \lambda^{-1}\Phi^{-1}(n-1) - \frac{\lambda^{-2}\Phi^{-1}(n-1)u(n)u^H(n)\Phi^{-1}(n-1)}{1 + \lambda^{-1}u^H(n)\Phi^{-1}(n-1)u(n)}.
Applying the Matrix Inversion Lemma to Finding \Phi^{-1}(n) from \Phi^{-1}(n-1) (2) • For convenience, let P(n) = \Phi^{-1}(n) and k(n) = \frac{\lambda^{-1}P(n-1)u(n)}{1 + \lambda^{-1}u^H(n)P(n-1)u(n)}. • The update then becomes P(n) = \lambda^{-1}P(n-1) - \lambda^{-1}k(n)u^H(n)P(n-1).
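Putting the pieces together, a minimal sketch of the resulting recursion, computing \hat{w}(n) = P(n)z(n) with no explicit matrix inversion inside the loop; the initialization P(0) = \delta^{-1}I, the toy data, and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
M, n, lam, delta = 4, 500, 0.98, 1e-2   # illustrative values

w_true = rng.standard_normal(M)
U = rng.standard_normal((n, M))
d = U @ w_true + 0.01 * rng.standard_normal(n)

P = np.eye(M) / delta                   # P(n) = Phi^{-1}(n); assumed init P(0) = delta^{-1} I
z = np.zeros(M)
for i in range(n):
    u = U[i]
    Pu = P @ u
    # Gain vector k(n) = lam^{-1} P(n-1) u(n) / (1 + lam^{-1} u^H(n) P(n-1) u(n)),
    # written with numerator and denominator multiplied through by lam
    k = Pu / (lam + u @ Pu)
    # P(n) = lam^{-1} P(n-1) - lam^{-1} k(n) u^H(n) P(n-1); u^H P = (P u)^H since P is Hermitian
    P = (P - np.outer(k, Pu)) / lam
    # Cross-correlation update from the earlier slide, then w_hat(n) = P(n) z(n)
    z = lam * z + u * d[i]
    w_hat = P @ z

print(np.round(w_hat - w_true, 3))      # near zero
```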