Presentation Transcript


  1. A Recursive Method for the Solution of the Linear Least Squares Formulation: Algorithm and Performance in Connection with the PLX Instruction Set - Claus Benjaminsen and Shyam Bharat, ECE 734 Project Presentation

  2. Aim • Linear Least Squares Estimation (LLSE) for Digital Signal Processing (DSP) applications • Recursive updating of the weight vector in time, based on its value at the previous time instant

  3. Background • Least Squares Estimation (LSE) – minimization of squared error between observed data sequence and assumed signal model • Linear LSE (LLSE) – signal model is linear function of parameter to be estimated • Computation of inverse of autocorrelation matrix of assumed signal model (P × P) • For DSP applications, P → ∞ !

  4. Motivation • Practically impossible to compute the inverse of a matrix with infinitely many entries! • Need for an alternate method that avoids matrix inverse computations • Answer: Recursive Linear Least Squares!

  5. Preliminaries The function to be minimized with respect to the parameter w(n) is
     J(w(n)) = Σ_{k=1}^{n} λ^(n-k) |e_n(k)|^2,  0 < λ < 1,
     where e_n(k) = d(k) – w^H(n)u(k).
     In matrix form, J(w(n)) = [d* – u^H w(n)]^H Λ [d* – u^H w(n)],
     where d = [d(1), …, d(n)]^T, u = [u(1), …, u(n)], and Λ = diag{λ^(n-1), λ^(n-2), …, λ, 1}.
     Solution to this minimization: w(n) = (uΛu^H)^(-1) uΛd*
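
The closed-form solution above can be sanity-checked numerically. Below is a minimal sketch (not part of the original project) that solves the weighted normal equations directly with NumPy for real-valued data; the filter length M, sample count n, forgetting factor lam, and the synthetic signal model are arbitrary choices for illustration.

```python
# Minimal sketch of the direct weighted LLS solution
# w(n) = (u Λ u^H)^(-1) u Λ d* (real-valued here, so d* = d).
# M, n, lam and the synthetic signal model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
M, n, lam = 4, 200, 0.99
w_true = rng.standard_normal(M)                    # hypothetical "true" weights

U = rng.standard_normal((M, n))                    # columns are the inputs u(k)
d = w_true @ U + 0.01 * rng.standard_normal(n)     # desired responses d(k)

Lam = np.diag(lam ** np.arange(n - 1, -1, -1))     # Λ = diag{λ^(n-1), ..., λ, 1}

# Solve (u Λ u^T) w = u Λ d instead of forming the inverse explicitly
w_hat = np.linalg.solve(U @ Lam @ U.T, U @ Lam @ d)
print(np.round(w_hat - w_true, 3))                 # should be close to zero
```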

  6. Alternate Representation w(n) = Φ^(-1)(n)θ(n), where
     Φ(n) = Σ_{k=1}^{n} λ^(n-k) u(k)u^H(k) and θ(n) = Σ_{k=1}^{n} λ^(n-k) u(k)d*(k).
     Recursively,
     Φ(n) = λΦ(n-1) + u(n)u^H(n)
     θ(n) = λθ(n-1) + u(n)d*(n)
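
As a quick illustration of this equivalence, the following sketch (my own, real-valued case, with arbitrary dimensions and data) accumulates Φ(n) and θ(n) with the recursions and compares them against the exponentially weighted sums they represent.

```python
# Check that the recursions Φ(n) = λΦ(n-1) + u(n)u^T(n) and
# θ(n) = λθ(n-1) + u(n)d(n) reproduce the weighted sums (real-valued case).
import numpy as np

rng = np.random.default_rng(1)
M, n, lam = 3, 50, 0.95
U = rng.standard_normal((M, n))            # u(1), ..., u(n) as columns
d = rng.standard_normal(n)                 # d(1), ..., d(n)

Phi, theta = np.zeros((M, M)), np.zeros(M)
for k in range(n):                         # recursive accumulation
    Phi = lam * Phi + np.outer(U[:, k], U[:, k])
    theta = lam * theta + U[:, k] * d[k]

weights = lam ** np.arange(n - 1, -1, -1)  # λ^(n-k) for k = 1..n
Phi_direct = (U * weights) @ U.T           # Σ λ^(n-k) u(k)u^T(k)
theta_direct = (U * weights) @ d           # Σ λ^(n-k) u(k)d(k)

print(np.allclose(Phi, Phi_direct), np.allclose(theta, theta_direct))  # True True
```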

  7. Recursive LLSE Update the weight vector:
     w(n) = w(n-1) + k(n)[d*(n) – u^H(n)w(n-1)],
     where the gain vector is k(n) = λ^(-1)P(n-1)u(n) / [1 + λ^(-1)u^H(n)P(n-1)u(n)]
     and P(n) = Φ^(-1)(n), with P(n) of dimensions M × M.
     The complete algorithm is given on the next slide.
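
The claim that the recursively updated P(n) equals Φ^(-1)(n) can be checked with a short simulation. This is an illustrative sketch, not project code; the regularized start Φ(0) = δI chosen to match P(0) = δ^(-1)I, and all parameter values, are my assumptions.

```python
# Verify that the recursive update of P(n) = Φ^(-1)(n) stays consistent
# with inverting Φ(n) directly (real-valued data).
import numpy as np

rng = np.random.default_rng(4)
M, n_steps, lam, delta = 3, 100, 0.98, 1e-4

P = (1.0 / delta) * np.eye(M)              # P(0) = δ^(-1) I
Phi = delta * np.eye(M)                    # matching regularized Φ(0) = δ I

for n in range(n_steps):
    u = rng.standard_normal(M)
    x = (P @ u) / lam
    k = x / (1.0 + u @ x)
    P = P / lam - np.outer(k, x)           # recursive inverse update
    Phi = lam * Phi + np.outer(u, u)       # direct accumulation of Φ(n)

print(np.allclose(P, np.linalg.inv(Phi)))  # True: P(n) tracks Φ^(-1)(n)
```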

  8. Recursive LLSE Algorithm Initialization: P(0) = δ^(-1)I, where δ is a small positive constant, and w(0) = 0.
     For n = 1, 2, 3, …
     x(n) = λ^(-1)P(n-1)u(n)
     k(n) = [1 + u^H(n)x(n)]^(-1) x(n)
     α(n) = d(n) – w^H(n-1)u(n)
     w(n) = w(n-1) + k(n)α*(n)
     P(n) = λ^(-1)P(n-1) – k(n)x^H(n)
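
A runnable sketch of this loop (under the real-valued, scalar-d simplifications of the next slide) is given below; the test signal, δ, λ, and filter length are assumptions chosen only to show convergence, not values from the project.

```python
# Recursive LLSE loop, real-valued data with scalar d(n).
# w_true, delta, lam and the input model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
M, N, lam, delta = 4, 500, 0.99, 1e-3
w_true = rng.standard_normal(M)            # weights the filter should learn

P = (1.0 / delta) * np.eye(M)              # P(0) = δ^(-1) I
w = np.zeros(M)                            # w(0) = 0

for n in range(1, N + 1):
    u = rng.standard_normal(M)                      # input vector u(n)
    d = w_true @ u + 0.01 * rng.standard_normal()   # desired response d(n)

    x = (P @ u) / lam                      # x(n) = λ^(-1) P(n-1) u(n)
    k = x / (1.0 + u @ x)                  # k(n) = [1 + u^T(n) x(n)]^(-1) x(n)
    alpha = d - w @ u                      # α(n) = d(n) - w^T(n-1) u(n)
    w = w + k * alpha                      # w(n) = w(n-1) + k(n) α(n)
    P = P / lam - np.outer(k, x)           # P(n) = λ^(-1) P(n-1) - k(n) x^T(n)

print(np.round(w - w_true, 3))             # estimation error ≈ 0 after N updates
```

With these settings the estimate converges to the hypothetical true weights within a few hundred updates, and no matrix inverse is ever formed explicitly.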

  9. Dimensions • Simplifications: • Real values for all quantities • d is a scalar

  10. Data Flow Graph

  11. Analysis of the DFG The DFG has 6 loops:
     • A – B – A: t1 = tm + ta
     • A – C – D – E – B – A: t2 = tm + tm + tt + tm + ta = 3tm + ta + tt
     • A – C – J – E – B – A: t3 = tm + tm + tm + tm + ta = 4tm + ta
     • A – C – G – H – I – J – E – B – A: t4 = tm + tm + tm + ta + td + tm + tm + ta = 5tm + 2ta + td
     • K – L – M – O – N – K: t5 = tm + ta + tm + ta + tt = 2tm + 2ta + tt
     • O – O: t6 = ta

  12. Iteration Bound (T) • Definition: T is the maximum, over all loops, of the total loop computation time divided by the number of delays in that loop • Since each loop in this DFG contains a single delay, T = max{t1, t2, t3, t4, t5, t6} = t4 = 5tm + 2ta + td
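
The bound can be reproduced with a few lines of arithmetic. In the sketch below the operation times tm, ta, tt, td are placeholder values of my own choosing, and one delay per loop is assumed, as the slide's T = max{t1, …, t6} implies.

```python
# Iteration-bound computation over the six DFG loops of the previous slide.
# tm, ta, tt, td are assumed placeholder operation times; one delay per loop.
tm, ta, tt, td = 2.0, 1.0, 1.0, 3.0

loop_times = {
    "t1": tm + ta,
    "t2": 3 * tm + ta + tt,
    "t3": 4 * tm + ta,
    "t4": 5 * tm + 2 * ta + td,
    "t5": 2 * tm + 2 * ta + tt,
    "t6": ta,
}
loop_delays = {name: 1 for name in loop_times}   # one delay per loop (assumed)

T = max(loop_times[name] / loop_delays[name] for name in loop_times)
print(T)   # with these numbers, loop 4 (5tm + 2ta + td) sets the iteration bound
```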

  13. Dependence Graph

  14. Implementation Analysis • Numerical analysis - finding upper/lower bounds on the various quantities used in the algorithm - generalizing these bounds in terms of variables is a complex task - the recursion leaves some variables unbounded - depending on the available resources, appropriate bounds may be placed on the concerned variables (a small empirical sketch follows below)
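
One way to carry out the empirical part of this analysis is sketched below (my own illustration, not the project's code): run the recursion on a representative input model and record the largest magnitudes that P(n), k(n), and w(n) actually reach, which can then guide fixed-point word-length choices. The input model and all parameters are assumptions.

```python
# Empirical bound-finding: track the extreme magnitudes of P(n), k(n), w(n)
# over a run of the recursion, for an assumed input model.
import numpy as np

rng = np.random.default_rng(3)
M, N, lam, delta = 4, 2000, 0.99, 1e-3
w_true = rng.standard_normal(M)

P, w = (1.0 / delta) * np.eye(M), np.zeros(M)
extremes = {"P": 0.0, "k": 0.0, "w": 0.0}

for n in range(N):
    u = rng.standard_normal(M)
    d = w_true @ u + 0.01 * rng.standard_normal()
    x = (P @ u) / lam
    k = x / (1.0 + u @ x)
    w = w + k * (d - w @ u)
    P = P / lam - np.outer(k, x)
    for name, value in (("P", P), ("k", k), ("w", w)):
        extremes[name] = max(extremes[name], float(np.max(np.abs(value))))

print(extremes)   # observed magnitude bounds for this particular input model
```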
