
### ESIEE, Slide 2: Outline

- Adaptive filters and the LMS algorithm
- Implementation of FIR filters on the C54x
- Implementation of FIR filters on the C55x

Copyright © 2003 Texas Instruments. All rights reserved.

### ESIEE, Slide 7: Stochastic Gradient Algorithm (LMS)

- The mean values E(e(n)·x(n−i)) are not known.
- In the stochastic gradient algorithm, they are replaced by the instantaneous products e(n)·x(n−i).
- The algorithm converges if the adaptation step μ is small enough.
- The algorithm is named LMS (Least Mean Square), or the Widrow algorithm.
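Replacing the unknown expectation by the instantaneous product can be sketched in floating-point C (the tap count `N`, step `MU` and the function name are illustrative choices, not from the slides):

```c
#include <assert.h>
#include <math.h>

#define N  4      /* number of taps (illustrative value) */
#define MU 0.1    /* adaptation step; must be small enough for convergence */

/* One LMS iteration.  x[i] holds x(n-i); r is the reference r(n).
 * The unknown mean E(e(n)*x(n-i)) is replaced by its instantaneous
 * estimate e(n)*x(n-i).  Returns the error e(n). */
double lms_step(double b[N], const double x[N], double r)
{
    double y = 0.0;
    for (int i = 0; i < N; i++)
        y += b[i] * x[i];           /* FIR filtering: y(n) = sum b_i x(n-i) */
    double e = r - y;               /* error e(n) = r(n) - y(n) */
    for (int i = 0; i < N; i++)
        b[i] += MU * e * x[i];      /* stochastic gradient update */
    return e;
}
```

Fed with a reference produced by a fixed 4-tap filter, the coefficients converge to that filter's taps, which is the usual system-identification sanity check.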

### ESIEE, Slide 8: LMS Algorithm

For each sample, the LMS algorithm:

- filters the input using the coefficients b_i;
- updates the b_i coefficients.

### ESIEE, Slide 9: LMS Algorithm at Each Sample Time

- FIR filtering equation: y(n) = Σ_{i=0..N−1} b_i · x(n−i)
- Coefficient updating equation: b_i = b_i + μ · e(n) · x(n−i), with e(n) = r(n) − y(n)

### ESIEE, Slide 11: LMS Steps

Each time iteration (only once):

- Calculate the error: e_n = r_n − y_n.
- Scale the error by the adaptation step μ: e_n = μ·e_n.

Each time iteration, for each coefficient:

- Multiply the error by the signal: e_i = e_n · x_{n−i}.
- Multiply x_{n−i} by b_i and accumulate (filtering).
- Calculate the new coefficient: newb_i = b_i + e_i.
- Update the coefficient: b_i = newb_i.
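The step list above maps onto a per-sample C routine. This sketch assumes, as the combined per-coefficient loop suggests, that each b_i is updated with the error of the *previous* sample while the new output is accumulated (delayed LMS, which is how the C54x kernel is usually organised); `N`, `MU` and the function name are illustrative, and a simple in-place shift stands in for the circular buffer:

```c
#include <assert.h>
#include <math.h>

#define N  4      /* taps (illustrative) */
#define MU 0.1    /* adaptation step (illustrative) */

static double b[N];       /* coefficients b_i */
static double x[N];       /* delay line: x[i] = x(n-i) after the loop */
static double e_scaled;   /* mu * e(n-1), computed once per sample */

/* Process one sample: update each b_i with the previous scaled error
 * (delayed LMS), shift the delay line as we go, and accumulate the
 * new output; then compute and scale the new error once. */
double lms_sample(double x_new, double r)
{
    double y = 0.0;
    for (int i = 0; i < N; i++) {
        b[i] += e_scaled * x[i];   /* e_i = mu*e(n-1)*x(n-1-i); b_i += e_i */
        double old = x[i];         /* shift in place: x[i] becomes x(n-i) */
        x[i] = x_new;
        x_new = old;
        y += b[i] * x[i];          /* multiply x(n-i) by b_i and accumulate */
    }
    e_scaled = MU * (r - y);       /* error e_n = r_n - y_n, scaled once */
    return y;
}
```

The one-sample delay in the update does not prevent convergence for a small enough μ; it only slows it slightly.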

### ESIEE, Slide 12: Implementing the LMS Algorithm on the C54x

- The C54x has an LMS-specific instruction that realizes the filtering and the coefficient updating together:
  B = B + (b_i × x_{n−i}); A = rnd(e_i + b_i)
- Rounding is important because the update term μ·e_n·x_{n−i} may be very small.
- The filter and coefficient updating equations are to be done at each sample time n.
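Why the rnd() matters can be shown in C. In this sketch (the function name is hypothetical, and arithmetic right shift is assumed, as on the C54x), the Q15 coefficient sits in the high word of a 32-bit accumulator, as in A = rnd(e_i + b_i):

```c
#include <assert.h>
#include <stdint.h>

/* Q15 coefficient update in the style of A = rnd(e_i + b_i):
 * the coefficient b is held in the high 16 bits of a 32-bit
 * accumulator, the small update term e_i = mu*e(n)*x(n-i) in the low
 * bits.  Adding 2^15 before keeping the high word rounds to nearest;
 * without it, the shift truncates toward minus infinity and biases
 * the coefficients downward. */
int16_t update_coeff(int16_t b, int32_t e_i, int round)
{
    int32_t acc = ((int32_t)b << 16) + e_i;
    if (round)
        acc += 1 << 15;              /* the 2^15 rounding constant */
    return (int16_t)(acc >> 16);     /* keep the high word */
}
```

With a tiny negative update term (0.4 LSB), truncation drags the coefficient down on every sample while rounding leaves it in place; a positive term of 0.6 LSB is lost by truncation but kept by rounding.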

### ESIEE, Slide 13: The LMS Instruction

With the LMS instruction, the LMS FIR takes 2N cycles for an N-tap filter (2 cycles per tap).

LMS Xmem, Ymem:

- (A) + (Xmem)<<16 + 2^15 → A
- (B) + (Xmem) × (Ymem) → B

Notes:

- Uses both accumulators A and B.
- This instruction does not modify T.
- Xmem points to b_i, Ymem to x_{n−i}.
- The data x are stored in a circular buffer.
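The two parallel operations can be modelled behaviourally in C (a sketch, not cycle-accurate: the 40-bit accumulators are represented as `int64_t`, guard-bit saturation and the fractional-mode <<1 of the product are ignored, and the function name is hypothetical):

```c
#include <assert.h>
#include <stdint.h>

/* Behavioural model of  LMS Xmem, Ymem :
 *   A <- A + (Xmem << 16) + 2^15    coefficient update with rounding
 *   B <- B + Xmem * Ymem            FIR multiply-accumulate
 * Xmem holds the coefficient b_i, Ymem the sample x(n-i), both Q15.
 * The FRCT fractional-mode left shift of the product is omitted. */
void lms_instr(int64_t *A, int64_t *B, int16_t xmem, int16_t ymem)
{
    *A += ((int64_t)xmem << 16) + (1 << 15);
    *B += (int64_t)xmem * (int64_t)ymem;
}
```

After one call, A holds the rounded, left-aligned coefficient ready to be stored back as the updated b_i, while B carries the running output sum.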