
EEC263 – Optimal and Adaptive Filtering

4 units – Spring Quarter; alternate years

Lecture: 3 hours

Discussion: 1 hour

Prerequisite: EEC 260

Grading: Letter; problem sets, MATLAB assignments, and term paper.

Catalog Description:

Geometric formulation of least-squares estimation problems. Theory and applications of optimum Wiener and Kalman filtering. MAP and maximum likelihood estimation of hidden Markov models, Viterbi algorithm. Adaptive filtering algorithms, properties, and applications.

Expanded Course Description:

  1. Geometric Formulation of Linear Least-Squares Estimation
    1. Euclidean space
    2. Least-squares estimation
    3. Hilbert space of random variables
    4. Orthogonality principle of linear least-squares estimates
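A minimal sketch of the orthogonality principle from topic 1, using made-up random data: the least-squares residual is orthogonal to the columns of the data matrix.

```python
import numpy as np

# Illustrative data (not from the course): 20 observations, 3 regressors.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))   # data matrix
y = rng.standard_normal(20)        # observation vector

# Least-squares estimate, i.e. the solution of the normal equations A^T A x = A^T y
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
residual = y - A @ x_hat

# Orthogonality principle: A^T (y - A x_hat) = 0 (up to round-off)
print(np.allclose(A.T @ residual, 0))  # True
```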
  2. Wiener Filtering
    1. FIR Wiener filters
    2. Levinson recursions, lattice filters
    3. Noncausal Wiener filters
    4. Causal Wiener filters: Wiener-Hopf equation, spectral factorization, innovations process
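The FIR Wiener filter of topic 2 can be sketched as the solution of the normal equations R w = p; the autocorrelation matrix R and cross-correlation vector p below are made-up toy values, not course data.

```python
import numpy as np

# Order-2 FIR Wiener filter: solve R w = p, where R is the (Toeplitz)
# autocorrelation matrix of the input and p the cross-correlation with
# the desired signal. Numbers are illustrative assumptions.
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])        # autocorrelation matrix
p = np.array([0.7, 0.3])          # cross-correlation vector

w = np.linalg.solve(R, p)         # optimal FIR Wiener coefficients
print(w)
```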
  3. Kalman Filtering
    1. Gauss-Markov state-variable models
    2. Innovations process, Kalman recursions
    3. Steady-state behavior of Kalman filters
    4. Square-root algorithms
    5. Smoothing formulas
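A scalar sketch of the Kalman recursions from topic 3, for a Gauss-Markov model x_{k+1} = a x_k + w_k observed as y_k = x_k + v_k; all parameter values are illustrative assumptions.

```python
import numpy as np

a, q, r = 0.9, 0.1, 0.5           # state transition, process/measurement noise variances
rng = np.random.default_rng(1)

x, x_hat, P = 0.0, 0.0, 1.0       # true state, estimate, error variance
for _ in range(200):
    # simulate the Gauss-Markov state and its noisy observation
    x = a * x + rng.normal(scale=np.sqrt(q))
    y = x + rng.normal(scale=np.sqrt(r))
    # time update (predict)
    x_hat, P = a * x_hat, a * a * P + q
    # measurement update (correct) with the Kalman gain
    K = P / (P + r)
    x_hat += K * (y - x_hat)
    P = (1 - K) * P

# The error variance P converges to the fixed point of the scalar
# Riccati recursion, illustrating the steady-state behavior in topic 3.3.
print(P)
```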
  4. Estimation of Hidden Markov Models
    1. Markov chains observed in noise
    2. MAP estimation and maximum likelihood sequence estimation, Viterbi algorithm
    3. Applications
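The Viterbi algorithm of topic 4 can be sketched for a two-state Markov chain observed in noise; the chain, emission probabilities, and observation sequence below are made-up toy values.

```python
import numpy as np

log_pi = np.log([0.6, 0.4])              # initial state distribution
log_A = np.log([[0.9, 0.1],              # transition probabilities
                [0.2, 0.8]])
log_B = np.log([[0.8, 0.2],              # emission probabilities P(obs | state)
                [0.3, 0.7]])
obs = [0, 0, 1, 1, 1]

# Forward pass: delta[t, i] = best log-probability of any state path
# ending in state i at time t; psi stores the argmax back-pointers.
T, S = len(obs), 2
delta = np.zeros((T, S))
psi = np.zeros((T, S), dtype=int)
delta[0] = log_pi + log_B[:, obs[0]]
for t in range(1, T):
    scores = delta[t - 1][:, None] + log_A   # scores[i, j]: end in j coming from i
    psi[t] = scores.argmax(axis=0)
    delta[t] = scores.max(axis=0) + log_B[:, obs[t]]

# Backtrack the MAP state sequence
path = [int(delta[-1].argmax())]
for t in range(T - 1, 0, -1):
    path.append(int(psi[t][path[-1]]))
path.reverse()
print(path)  # [0, 0, 1, 1, 1]
```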
  5. Adaptive Filtering
    1. Gradient method for FIR filtering
    2. LMS algorithm, convergence and steady-state performance
    3. Method of least-squares and RLS algorithm
    4. Fast and square-root RLS algorithms
    5. Applications: equalization, notch filtering, echo cancellation, antenna beamforming
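The LMS algorithm of topic 5 can be sketched as system identification of an unknown FIR filter; the step size and true coefficients below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
w_true = np.array([0.5, -0.3, 0.2])       # unknown FIR system to identify
mu = 0.05                                 # LMS step size

w = np.zeros(3)                           # adaptive filter coefficients
x_buf = np.zeros(3)                       # input tap-delay line
for _ in range(5000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()      # white input sample
    d = w_true @ x_buf                    # desired (noiseless) response
    e = d - w @ x_buf                     # a priori estimation error
    w += mu * e * x_buf                   # stochastic-gradient (LMS) update

# With white input and noiseless data, w converges to w_true.
print(np.round(w, 3))
```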

Textbook/reading:

  1. M. Hayes, Statistical Digital Signal Processing and Modeling, J. Wiley, 1996.

Instructors: Levy, Ding

THIS COURSE DOES NOT DUPLICATE ANY EXISTING COURSE.

Last revised: Spring 2003