Advanced Topics in Electrical Engineering

Sparse Representations: Theory, Algorithms, and Applications

Due to their wide applicability, sparse and low-rank models for dimensionality reduction have quickly become some of the most important tools for today's researchers in signal processing, machine learning, statistics, optimization, bioinformatics, computer vision, and image processing. Sparse and low-rank modeling tools have been applied to a wide range of problems, including clustering, object recognition and classification, regularized regression, face recognition, robust principal component analysis, deep learning feature selection, natural language processing, compressive sensing, multi-user communication, super-resolution, denoising, wireless communication, and computational neuroscience.

This topics class is intended to bring interested students and researchers from this wide array of disciplines quickly up to speed on the power and broad applicability of sparse models and theory. The ultimate aim of the course is to equip you with the modeling and optimization tools you'll need to formulate and solve problems of interest.

Grading: Grades will be based on presentations of papers, participation, and a project. For the project, students will study a topic beyond the material of the class and provide a report as well as an end-of-semester presentation. The project could be related to your current research or be a paper survey.

Time/Location: MW 3:30-4:45, at both UCSC and Silicon Valley. Videos of the lectures will be available.

Text: Readings will be based on papers and lecture notes; see the Texts section below.

Register: Ask the instructor for a registration code.

My Office Hours: Monday 12:30-1:30, E2 Room 245A; Wednesday 2:30-3:30, SVC.

Syllabus/Topics

·       Introduction and motivating examples. Sparsity; the l0, l1, and l2 norms. Contrast with linear or nonadaptive models.

·       Sparse inverse problems via l1-regularization: basis pursuit, LASSO, the restricted isometry property (standard formulations are written out after this list).

·       Exact recovery of sparse solutions, coherence, large random matrices, performance bounds.

Inexact observations: noisy recovery of compressible signals.

·       Scalable algorithms & greedy methods such as orthogonal matching pursuit (a minimal sketch follows this list).

·       Model selection consistency.

·       Probabilistic tools: Johnson-Lindenstrauss lemma, Kashin's theorem.

·       Matrix rank minimization and low-rank matrix factorization: examples, connections to dictionary learning, nuclear norm minimization algorithms.

·       Graphical models & message passing.

·       Computer vision & neural sparse coding.

·       Other potential topics, depending on time and interest (may be pursued in the projects): group sparsity, graphical model methods, finite rate of innovation, imaging, biomedical applications, connectomics.
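
For reference, the quantities and programs named in the first two bullets, in standard notation (here A is a measurement matrix, y the observed vector, and lambda > 0 a regularization weight):

\[ \|x\|_0 = \#\{\, i : x_i \neq 0 \,\}, \qquad \|x\|_1 = \sum_i |x_i|, \qquad \|x\|_2 = \Big(\sum_i x_i^2\Big)^{1/2} \]

Note that \(\|x\|_0\) simply counts nonzeros and is not a true norm despite the name. Basis pursuit and the LASSO are then

\[ \min_x \|x\|_1 \ \text{ subject to } \ Ax = y \qquad \text{and} \qquad \min_x \tfrac{1}{2}\|Ax - y\|_2^2 + \lambda \|x\|_1 . \]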
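Although the course itself uses MATLAB, the following is a minimal NumPy sketch of orthogonal matching pursuit, the greedy method named above. The function name omp, the tolerance, and the synthetic test are illustrative choices, not part of the course materials; the recovery claim in the final comment holds only under suitable conditions on A (e.g., low coherence) and sufficient sparsity.

import numpy as np

def omp(A, y, k, tol=1e-10):
    # Greedily build a support set for a k-sparse solution of y ≈ A x.
    m, n = A.shape
    x = np.zeros(n)
    residual = y.astype(float).copy()
    support = []
    for _ in range(k):
        # Select the column of A most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit by least squares on the chosen support, then update the residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x[:] = 0.0
        x[support] = coef
        residual = y - A @ x
        if np.linalg.norm(residual) < tol:
            break
    return x

# Quick check on synthetic data: a random Gaussian A with normalized columns
# typically recovers a sufficiently sparse x exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
A /= np.linalg.norm(A, axis=0)
x_true = np.zeros(200)
x_true[[3, 77, 150]] = [1.0, -2.0, 0.5]
x_hat = omp(A, A @ x_true, k=3)
print(np.allclose(x_hat, x_true, atol=1e-8))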

Pre-requisites: To enable students from a range of fields to take this course, the only pre-requisites are linear algebra and basic probability. Students will also need to be comfortable with MATLAB.

Texts: The class will be taught out of recent research papers in the area. The following texts may enhance students' appreciation for the material:
- Elad, Sparse and Redundant Representations
- Matousek, Lectures on Discrete Geometry
- Boyd and Vandenberghe, Convex Optimization

 

Instructors and Assistants