Learning Sparse Representations for Image and Signal Modeling
Academic Year 2018/2019, PhD Course, Politecnico di Milano
Mission:
The main goal of this course is to provide students with an understanding of the most important aspects of the theory underlying sparse representations and, more generally, of sparsity as a form of regularization in learning problems. Students will have the opportunity to develop and understand the main algorithms for learning sparse models and computing sparse representations. These methods have wide applicability in computer science and will provide a useful background for the students' research.
More information is available on the Course program page.
Detailed Program (tentative):
- Basics of linear orthonormal representations
  - Introduction: getting familiar with transform-domain representations and sparsity.
  - Background: linear representations, norms, orthonormal bases/projections, sparsity in the strict ell0 sense.
  - Computer Lab: projection onto an orthonormal basis and computation of sparse solutions; denoising is introduced as a running example (see the thresholding sketch after the program).
- Redundant representations and Sparse Coding (Minimum ell0 norm)
  - Sparse Coding w.r.t. redundant dictionaries. Greedy Algorithms: Matching Pursuit, Orthogonal Matching Pursuit.
  - Computer Lab: Greedy Sparse-Coding Algorithms (see the OMP sketch after the program).
  - An overview of theoretical guarantees and convergence results.
- Convex Relaxation of the Sparse Coding Problem (Minimum ell1 norm)
  - Sparse coding as a convex optimization problem, and its connections with ell0 solutions.
  - BPDN and the LASSO: the two formulations, and ell1 sparsity as a regularization prior in linear regression.
  - Other norms promoting sparsity: visual intuition. Theoretical guarantees.
  - Minimum ell1 Sparse Coding Algorithms: Iterative Reweighted Least Squares, Proximal Methods, Iterative Soft Thresholding, ADMM.
  - Computer Lab: Minimum ell1 Sparse Coding Algorithms (see the ISTA sketch after the program).
- Dictionary Learning
  - Dictionary Learning Algorithms: Gradient Descent, MOD, K-SVD.
  - Task-Driven Dictionary Learning for supervised learning tasks.
  - Computer Lab: Dictionary Learning (see the MOD sketch after the program).
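To give a flavour of the first Computer Lab, here is a minimal NumPy sketch (not official course material) of denoising by hard thresholding in an orthonormal basis. The DCT-II basis, the test signal, the noise level, and the 3-sigma threshold are illustrative choices, not prescriptions from the program.

```python
import numpy as np

def dct_matrix(n):
    # Rows are the orthonormal DCT-II basis vectors of R^n
    k = np.arange(n)[:, None]                 # frequency index
    i = np.arange(n)[None, :]                 # sample index
    B = np.cos(np.pi * (i + 0.5) * k / n)
    B[0] *= np.sqrt(1.0 / n)
    B[1:] *= np.sqrt(2.0 / n)
    return B

rng = np.random.default_rng(0)
n, sigma = 256, 0.3
t = np.linspace(0.0, 1.0, n)
clean = np.sin(2 * np.pi * 3 * t)             # smooth signal, approximately sparse in the DCT basis
noisy = clean + sigma * rng.standard_normal(n)

B = dct_matrix(n)
coeffs = B @ noisy                            # analysis: project onto the orthonormal basis
coeffs[np.abs(coeffs) < 3 * sigma] = 0.0      # hard thresholding keeps the few large coefficients
denoised = B.T @ coeffs                       # synthesis: reconstruct from the sparse coefficients

print("error before/after:", np.linalg.norm(noisy - clean), np.linalg.norm(denoised - clean))
```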
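For the greedy sparse coding lab, a compact sketch of Orthogonal Matching Pursuit over a redundant dictionary might look as follows. The function name `omp`, the dictionary size, and the toy sparse signal are illustrative, and the atoms of the dictionary are assumed to have unit ell2 norm.

```python
import numpy as np

def omp(D, y, k):
    """Orthogonal Matching Pursuit: greedily select at most k atoms of D to approximate y.
    Assumes the columns (atoms) of D have unit ell2 norm."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # 1) sweep: pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # 2) update: re-fit the coefficients on the current support by least squares
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x[support] = coeffs
    return x

# Toy example: recover a 3-sparse vector from a random redundant dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)                # normalize the atoms
x_true = np.zeros(128)
x_true[[5, 40, 99]] = [1.0, -2.0, 0.5]
y = D @ x_true
print(np.flatnonzero(omp(D, y, 3)))           # typically recovers the support {5, 40, 99}
```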
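For the minimum ell1 lab, a minimal sketch of Iterative Soft Thresholding (ISTA) applied to the LASSO objective 0.5*||y - Dx||_2^2 + lam*||x||_1. The step size 1/L, with L the squared spectral norm of D, the iteration budget, and the function names are illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(D, y, lam, n_iter=500):
    """Iterative Soft Thresholding for min_x 0.5 * ||y - D x||_2^2 + lam * ||x||_1."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2    # 1 / Lipschitz constant of the gradient of the smooth part
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)              # gradient of the quadratic data-fit term
        x = soft_threshold(x - step * grad, lam * step)
    return x

# Toy example: the largest coefficients should sit on (approximately) the true support {5, 40, 99}
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(128)
x_true[[5, 40, 99]] = [1.0, -2.0, 0.5]
y = D @ x_true + 0.01 * rng.standard_normal(64)
print(np.flatnonzero(np.abs(ista(D, y, lam=0.05)) > 1e-3))
```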
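For the dictionary learning lab, a sketch of alternating minimization with the MOD update D = Y X^+. It reuses the `omp` function from the greedy sketch above; the random initialization, number of atoms, and iteration budget are illustrative choices.

```python
import numpy as np

def mod_dictionary_learning(Y, n_atoms, k, n_iter=20, seed=0):
    """Alternate OMP sparse coding with the MOD dictionary update D = Y X^+.
    Y holds one training signal per column; `omp` is the sketch defined above."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((Y.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)                       # unit-norm atoms
    for _ in range(n_iter):
        # sparse coding step: code every training signal with at most k atoms
        X = np.column_stack([omp(D, y, k) for y in Y.T])
        # dictionary update step (MOD): least-squares fit D = Y X^+
        D = Y @ np.linalg.pinv(X)
        D /= np.linalg.norm(D, axis=0) + 1e-12           # renormalize the atoms
    return D, X
```

K-SVD differs only in the dictionary update step, which refines one atom at a time via a rank-one SVD of the residual restricted to the signals that use that atom.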
Calendar:
- June 12, 2019 from 10:00 to 13:00
- June 13, 2019 from 10:00 to 13:00
- June 14, 2019 from 10:00 to 13:00
Teaching Materials:
On this website we will upload additional teaching materials before each lecture starts.
First Class: Background on Sparsity and Transform Domain Signal Processing
Second Class: minimum ell0 algorithms, minimum ell1 algorithms
Third Class: Dictionary Learning algorithms