Rigorous optimization recipes for sparse and low rank inverse problems with applications in data sciences

by Anastasios Kyrillidis

Institution: EPFL
Year: 2014
Keywords: Sparse Euclidean projections; sparse linear regression; compressed sensing; affine rank minimization; matrix completion; structured sparsity; convex composite minimization; self-concordance
Record ID: 1091156
Full text PDF: http://infoscience.epfl.ch/record/202053


Many natural and man-made signals can be described as having few degrees of freedom relative to their size, due to natural parameterizations or constraints; examples include bandlimited signals, collections of signals observed from multiple viewpoints in a sensor network, and per-flow traffic measurements of the Internet. Low-dimensional models (LDMs) mathematically capture the inherent structure of such signals via combinatorial and geometric data models, such as sparsity, unions of subspaces, low-rankness, manifolds, and mixtures of factor analyzers, and are emerging to revolutionize the way we treat inverse problems (e.g., signal recovery, parameter estimation, or structure learning) from dimensionality-reduced or incomplete data. Assuming our problem resides in an LDM space, in this thesis we investigate how to integrate such models into convex and non-convex optimization algorithms for significant gains in computational complexity. We mostly focus on two LDMs: $(i)$ sparsity and $(ii)$ low-rankness. We study trade-offs and their implications in order to develop efficient and provable optimization algorithms and, more importantly, to connect convex and combinatorial optimization, enabling cross-pollination of decades of research in both fields.
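To make the two LDMs concrete, the Euclidean projections associated with them both have simple closed forms: projecting onto $k$-sparse vectors keeps the $k$ largest-magnitude entries (hard thresholding), and projecting onto rank-$r$ matrices truncates the SVD (Eckart-Young). A minimal sketch in NumPy, with illustrative function names not taken from the thesis:

```python
import numpy as np

def project_sparse(x, k):
    """Euclidean projection onto k-sparse vectors: keep the k
    largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x, dtype=float)
    idx = np.argpartition(np.abs(x), -k)[-k:]  # indices of the k largest |x_i|
    out[idx] = x[idx]
    return out

def project_low_rank(A, r):
    """Euclidean (Frobenius-norm) projection onto matrices of rank <= r:
    truncated SVD, optimal by the Eckart-Young theorem."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

# Example: a 2-sparse projection keeps only the two dominant entries.
v = np.array([0.5, -3.0, 1.2, 0.1])
print(project_sparse(v, 2))  # -> [ 0.  -3.   1.2  0. ]
```

Both projections are cheap relative to generic convex solvers, which is one reason projected-gradient-style methods over these combinatorial sets can be computationally attractive.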