March 3, 2017

5:00 pm – 6:00 pm

Venue

Clark 314

Talk 2: Theoretical Guarantees for Convolutional Sparse Coding, and a Look into Convolutional Neural Networks

Within the wide field of sparse approximation, convolutional sparse coding (CSC) has gained increasing attention in recent years. It assumes a structured dictionary built as a union of banded circulant matrices. While several works have been devoted to the practical aspects of this model, a systematic theoretical understanding of CSC seems to have been left aside. In this talk I will present a novel analysis of the CSC problem, based on the observation that, while global, this model can be characterized and analyzed locally. By imposing only local sparsity conditions, we show that uniqueness of solutions, stability to noise contamination, and success of pursuit algorithms (both greedy and convex relaxations) are globally guaranteed. These new results are much stronger and more informative than those obtained by deploying the classical sparse theory. Finally, I will briefly present a multi-layer extension of this model and show that it is closely related to convolutional neural networks (CNNs). This connection brings a fresh view to CNNs, as one can attribute theoretical claims to this architecture under simple local sparsity assumptions. This, in turn, sheds light on ways of improving the design and implementation of algorithms for CNNs.
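As a rough illustration of the setting the abstract describes, the sketch below builds a convolutional dictionary as a union of banded circulant matrices (shifted copies of a few short filters) and recovers a sparse code with ISTA, a basic convex-relaxation pursuit. All sizes, the filter bank, and the ISTA parameters are assumed for illustration only; they are not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions for illustration: global signal length N,
# local filter length n, number of filters m.
N, n, m = 64, 8, 4

# Build the convolutional dictionary D as a union of m banded
# circulant matrices: each column is a cyclic shift of one filter.
filters = rng.standard_normal((m, n))
filters /= np.linalg.norm(filters, axis=1, keepdims=True)  # unit-norm atoms

D = np.zeros((N, N * m))
for j in range(m):
    atom = np.zeros(N)
    atom[:n] = filters[j]
    for s in range(N):
        D[:, j * N + s] = np.roll(atom, s)

# Synthesize a signal from a few active atoms (a locally sparse code).
gamma_true = np.zeros(N * m)
support = rng.choice(N * m, size=5, replace=False)
gamma_true[support] = rng.standard_normal(5)
x = D @ gamma_true

# ISTA: iterative soft-thresholding for the lasso (convex relaxation).
lam = 0.05
L = np.linalg.norm(D, 2) ** 2        # Lipschitz constant of the gradient
gamma = np.zeros(N * m)
for _ in range(500):
    grad = D.T @ (D @ gamma - x)
    z = gamma - grad / L
    gamma = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

rel_err = np.linalg.norm(D @ gamma - x) / np.linalg.norm(x)
print(f"relative reconstruction error: {rel_err:.3e}")
```

The recovered code is sparse (soft-thresholding zeroes most coefficients) and reconstructs the signal up to the usual lasso bias; greedy pursuits such as orthogonal matching pursuit could be substituted for the ISTA loop.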