November 24, 2020

12:00 pm – 1:15 pm

Venue

https://wse.zoom.us/j/93732848196?pwd=Vi9HZlRoUytyOHBVYVNkYjdMRjZ0dz09

Recorded Seminar: 
https://wse.zoom.us/rec/share/nZxIgAABKiSv3Czlaah1i9opoaXLt4LypMx9zJIvfofSQg_MdhxRORXkEyrxEEHC.rYkND0jCzQORCyWv?startTime=1606236757000
Poorya Mianjy
PhD Candidate
Department of Computer Science
Johns Hopkins University

Meeting ID: 937 3284 8196
Passcode: clark_hall

Abstract: Dropout is a popular algorithmic regularization technique for training deep neural networks. While it has been shown to be effective across a wide range of machine learning tasks, dropout, like many other popular heuristics in deep learning, lacks a strong theoretical justification. In this talk, we present statistical and computational learning-theoretic guarantees for dropout training in several machine learning models, including matrix sensing, deep linear networks, and two-layer ReLU networks. This talk is based primarily on the following two papers: https://arxiv.org/pdf/2003.03397.pdf and https://arxiv.org/pdf/2010.12711.pdf.
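
For readers unfamiliar with the technique, below is a minimal NumPy sketch of dropout training for a two-layer ReLU network, the setting studied in the talk. It is illustrative only and is not code from the speaker's papers; the data, hyperparameters, and variable names are all synthetic assumptions.

import numpy as np

# Minimal sketch of dropout training for a two-layer ReLU network
# (illustrative only; data and hyperparameters are synthetic).
rng = np.random.default_rng(0)
n, d, h = 200, 10, 50                  # samples, input dim, hidden width
X = rng.normal(size=(n, d))
y = np.sign(X @ rng.normal(size=d))    # synthetic binary labels

W1 = rng.normal(scale=0.1, size=(d, h))
w2 = rng.normal(scale=0.1, size=h)
p, lr = 0.5, 0.1                       # retain probability, learning rate

for step in range(500):
    # Forward pass with dropout on the hidden layer: each hidden unit
    # is kept with probability p and scaled by 1/p ("inverted dropout"),
    # so activations are unbiased in expectation.
    A = np.maximum(X @ W1, 0.0)        # ReLU activations, shape (n, h)
    mask = (rng.random(size=A.shape) < p) / p
    A_drop = A * mask
    err = A_drop @ w2 - y              # squared-loss residual

    # Backward pass: gradients of the mean squared error.
    grad_w2 = A_drop.T @ err / n
    grad_A = np.outer(err, w2) * mask * (A > 0)
    grad_W1 = X.T @ grad_A / n

    w2 -= lr * grad_w2
    W1 -= lr * grad_W1

# At test time dropout is disabled; the 1/p scaling during training
# keeps train- and test-time activations on the same scale.
test_pred = np.maximum(X @ W1, 0.0) @ w2
print("train accuracy:", np.mean(np.sign(test_pred) == y))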

Bio: Poorya Mianjy is a Ph.D. candidate in the Department of Computer Science at Johns Hopkins University, advised by Raman Arora. He is interested in theoretical machine learning and, in particular, the theory of deep learning.