November 24, 2020

12:00 pm – 1:15 pm


Recorded Seminar:
Poorya Mianjy
PhD Candidate
Department of Computer Science
Johns Hopkins University

Join Zoom Meeting

Meeting ID: 937 3284 8196
Passcode: clark_hall

Abstract: Dropout is a popular algorithmic regularization technique for training deep neural networks. While it has been shown effective across a wide range of machine learning tasks, like many other popular heuristics in deep learning, dropout lacks a strong theoretical justification. In this talk, we present statistical and computational learning-theoretic guarantees for dropout training in several machine learning models, including matrix sensing, deep linear networks, and two-layer ReLU networks. This talk is based primarily on the following two papers:

Bio: Poorya Mianjy is a Ph.D. candidate in the Department of Computer Science at Johns Hopkins University, advised by Raman Arora. He is interested in theoretical machine learning and, in particular, the theory of deep learning.