[Theory] 2/27 Talks at TTIC: Jason Lee, USC

Mary Marre via Theory theory at mailman.cs.uchicago.edu
Thu Feb 21 14:07:07 CST 2019


When:     Wednesday, February 27th at 11:00 am

Where:    TTIC, 6045 S Kenwood Avenue, 5th Floor, Room 526

Who:       Jason Lee, USC



Title:       On the Foundations of Deep Learning: SGD, Overparametrization, and Generalization



Abstract:  We provide new results on the effectiveness of SGD and
overparametrization in deep learning.



a) SGD: We show that SGD converges to stationary points for general
nonsmooth, nonconvex functions, and that stochastic subgradients can be
efficiently computed via automatic differentiation. For smooth functions,
we show that gradient descent, coordinate descent, ADMM, and many other
algorithms avoid saddle points and converge to local minimizers. For a
large family of problems, including matrix completion and shallow ReLU
networks, this guarantees that gradient descent converges to a global
minimum.
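
As a toy illustration of the nonsmooth setting (my own sketch, not taken
from the talk, and assuming PyTorch is available): the snippet below runs
minibatch SGD on a one-hidden-layer ReLU network, whose training loss is
nonsmooth and nonconvex in the weights; loss.backward() returns a
stochastic subgradient via reverse-mode automatic differentiation.

import torch
import torch.nn as nn

torch.manual_seed(0)
# Synthetic data and a small ReLU network (illustrative only).
X, y = torch.randn(256, 10), torch.randn(256, 1)
net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.SGD(net.parameters(), lr=0.05)

for step in range(500):
    idx = torch.randint(0, 256, (32,))           # sample a minibatch
    loss = ((net(X[idx]) - y[idx]) ** 2).mean()  # nonsmooth, nonconvex loss
    opt.zero_grad()
    loss.backward()   # automatic differentiation -> stochastic subgradient
    opt.step()        # SGD update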



b) Overparametrization: We show that gradient descent finds global
minimizers of the training loss of overparametrized deep networks in
polynomial time.
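
Results of this kind are often stated schematically as follows (the exact
width, step-size, and constant requirements vary across the relevant
papers): for the squared loss

  L(\theta) = \tfrac{1}{2} \sum_{i=1}^{n} \bigl( f(\theta; x_i) - y_i \bigr)^2,

if the network width exceeds a polynomial in n and 1/\lambda_0, where
\lambda_0 > 0 is the least eigenvalue of a Gram (tangent-kernel) matrix at
initialization, then gradient descent with a sufficiently small step size
\eta satisfies

  L(\theta_t) \le \bigl( 1 - \eta \lambda_0 / 2 \bigr)^{t} \, L(\theta_0) \quad \text{for all } t \ge 0,

i.e. the training loss converges to zero at a linear rate.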



c) Generalization: For general neural networks, we establish a margin-based
theory. The minimizer of the cross-entropy loss with weak regularization is
a max-margin predictor and enjoys stronger generalization guarantees as the
amount of overparametrization increases.
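
Schematically, a margin-based bound of this flavor reads (omitting
constants and the precise complexity measure used in the talk): with
probability at least 1 - \delta over n training samples,

  \mathrm{err}_{\mathrm{test}}(f) \;\lesssim\; \frac{\mathcal{C}(f)}{\gamma \sqrt{n}} + \sqrt{\frac{\log(1/\delta)}{n}},

where \gamma is the normalized training margin attained by the weakly
regularized cross-entropy minimizer and \mathcal{C}(f) is a norm-based
complexity term; wider networks can attain larger normalized margins,
which tightens the bound.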



d) Algorithmic and Implicit Regularization: We analyze the implicit
regularization effects of various optimization algorithms on
overparametrized networks. In particular, we prove that for least squares
with mirror descent, the algorithm converges to the closest solution in
terms of the Bregman divergence. For linearly separable classification
problems, we prove that steepest descent with respect to a norm solves the
SVM problem with respect to the same norm. For overparametrized nonconvex
problems such as matrix sensing or neural networks with quadratic
activations, we prove that gradient descent converges to the minimum
nuclear norm solution, which allows for both meaningful optimization and
generalization guarantees.
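
As a small numerical sanity check (my own, not part of the talk) of the
special case \psi(x) = \|x\|^2 / 2, where the Bregman divergence is the
squared Euclidean distance: gradient descent on an underdetermined
least-squares problem, started at zero, converges to the minimum-norm
interpolating solution, i.e. the one given by the pseudoinverse.

import numpy as np

rng = np.random.default_rng(0)
# 20 equations, 100 unknowns: infinitely many interpolating solutions.
A, b = rng.standard_normal((20, 100)), rng.standard_normal(20)

x = np.zeros(100)                          # initialization x0 = 0
eta = 0.005
for _ in range(2000):
    x -= eta * A.T @ (A @ x - b)           # gradient of 0.5 * ||Ax - b||^2

x_min_norm = np.linalg.pinv(A) @ b         # minimum-norm interpolant
print(np.linalg.norm(A @ x - b))           # ~0: training residual vanishes
print(np.linalg.norm(x - x_min_norm))      # ~0: GD selected the min-norm solution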





Host:  Nathan Srebro <nati at ttic.edu>




Mary C. Marre
Administrative Assistant
Toyota Technological Institute
6045 S. Kenwood Avenue
Room 517
Chicago, IL  60637
p: (773) 834-1757
f: (773) 357-6970
mmarre at ttic.edu

