<div dir="ltr"><div class="gmail_default" style="font-weight:bold"><font color="#000000" face="arial, helvetica, sans-serif">When: <span style="font-weight:400">    Wednesday, February 27th </span><span class="m_-1857413853192218168gmail-m_-7514465738588597227gmail-m_3310145888887649371m_-7279440879016530634gmail-m_5313719385091340337gmail-m_-1668257695080322674gmail-m_-7879518409613487194gmail-m_-5333227643664982572m_2625127627517695854m_2683896348608817813gmail-m_7672563966056633266gmail-m_-6461243813863673855gmail-m_-742000311328020925gmail-m_7559459027998801583gmail-m_4801029585485711767gmail-m_8517121454174849988gmail-m_-6691959996525573090gmail-m_1517372298344856049gmail-m_491069367152086750gmail-m_-8327640324523575189gmail-m_2420618808463760418gmail-m_7960197898027616883gmail-m_8692226636264124041gmail-m_2794822896869921223gmail-m_7508998950622620526gmail-m_-7153355664495542534gmail-il" style="font-weight:400">at</span><span style="font-weight:400"> </span><b>11:00 am</b></font></div><div class="gmail_default"><font color="#000000" face="arial, helvetica, sans-serif"><br></font></div><div class="gmail_default" style="font-weight:bold"><font color="#000000" face="arial, helvetica, sans-serif">Where:<span style="font-weight:400">    </span><span class="m_-1857413853192218168gmail-m_-7514465738588597227gmail-m_3310145888887649371m_-7279440879016530634gmail-m_5313719385091340337gmail-m_-1668257695080322674gmail-m_-7879518409613487194gmail-m_-5333227643664982572m_2625127627517695854m_2683896348608817813gmail-m_7672563966056633266gmail-m_-6461243813863673855gmail-m_-742000311328020925gmail-m_7559459027998801583gmail-m_4801029585485711767gmail-m_8517121454174849988gmail-m_-6691959996525573090gmail-m_1517372298344856049gmail-m_491069367152086750gmail-m_-8327640324523575189gmail-m_2420618808463760418gmail-m_7960197898027616883gmail-m_8692226636264124041gmail-m_2794822896869921223gmail-m_7508998950622620526gmail-m_-7153355664495542534gmail-m_8421504075585210435gmail-m_3262824545120381495gmail-m_-1141671822915777344gmail-m_-7219251726624328345gmail-m_-8588148075564318222gmail-m_-8767966813928691312gmail-m_-1542318334608687154gmail-m_5717104778280916634gmail-m_4845490158781220632gmail-m_5124567205141626540gmail-m_3209361100497750746gmail-m_2953668934074478317gmail-m_-3155518689668024534m_9067904842688472155gmail-m_3071693547520408192gmail-il" style="font-weight:400"><span class="m_-1857413853192218168gmail-m_-7514465738588597227gmail-m_3310145888887649371m_-7279440879016530634gmail-m_5313719385091340337gmail-m_-1668257695080322674gmail-m_-7879518409613487194gmail-m_-5333227643664982572m_2625127627517695854m_2683896348608817813gmail-m_7672563966056633266gmail-m_-6461243813863673855gmail-m_-742000311328020925gmail-m_7559459027998801583gmail-m_4801029585485711767gmail-m_8517121454174849988gmail-m_-6691959996525573090gmail-m_1517372298344856049gmail-m_491069367152086750gmail-m_-8327640324523575189gmail-m_2420618808463760418gmail-m_7960197898027616883gmail-m_8692226636264124041gmail-m_2794822896869921223gmail-m_7508998950622620526gmail-m_-7153355664495542534gmail-il"><span 
class="m_-1857413853192218168gmail-m_-7514465738588597227gmail-m_3310145888887649371m_-7279440879016530634gmail-m_5313719385091340337gmail-m_-1668257695080322674gmail-m_-7879518409613487194gmail-m_-5333227643664982572m_2625127627517695854m_2683896348608817813gmail-m_7672563966056633266gmail-m_-6461243813863673855gmail-m_-742000311328020925gmail-m_7559459027998801583gmail-m_4801029585485711767gmail-il">TTIC</span></span></span><span style="font-weight:400">, 6045 S Kenwood Avenue, 5th Floor, Room 526</span></font></div><div class="gmail_default"><font face="arial, helvetica, sans-serif" color="#000000"><br></font></div><p class="MsoNormal" style="margin:0in 0in 0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><font face="arial, helvetica, sans-serif" color="#000000"><span style="font-weight:bold">Who:</span>       Jason Lee, USC<b><br></b></font></p><p class="MsoNormal" style="margin:0in 0in 0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><br></p><p class="MsoNormal" style="margin:0in 0in 0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><font face="arial, helvetica, sans-serif" color="#000000"><b>Title:       </b>On the Foundations of Deep Learning: SGD, Overparametrization, and Generalization</font></p><p class="MsoNormal" style="margin:0in 0in 0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><font face="arial, helvetica, sans-serif" color="#000000"> </font></p><p class="MsoNormal" style="margin:0in 0in 0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><font face="arial, helvetica, sans-serif" color="#000000"><b>Abstract: </b>We provide new results on the effectiveness of SGD and overparametrization in deep learning.</font></p><p class="MsoNormal" style="margin:0in 0in 0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><font face="arial, helvetica, sans-serif" color="#000000"> </font></p><p class="MsoNormal" style="margin:0in 0in 0.0001pt;line-height:normal;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><font face="arial, helvetica, sans-serif" color="#000000">a) SGD: We show that SGD converges to stationary points for general nonsmooth , nonconvex functions, and that stochastic subgradients can be efficiently computed via Automatic Differentiation. For smooth functions, we show that gradient descent, coordinate descent, ADMM, and many other algorithms, avoid saddle points and converge to local minimizers. 
b) Overparametrization: We show that gradient descent finds global minimizers of the training loss of overparametrized deep networks in polynomial time.
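(A toy demonstration of this phenomenon; the width, sample size, and learning rate below are illustrative assumptions, not the regime analyzed in the talk.)

# Toy illustration: full-batch gradient descent on a heavily overparametrized
# network fits random training data. Width, data size, and learning rate are
# illustrative assumptions.
import torch

torch.manual_seed(0)
n, d, width = 50, 10, 4096        # far more parameters than samples
X = torch.randn(n, d)
y = torch.randn(n, 1)

model = torch.nn.Sequential(
    torch.nn.Linear(d, width),
    torch.nn.ReLU(),
    torch.nn.Linear(width, 1),
)
opt = torch.optim.SGD(model.parameters(), lr=0.05)  # full batch => gradient descent

for step in range(2000):
    opt.zero_grad()
    loss = ((model(X) - y) ** 2).mean()
    loss.backward()
    opt.step()
    if step % 500 == 0:
        print(f"step {step:4d}  train loss {loss.item():.6f}")
# The training loss decreases toward zero, as the theory predicts.

c) Generalization: For general neural networks, we establish a margin-based theory. The minimizer of the cross-entropy loss with weak regularization is a max-margin predictor, and it enjoys stronger generalization guarantees as the amount of overparametrization increases.

d) Algorithmic and Implicit Regularization: We analyze the implicit regularization effects of various optimization algorithms on overparametrized networks. In particular, we prove that for least squares, mirror descent converges to the interpolating solution closest to the initialization in Bregman divergence. For linearly separable classification problems, we prove that steepest descent with respect to a norm solves the SVM with respect to the same norm. For over-parametrized non-convex problems such as matrix sensing or neural networks with quadratic activations, we prove that gradient descent converges to the minimum nuclear norm solution, which allows for both meaningful optimization and generalization guarantees.

(A sketch of the Euclidean special case of the mirror descent result: with potential psi(w) = ||w||^2 / 2, mirror descent reduces to gradient descent and the Bregman divergence to squared Euclidean distance, so gradient descent from zero should recover the minimum-norm interpolator. Problem sizes and step size are illustrative assumptions.)

# Implicit regularization sketch (Euclidean special case of mirror descent):
# gradient descent on underdetermined least squares, started from w0 = 0,
# converges to the interpolating solution of minimum Euclidean norm.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 100                      # underdetermined: many interpolating solutions
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

w = np.zeros(d)                     # initialization w0 = 0
lr = 1e-3
for _ in range(5000):
    w -= lr * X.T @ (X @ w - y)     # gradient step on 0.5 * ||Xw - y||^2

w_min_norm = np.linalg.pinv(X) @ y  # the interpolator closest to w0 in ||.||_2
print(np.linalg.norm(X @ w - y))        # ~0: the training loss is (near) zero
print(np.linalg.norm(w - w_min_norm))   # ~0: GD picked the min-norm solution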
Host: Nathan Srebro (nati@ttic.edu)

--
Alicia McClarin
Toyota Technological Institute at Chicago
6045 S. Kenwood Ave., Office 510
Chicago, IL 60637
773-702-5370
www.ttic.edu