[Colloquium] NOW: 3/8 Talks at TTIC: Yi Zhang, Princeton University

Mary Marre mmarre at ttic.edu
Mon Mar 8 11:04:12 CST 2021


*When:*      Monday, March 8th at *11:10 am CT*



*Where:*     Zoom Virtual Talk (register in advance here:
<https://uchicagogroup.zoom.us/webinar/register/WN_Yxm8COaqRmigedIoz6CZuQ>)



*Who:*       Yi Zhang, Princeton University


*Title:*     Advancing Deep Learning by Integrating Theory and Empirics

*Abstract:* In sharp contrast to its remarkable empirical success, the
mathematical understanding of deep learning is still in its infancy. Various
puzzling behaviors of deep neural nets remain unexplained, and many widely
deployed deep learning systems lack theoretical guarantees. This talk offers
a perspective on the resemblance between deep learning research and the
natural sciences, especially modern physics in its formative stage, when
reciprocal interactions between theoretical and experimental studies fueled
its growth. As the central object of study, deep neural networks are to deep
learning as unknown particles are to physics. We can make significant
progress by building theories out of inspiring empirical observations and
verifying hypotheses with carefully designed experiments.

In this talk, I will present several of my representative works that follow
this research philosophy. First, I will introduce a finite-sample analysis
of Generative Adversarial Networks (GANs) that predicts the existence of
degenerate solutions (i.e., mode collapse), which we confirm empirically
using a principled test. Then I will show how the empirically identified
'noise stability' of deep neural networks trained on real-life data leads to
a substantially stronger generalization measure for deep learning. Finally,
I will describe our recent work on designing a simple test for measuring how
much deep learning has overfitted to standard datasets.
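The announcement does not spell out the test; as a rough sketch only, the
Python below (all names, sizes, and thresholds are illustrative, not the
speaker's code) shows the duplicate-counting idea behind a
birthday-paradox-style check for mode collapse: if a modest batch of
generated samples reliably contains near-duplicate pairs, the generator's
effective support size is roughly the square of that batch size.

    import numpy as np

    def near_duplicate_rate(samples: np.ndarray, threshold: float) -> float:
        """Fraction of sample pairs closer than `threshold` in Euclidean distance.

        `samples` has shape (s, d): s generated images flattened to d pixels.
        A high rate at small s suggests a small effective support.
        """
        s = samples.shape[0]
        # Pairwise squared distances: ||a-b||^2 = ||a||^2 + ||b||^2 - 2<a,b>.
        sq = (samples ** 2).sum(axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * samples @ samples.T
        iu = np.triu_indices(s, k=1)  # index each unordered pair once
        return float((d2[iu] < threshold ** 2).mean())

    # Toy usage: a "collapsed" generator that only ever emits 20 distinct images.
    rng = np.random.default_rng(0)
    modes = rng.normal(size=(20, 64))                 # 20 underlying modes
    picks = modes[rng.integers(0, 20, size=200)]      # 200 samples drawn from them
    fake_batch = picks + 0.01 * rng.normal(size=(200, 64))
    print(near_duplicate_rate(fake_batch, threshold=0.5))  # ~0.05: far above chance

In published versions of such tests, near-duplicate candidates are found in
a learned feature space and verified by eye; raw pixel distance here is only
a stand-in.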
*Bio:* Yi Zhang is a Ph.D. candidate at Princeton University, where he is
advised by Sanjeev Arora. His research interests are broadly in machine
learning, with a focus on understanding the empirical success of deep
learning from a theoretical perspective. He is the recipient of the Wallace
Memorial Fellowship in Engineering (an Honorific Fellowship Award) from
Princeton University.


*Host:* David McAllester <mcallester at ttic.edu>

Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue*
*Room 517*
*Chicago, IL 60637*
*p: (773) 834-1757*
*f: (773) 357-6970*
*mmarre at ttic.edu*

