[Theory] Reminder: 2/1 Talks at TTIC: Surbhi Goel, Microsoft Research NYC

Mary Marre mmarre at ttic.edu
Tue Feb 1 10:01:09 CST 2022


*When:*      Tuesday, February 1st at *11:00 am CT*



*Where:*     Zoom Virtual Talk (*register in advance here
<https://uchicagogroup.zoom.us/webinar/register/WN_sWFGjgjZQcW0ax8O9OcvXw>*)


*Who:*        Surbhi Goel, Microsoft Research NYC


*Title:*        Principled Algorithm Design in the Era of Deep Learning

*Abstract:* Deep learning has seen tremendous growth in the last decade
with applications across almost all fields of science and technology. In
the pursuit of making deep learning methods more efficient and adaptable,
there is an increasing need to design better algorithms and architectures.
In this talk, I will give an overview of my research efforts towards
advancing the statistical and computational foundations of deep learning
with the goal of designing new principled algorithms and models. I will
show how techniques originally developed for classical learning theory and
convex optimization can be combined and extended for the era of deep
learning. I will highlight this through two main contributions:
(1) New algorithms for training basic deep learning architectures that are
simple, computationally efficient, and provably succeed even when the
standard pipelines fail,
(2) A statistical characterization of state-of-the-art attention
architectures, like Transformers, that gives new insights into their
ability to capture long-range dependencies.

*Bio:* Surbhi Goel is currently a postdoctoral researcher at Microsoft
Research NYC with the Machine Learning group. She received her Ph.D. from
the Department of Computer Science at the University of Texas at Austin
where she was advised by Adam Klivans. Her work lies at the intersection of
machine learning and theoretical computer science, with a focus on
developing the statistical and computational foundations of modern machine
learning paradigms, especially deep learning. Among other honors, she is a
recipient of UT Austin's Bert Kay Dissertation award, the J.P. Morgan AI
PhD Fellowship, the Simons-Berkeley Research Fellowship, and several
fellowships from UT Austin. She has also been recognized as a Rising Star
in ML by the University of Maryland and in EECS by UIUC.

*Host:* *Avrim Blum <avrim at ttic.edu>*


Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue*
*Chicago, IL  60637*
*mmarre at ttic.edu <mmarre at ttic.edu>*



