[Theory] REMINDER: 12/14 TTIC Colloquium: Andrew Gordon Wilson, NYU
Mary Marre
mmarre at ttic.edu
Mon Dec 14 10:00:00 CST 2020
*When:* Monday, December 14th at 11:10 am
*Where:* Zoom Virtual Talk (*register in advance here
<https://uchicagogroup.zoom.us/webinar/register/WN_4Spz5uMSTReAa-kCJ8sUlQ>*)
*Who:* Andrew Gordon Wilson, NYU
*Title:* How Do We Build Models That Learn and Generalize?
*Abstract:* To answer scientific questions and reason about data, we must
build models and perform inference within those models. But how should we
approach model construction and inference to make the most successful
predictions? How do we represent uncertainty and prior knowledge? How
flexible should our models be? Should we use a single model, or multiple
different models? Should we follow a different procedure depending on how
much data are available? How do we learn desirable constraints, such as
rotation, translation, or reflection symmetries, when they don't improve
the standard training loss?
In this talk, I will present a philosophy for model construction, grounded
in probability theory. I will exemplify this approach with methods that
exploit loss-surface geometry for scalable and practical Bayesian deep
learning, and with resolutions to seemingly mysterious generalization
behaviour such as double descent. I will also consider prior specification,
generalized Bayesian inference, and automatic symmetry learning. The talk
will primarily be based on https://arxiv.org/abs/2002.08791, and it will
also touch on https://arxiv.org/abs/2002.12880 and
https://arxiv.org/abs/2010.11882.
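For readers new to this perspective, the key object in the first paper above
is the Bayesian model average: rather than predicting with a single weight
setting w, one marginalizes over the posterior,

    p(y | x, D) = ∫ p(y | x, w) p(w | D) dw,

which in practice is approximated by averaging the predictions of several
plausible models. Below is a minimal, hypothetical Python sketch of that
Monte Carlo average; the toy model, input, and "posterior samples" are
placeholders for illustration, not material from the talk.

    import numpy as np

    # Toy sketch: Monte Carlo Bayesian model average,
    #   p(y | x, D) ~= (1/S) * sum_s p(y | x, w_s),
    # where each w_s is one plausible weight setting (e.g., an ensemble
    # member or a sample from an approximate posterior such as SWAG).

    rng = np.random.default_rng(0)

    def predict_probs(weights, x):
        # Class probabilities from a tiny linear model with a softmax.
        logits = x @ weights
        exp = np.exp(logits - logits.max())
        return exp / exp.sum()

    # Pretend these are S = 5 samples from p(w | D).
    posterior_samples = [rng.normal(size=(3, 2)) for _ in range(5)]
    x = np.array([0.5, -1.2, 0.3])  # one input with 3 features

    # The model average is the mean of the per-sample predictives.
    bma = np.mean([predict_probs(w, x) for w in posterior_samples], axis=0)
    print(bma)  # averaged class probabilities; sums to 1

Averaging the predictive distributions, rather than the weights themselves,
is what distinguishes this marginalization from simply training one model.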
*Bio:* Andrew Gordon Wilson is a faculty member in the Courant Institute of
Mathematical Sciences and the Center for Data Science at NYU. Before joining
NYU, he was an assistant professor at Cornell University from 2016 to 2019.
He was a research fellow in the Machine Learning Department at Carnegie
Mellon University from 2014 to 2016, and completed his PhD at the University
of Cambridge in 2014. Andrew's interests include probabilistic modelling,
Gaussian processes, Bayesian statistics, physics-inspired machine learning,
and loss surfaces and generalization in deep learning. His webpage is
https://cims.nyu.edu/~andrewgw.
*Host:* Dougal Sutherland <dougal at ttic.edu>
For more information on the colloquium series or to subscribe to the
mailing list, please see http://www.ttic.edu/colloquium.php
Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue*
*Room 517*
*Chicago, IL 60637*
*p: (773) 834-1757*
*f: (773) 357-6970*
*mmarre at ttic.edu*