[Theory] NOW: 6/3 TTIC Colloquium: Jamie Morgenstern, University of Washington ("PLEASE NOTE SPECIAL TIME")
Mary Marre
mmarre at ttic.edu
Fri Jun 3 12:34:02 CDT 2022
*When:* Friday, June 3rd at *12:30 pm CT*
*Where:* Talk will be given *live, in-person* at
TTIC, 6045 S. Kenwood Avenue
5th Floor, Room 530
*Virtually:* via Panopto *(livestream
<https://uchicago.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=d31590ca-57c1-4b0d-b081-aea6012e2535>)*
*Who:* Jamie Morgenstern, University of Washington
*Title:* Learning from Multiple Data Sources
*Abstract:* Many real-world ML problems violate the assumptions that (1)
training data is generated i.i.d. from a single distribution, and (2) the
training and test distributions are the same. For example, hospitals in
different locations, serving different populations, might each collect data
about their patients. Ideally, these datasets would be combined and used to
train a model that performs well on many distributions. In this talk, I
will describe one approach to formalizing how to use data generated from
multiple distributions, and characterize the test distributions on which we
can guarantee this method will perform well.
[joint work with Christopher Jung and Pranjal Awasthi]
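
For readers new to this setting, here is a minimal toy sketch in Python
(numpy) of the multi-source setup the abstract describes: pool data drawn
from several synthetic source distributions, fit one model, and report
per-source error. This is an illustration only, not the method presented
in the talk; the sources, sizes, and noise levels below are made up.

# Sketch of pooled training over several hypothetical data sources,
# then evaluation of worst-case error across those sources.
import numpy as np

rng = np.random.default_rng(0)

def make_source(mean_shift, n=200, d=5, noise=0.1):
    """Draw (X, y) from one synthetic source distribution."""
    X = rng.normal(loc=mean_shift, scale=1.0, size=(n, d))
    w_true = np.ones(d)
    y = X @ w_true + rng.normal(scale=noise, size=n)
    return X, y

# Three "hospitals" with shifted covariate distributions.
sources = [make_source(s) for s in (-1.0, 0.0, 1.0)]

# Pool all sources and fit ordinary least squares on the combined data.
X_pool = np.vstack([X for X, _ in sources])
y_pool = np.concatenate([y for _, y in sources])
w_hat, *_ = np.linalg.lstsq(X_pool, y_pool, rcond=None)

# Per-source mean squared error; the worst of these upper-bounds the
# error on any mixture of these sources, since mixture error is a
# convex combination of per-source errors.
per_source_mse = [float(np.mean((X @ w_hat - y) ** 2)) for X, y in sources]
print("per-source MSE:", per_source_mse)
print("worst case over mixtures of these sources:", max(per_source_mse))
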
Visit Jamie Morgenstern's Homepage <https://jamiemorgenstern.com/>
*Host:* *Nathan Srebro* <nati at ttic.edu>
***********************************************************************************
For more information on the colloquium series or to subscribe to the
mailing list, please see http://www.ttic.edu/colloquium.php
Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue*
*Chicago, IL 60637*
*mmarre at ttic.edu*