[Theory] REMINDER: 4/2 Talks at TTIC: Dougal Sutherland, University College London

Mary Marre mmarre at ttic.edu
Tue Apr 2 10:06:39 CDT 2019


When:     Tuesday, April 2nd at 11:00 AM

Where:    TTIC, 6045 S Kenwood Avenue, 5th Floor, Room 526

Who:       Dougal Sutherland, University College London

Title:     Kernel Distances for Distinguishing and Sampling from Probability Distributions

Abstract: Probability distributions are the core objects of statistical
machine learning, and one of the most basic quantities we can consider is
the distance between them. In this talk, we consider using these distances
for two important tasks and show how to design distances that are useful
for each. First, we study the problem of two-sample testing, where we wish
to determine whether (and how) two datasets meaningfully differ. We then
apply this framework to training generative models, such as generative
adversarial networks (GANs), which learn to sample from complex
distributions such as those of natural images.

The distances we use are defined in terms of kernels, but we parameterise
these kernels as deep networks for flexibility. This combination gives both
theoretical and practical benefits over staying purely within either
framework, and with our novel Scaled MMD GAN we obtain state-of-the-art
results for unsupervised image generation on CelebA and ImageNet.
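For readers unfamiliar with kernel distances: the distance underlying the Scaled MMD GAN mentioned above is the Maximum Mean Discrepancy (MMD). A minimal numpy sketch of its standard unbiased estimator with a fixed Gaussian kernel (bandwidth chosen arbitrarily here; the talk's approach instead learns the kernel via a deep network) might look like:

```python
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between rows of x and rows of y."""
    sq_dists = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2 * x @ y.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd2_unbiased(x, y, bandwidth=1.0):
    """Unbiased estimate of the squared Maximum Mean Discrepancy."""
    m, n = len(x), len(y)
    k_xx = rbf_kernel(x, x, bandwidth)
    k_yy = rbf_kernel(y, y, bandwidth)
    k_xy = rbf_kernel(x, y, bandwidth)
    # Exclude diagonal (same-point) terms to make the estimator unbiased.
    term_xx = (k_xx.sum() - np.trace(k_xx)) / (m * (m - 1))
    term_yy = (k_yy.sum() - np.trace(k_yy)) / (n * (n - 1))
    return term_xx + term_yy - 2 * k_xy.mean()

rng = np.random.default_rng(0)
same = mmd2_unbiased(rng.normal(0, 1, (500, 2)), rng.normal(0, 1, (500, 2)))
diff = mmd2_unbiased(rng.normal(0, 1, (500, 2)), rng.normal(1, 1, (500, 2)))
print(same, diff)  # near zero for same distribution, clearly positive otherwise
```

A two-sample test rejects the null hypothesis (same distribution) when this statistic exceeds a threshold; an MMD GAN trains a generator to drive it toward zero.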


Host: Karen Livescu <klivescu>



Mary C. Marre
Administrative Assistant
Toyota Technological Institute
6045 S. Kenwood Avenue
Room 517
Chicago, IL 60637
p: (773) 834-1757
f: (773) 357-6970
mmarre at ttic.edu

