[Colloquium] REMINDER: 10/18 TTIC Colloquium: Kristina Toutanova, Google Research

Mary Marre mmarre at ttic.edu
Mon Oct 18 10:21:26 CDT 2021


*When:*      Monday, October 18th, 2021 at *11:00 am CT*



*Where:*     *Zoom Virtual Talk* (*register in advance here
<https://uchicagogroup.zoom.us/webinar/register/WN_9BoL72NZT4yXREVsKbzuCA>*)




*Who:*        Kristina Toutanova, Google Research


*Title:*        Advances and Limitations in Generalization via
Self-Supervised Pretrained Representations

*Abstract:*  Pretrained neural representations, learned from unlabeled
text, have recently led to substantial improvements across many natural
language problems. Yet some components of these models remain brittle
and heuristic, and sizable amounts of human-labeled data are typically
needed to obtain competitive performance on end tasks.

I will first talk about recent advances from our team leading to (i)
improved multi-lingual generalization and ease of use through
tokenization-free pretrained representations and (ii) better few-shot
generalization for underrepresented task categories via neural language
model-based example extrapolation. I will then point to limitations of
generic pre-trained representations when tasked with handling both language
variation and out-of-distribution compositional generalization, and the
relative performance of induced symbolic representations.

*Bio:* Kristina Toutanova is a research scientist at Google Research in
Seattle and an affiliate faculty member at the University of Washington.
She obtained her Ph.D. from the Computer Science Department at Stanford
University with Christopher Manning, and her MSc in Computer Science from
Sofia University, Bulgaria. Prior to joining Google in 2017, she was a
researcher at Microsoft Research, Redmond. Kristina focuses on modeling the
structure of natural language using machine learning, most recently in the
areas of representation learning, question answering, information retrieval,
and semantic parsing. Kristina is a past co-editor-in-chief of TACL, a
program co-chair for ACL 2014, and general chair for NAACL 2021.

*Host:* *Sam Wiseman* <swiseman at cs.duke.edu>


For more information on the colloquium series or to subscribe to the
mailing list, please see http://www.ttic.edu/colloquium.php



Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue*
*Chicago, IL  60637*
*mmarre at ttic.edu <mmarre at ttic.edu>*
