[Theory] REMINDER: 8/19 Thesis Defense: Hai Wang, TTIC
Mary Marre
mmarre at ttic.edu
Wed Aug 12 15:51:45 CDT 2020
*Thesis Defense: Hai Wang, TTIC*
*When:* Wednesday, August 19th, at 3:00 pm CT
*Where:* Virtually
<https://uchicagostudents.zoom.us/j/95703570779?pwd=ZmpZWGJvZU1ITHRKc0h3UnJvb0pCUT09>
*Who:* Hai Wang, TTIC
*Title:* Knowledge Efficient Deep Learning for Natural Language Processing
*Abstract:*
Deep learning has become the workhorse for a wide range of natural language
processing applications, but much of its success relies on annotated
examples, which are time-consuming and expensive to produce at scale. Here
we are interested in methods for reducing the required quantity of
annotated data by making learning methods more knowledge efficient, so
that they are more applicable in low-annotation (low-resource) settings.
There are various classical approaches to making models more knowledge
efficient, such as multi-task learning, transfer learning, and weakly
supervised and unsupervised learning. This thesis focuses on adapting such
classical methods to modern deep learning models and algorithms.
I describe four works aimed at making machine learning models more
knowledge efficient. First, we propose the knowledge-rich deep learning
model (KRDL) as a unifying framework for incorporating prior knowledge
into deep models; in particular, we apply KRDL, built on Markov logic
networks, to denoise weak supervision. Second, we apply a KRDL model to
help machine reading models find the correct evidence sentences that
support their decisions. Third, we investigate knowledge transfer
techniques in multilingual settings and propose a method that improves
pre-trained multilingual BERT using a bilingual dictionary. Fourth, we
present an episodic memory network for language modeling, in which we
encode large-scale external knowledge for a pre-trained GPT.
*Thesis Advisor:* David McAllester <mcallester at ttic.edu>
******************************************************************************************************
Zoom link for the virtual presentation:
https://us02web.zoom.us/j/88466082733?pwd=SjdUOERBaDdNb1FQbHgxMFdmWFgrUT09
Meeting ID: 884 6608 2733; Passcode: 831070
Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue*
*Room 517*
*Chicago, IL 60637*
*p: (773) 834-1757*
*f: (773) 357-6970*
*mmarre at ttic.edu*