[Colloquium] Re: talk by Sarath Chandar this Friday at 11 am

Karl Stratos via Colloquium colloquium at mailman.cs.uchicago.edu
Thu Jun 29 16:01:54 CDT 2017


Just a friendly reminder about the talk tomorrow at 11.

Best,
Karl

On Tue, Jun 27, 2017 at 11:26 AM, Karl Stratos <stratos at ttic.edu> wrote:

> Hi all,
>
> Sarath Chandar from the University of Montreal is visiting TTIC this
> Friday. His talk is at 11 am and he will be around for the day until 5 pm.
>
> Please let me know if you'd like to meet with the speaker before/after the
> talk, and if you'd like to join for lunch.
>
>
>
> When:     Friday, June 30th at 11:00 am
>
> Where:    TTIC, 6045 S Kenwood Avenue, 5th Floor, Room 526
>
> Who:      Sarath Chandar, University of Montreal
>
> *Title:* Memory Augmented Neural Networks
>
>
> *Abstract:*
>
> Designing general-purpose learning algorithms is a long-standing goal
> of artificial intelligence. A general-purpose AI agent should have a
> memory in which it can store information and from which it can retrieve
> it. Despite the success of deep learning in this area, in particular with
> the introduction of LSTMs and GRUs, a set of complex tasks remains
> challenging for conventional neural networks. These tasks often require a
> neural network to be equipped with an explicit, external memory in which a
> larger, potentially unbounded, set of facts needs to be stored. They
> include, but are not limited to, reasoning, planning, episodic
> question answering, and learning compact algorithms. Recently, two
> promising neural-network-based approaches to this type of task have been
> proposed: Memory Networks and Neural Turing Machines.
>
>
>
> In this talk, I will give an overview of this new paradigm of "neural
> networks with memory". I will present a unified architecture for Memory
> Augmented Neural Networks (MANNs) and discuss the ways in which one can
> address the external memory and hence read from and write to it. In the
> second half of the talk, we will focus on recent advances in MANNs that
> address the following questions: How can we read from and write to an
> extremely large memory in a scalable way? How can we design efficient
> non-linear addressing schemes using hard attention? How can we model
> long-term dependencies in a problem using MANNs? The answer to each of
> these questions introduces a variant of MANNs. I will conclude the talk
> with several open challenges for MANNs.
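>
> As background, here is a minimal sketch of content-based (soft)
> addressing, the attention-weighted read mechanism underlying both Memory
> Networks and Neural Turing Machines. The numpy code below is illustrative
> only and is not taken from the speaker's work:
>
>     import numpy as np
>
>     def softmax(x):
>         # Numerically stable softmax over a vector of scores.
>         e = np.exp(x - np.max(x))
>         return e / e.sum()
>
>     def content_based_read(memory, query):
>         # memory: (num_slots, dim) array of stored facts.
>         # query:  (dim,) controller state used to address the memory.
>         scores = memory @ query      # similarity of the query to each slot
>         weights = softmax(scores)    # soft address (attention) over slots
>         return weights @ memory      # attention-weighted read vector
>
>     # Toy usage: 8 memory slots, each holding a 16-dimensional vector.
>     memory = np.random.randn(8, 16)
>     query = np.random.randn(16)
>     read_vector = content_based_read(memory, query)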
>
>
>
> *Speaker Bio:* Sarath Chandar is currently a PhD student at the University
> of Montreal under the supervision of Yoshua Bengio and Hugo Larochelle. His
> work mainly focuses on deep learning for complex NLP tasks such as question
> answering and dialog systems. He also investigates scalable training
> procedures and memory access mechanisms for memory network architectures.
> In the past, he has worked on multilingual representation learning and
> transfer learning across multiple languages. His research interests include
> Machine Learning, Natural Language Processing, Deep Learning, and
> Reinforcement Learning. Before joining the University of Montreal, he was a
> Research Scholar at IBM Research India for a year. He completed his MS by
> Research at IIT Madras. To view the complete publication list and speaker
> profile, please visit: http://sarathchandar.in/
>
> Host: Karl Stratos
>