[Theory] TOMORROW: [Talks at TTIC] 1/25 Young Researcher Seminar Series: Enric Boix, MIT
Brandie Jones
bjones at ttic.edu
Tue Jan 24 12:00:00 CST 2023
*When:* Wednesday, January 25th at 10:30 AM CT
*Where:* Talk will be given live, in person at
TTIC, 6045 S. Kenwood Avenue
5th Floor, Room 530
*Virtually:* via Panopto (Livestream
<https://uchicago.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=b17f4b04-6a8f-4a69-b848-af8d00f17c6e>
)
*Who:* Enric Boix, MIT
*Title:* The staircase property and the leap complexity
*Abstract:* Which functions f : {+1,-1}^d \to \R can neural networks learn
when trained with SGD? In this talk, we will consider functions that depend
only on a small number of coordinates. We will study the dynamics of
two-layer neural networks in the mean-field parametrization, trained by SGD
with O(d) samples, and will show that a hierarchical property, the
“merged-staircase property”, is both necessary and nearly sufficient
for learning in this setting. We will use this to propose a notion of “leap
complexity” for the more general setting with O(d^c) samples. Finally, we
will discuss how the necessity of low leap complexity extends to networks
deeper than two layers.
Joint work with Emmanuel Abbe and Theodor Misiakiewicz.
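(As a concrete, unofficial illustration of the notions in the abstract — not the speakers' code or definitions: a "staircase" function on the hypercube {+1,-1}^d is one whose monomials each add only one new coordinate to the union of the earlier ones, and the "leap" measures the largest number of new coordinates any monomial introduces at once. The function and helper names below are my own.)

```python
# Hedged sketch: f(x) = x1 + x1*x2 + x1*x2*x3 is a classic staircase
# function on {+1,-1}^d; each monomial extends the previous support
# by exactly one coordinate.
def staircase(x):
    # x: sequence of +/-1 entries; only the first 3 coordinates matter.
    return x[0] + x[0] * x[1] + x[0] * x[1] * x[2]

def leap(monomials):
    # "Leap" of an ordered list of monomials (given as sets of
    # coordinate indices): the largest number of coordinates any
    # monomial introduces beyond the union of all earlier ones.
    seen, worst = set(), 0
    for m in monomials:
        worst = max(worst, len(set(m) - seen))
        seen |= set(m)
    return worst

print(leap([{0}, {0, 1}, {0, 1, 2}]))  # staircase structure: leap 1
print(leap([{0, 1, 2}]))               # lone degree-3 parity: leap 3
```

Under this toy encoding, a staircase function has leap 1, while an isolated high-degree parity (which is hard for SGD in this regime) has leap equal to its degree.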
*Bio:* Enric is a fourth-year PhD student in the EECS department at MIT,
advised by Guy Bresler and Philippe Rigollet. He received his undergraduate
degree in mathematics from Princeton University, where he was advised by
Emmanuel Abbe. His interests include learning theory, average-case
complexity, high-dimensional statistics, and optimal transport.
--
*Brandie Jones *
*Executive Administrative Assistant*
Toyota Technological Institute
6045 S. Kenwood Avenue
Chicago, IL 60637
www.ttic.edu
Working Remotely on Tuesdays