[Theory] REMINDER: 1/27 Talks at TTIC: Melanie Weber, Princeton University

Mary Marre mmarre at ttic.edu
Tue Jan 26 18:05:44 CST 2021


*When:*      Wednesday, January 27th at *11:10 am CT*



*Where:*     Zoom Virtual Talk (*register in advance here
<https://uchicagogroup.zoom.us/webinar/register/WN_R4zb27lbQG-o0ZlYZM2g2g>*)



*Who: *       Melanie Weber, Princeton University

*Title:*  Geometric Methods for Machine Learning and Optimization

*Abstract: *Many machine learning applications involve non-Euclidean data,
such as graphs, strings, or matrices. In such cases, exploiting Riemannian
geometry can deliver algorithms that are computationally superior to
standard (Euclidean) approaches. This has led to growing interest in
Riemannian methods within the machine learning community. In this talk, I
will present two lines of work that utilize Riemannian methods in machine
learning. First, we consider the task of learning a robust classifier in
hyperbolic space. Such spaces have seen a surge of interest for
representing large-scale, hierarchical data, because they achieve better
representation accuracy with fewer dimensions. We consider an adversarial
approach for learning a robust large-margin classifier that is provably
efficient, and we discuss conditions under which such hyperbolic methods
are guaranteed to outperform their Euclidean counterparts. Second, we
introduce Riemannian Frank-Wolfe (RFW) methods for constrained
optimization on manifolds. Here, we discuss matrix-valued tasks for which
RFW improves on classical Euclidean approaches, including the computation
of Riemannian centroids and the synchronization of data matrices.
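
As a point of reference for the first line of work: hyperbolic classifiers
operate on geodesic distances in a model of hyperbolic space such as the
Poincaré ball. The following is a minimal Python sketch of the standard
Poincaré-ball distance formula, included only for orientation; the
classifier construction itself is the subject of the talk.

    import numpy as np

    def poincare_distance(u, v, eps=1e-9):
        # Geodesic distance between points u, v in the unit ball (||u||, ||v|| < 1):
        # d(u, v) = arccosh(1 + 2 ||u - v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))
        sq_dist = np.sum((u - v) ** 2)
        denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
        return float(np.arccosh(1.0 + 2.0 * sq_dist / max(denom, eps)))

    # Example: the origin and the point (0.5, 0) are at distance ln(3) ~ 1.099.
    print(poincare_distance(np.zeros(2), np.array([0.5, 0.0])))

Distances blow up near the boundary of the ball, which is what allows
tree-like, hierarchical data to be embedded with low distortion in few
dimensions.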
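For the second line of work, one of the matrix-valued tasks mentioned above
is the Riemannian centroid (Fréchet/Karcher mean) of symmetric positive
definite matrices. The sketch below computes it with a simple fixed-point
iteration under the affine-invariant metric; it is meant only to illustrate
the objective that RFW targets, not the RFW algorithm presented in the
talk, which instead relies on a linear minimization oracle over the
constraint set.

    import numpy as np
    from scipy.linalg import expm, logm, sqrtm, inv

    def karcher_mean(mats, iters=50, step=1.0):
        # Fréchet (Karcher) mean of SPD matrices A_1, ..., A_n under the
        # affine-invariant metric, via a fixed-point (Riemannian gradient) iteration.
        X = sum(mats) / len(mats)          # initialize at the Euclidean mean
        for _ in range(iters):
            R = sqrtm(X)                   # X^{1/2}
            R_inv = inv(R)                 # X^{-1/2}
            # Average the data matrices in the tangent space at X ...
            T = sum(logm(R_inv @ A @ R_inv) for A in mats) / len(mats)
            # ... and map back to the SPD manifold via the exponential map.
            X = np.real(R @ expm(step * T) @ R)
        return X

    # Example: for two commuting SPD matrices the centroid is their geometric mean,
    # here diag(2, 1).
    print(karcher_mean([np.diag([1.0, 2.0]), np.diag([4.0, 0.5])]))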

*Bio: *Melanie is a PhD student at Princeton University, where she is
advised by Charles Fefferman. Her research focuses on understanding the
geometric features of data mathematically and on developing machine
learning methods that utilize this knowledge. Prior to her PhD, she
received undergraduate degrees in Mathematics and Physics from the
University of Leipzig in Germany. She also spent time at MIT’s Laboratory
for Information and Decision Systems, the Max Planck Institute for
Mathematics in the Sciences, and the research labs of Facebook and Google.

*Host:* David McAllester <mcallester at ttic.edu>


Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue*
*Room 517*
*Chicago, IL  60637*
*p: (773) 834-1757*
*f: (773) 357-6970*
*mmarre at ttic.edu <mmarre at ttic.edu>*

