[Colloquium] Reminder - Horace Pan Candidacy Exam/Dec 7, 2021

meganwoodward at uchicago.edu
Tue Dec 7 08:26:42 CST 2021


This is an announcement of Horace Pan's Candidacy Exam.
===============================================
Candidate: Horace Pan

Date: Tuesday, December 07, 2021

Time: 12:30 pm CST

Remote Location: https://uchicago.zoom.us/j/97936136167?pwd=U2N1MXZkVDBKQ2dma3BSQ0p0NWFHQT09 Meeting ID: 979 3613 6167 Passcode: 665743

Title: Leveraging Structure and Symmetry in Machine Learning

Abstract: In this thesis we describe three separate works: higher-order permutation equivariant layers for neural networks, Fourier bases for reinforcement learning over combinatorial puzzles, and the Multiscale Laplacian Graph kernel.

Recent work on permutation equivariant neural networks has mostly focused on the first-order case (sets) and the second-order case (graphs). We describe the machinery for generalizing permutation equivariance to arbitrary k-ary interactions and provide a systematic approach for efficiently computing these k-th order permutation equivariant layers and enumerating all the intermediate operations involved.
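
As a toy illustration of the first-order (set) case that this construction generalizes, the following NumPy sketch implements the standard two-basis permutation equivariant linear layer (identity plus mean-pooling, as in DeepSets) and checks equivariance numerically. This is not the thesis's k-th order machinery; all names and shapes here are invented for the example.

    import numpy as np

    def equivariant_layer(x, w1, w2, b):
        # First-order permutation equivariant linear layer:
        #   f(x) = x @ w1 + mean(x) @ w2 + b
        # Permuting the rows of x permutes the rows of the output.
        pooled = x.mean(axis=0, keepdims=True)   # permutation-invariant summary
        return x @ w1 + pooled @ w2 + b

    rng = np.random.default_rng(0)
    n, d_in, d_out = 5, 3, 4
    x = rng.normal(size=(n, d_in))
    w1 = rng.normal(size=(d_in, d_out))
    w2 = rng.normal(size=(d_in, d_out))
    b = rng.normal(size=(d_out,))

    perm = rng.permutation(n)
    out = equivariant_layer(x, w1, w2, b)
    out_perm = equivariant_layer(x[perm], w1, w2, b)
    assert np.allclose(out[perm], out_perm)      # equivariance holds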

Permutation puzzles such as the Rubik's Cube have traditionally been solved by heuristic search methods such as A* search and, more recently, by value-based reinforcement learning. Both approaches reduce to learning a heuristic/value function that decides which puzzle move to make at each step. We propose learning a value function using the irreducible representation basis (which we will also call the Fourier basis) of the puzzle's underlying group. Classical Fourier analysis on real-valued functions tells us we can approximate smooth functions with low-frequency basis functions. Similarly, smooth functions on finite groups can be represented by the analogous low-frequency Fourier basis functions.
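
For intuition only, here is a minimal sketch of the classical abelian analogue on the cyclic group Z_n, where the low-frequency Fourier basis functions are the familiar sines and cosines: a smooth "value" function is recovered by linear regression on just a few low frequencies. The thesis concerns non-abelian puzzle groups and their irreducible representations, which this sketch does not implement; all names here are illustrative.

    import numpy as np

    # Toy analogue on Z_n: approximate a smooth "value" function
    # using only the lowest-frequency Fourier basis functions.
    n = 60
    g = np.arange(n)                                  # group elements 0..n-1
    value = np.cos(2 * np.pi * g / n) + 0.3 * np.cos(4 * np.pi * g / n)

    def fourier_features(g, n, max_freq):
        # Real Fourier basis up to max_freq (the "low frequencies").
        cols = [np.ones_like(g, dtype=float)]
        for k in range(1, max_freq + 1):
            cols.append(np.cos(2 * np.pi * k * g / n))
            cols.append(np.sin(2 * np.pi * k * g / n))
        return np.stack(cols, axis=1)

    X = fourier_features(g, n, max_freq=2)            # 5 basis functions
    coef, *_ = np.linalg.lstsq(X, value, rcond=None)  # linear value fit
    print(np.abs(X @ coef - value).max())             # near-zero residual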
                                                                                                       
Many graphs exhibit structure at multiple scales, yet most existing kernels between graphs are either purely local or purely global in character. In contrast, by building a hierarchy of subgraphs, the Multiscale Laplacian Graph (MLG) kernel can account for structure at a range of different scales. At the heart of the MLG construction is another new graph kernel, the Feature space Laplacian Graph (FLG) kernel, which has the crucial property that it can lift a base kernel defined on the vertices of two graphs to a kernel between the graphs themselves. The MLG kernel applies the FLG kernel to subgraphs recursively. To make the MLG kernel computationally feasible, we also introduce a randomized projection procedure, similar to the Nyström method, but for RKHS operators.
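
As a rough sketch of the Gaussian/Bhattacharyya idea behind an FLG-style comparison, under simplifying assumptions (equal-size graphs, no vertex base kernel, no feature-space lifting, so the toy below is not permutation-invariant and is not the actual FLG or MLG construction): each graph is modeled as a zero-mean Gaussian with covariance (L + eta*I)^(-1), and the two Gaussians are compared with the Bhattacharyya kernel.

    import numpy as np

    def graph_laplacian(a):
        # Unnormalized graph Laplacian L = D - A.
        return np.diag(a.sum(axis=1)) - a

    def flg_like_kernel(a1, a2, eta=0.1):
        # Each graph -> zero-mean Gaussian with covariance S = (L + eta I)^(-1);
        # compare the Gaussians with the Bhattacharyya kernel, which equals 1
        # when the two graphs are identical.
        s1 = np.linalg.inv(graph_laplacian(a1) + eta * np.eye(len(a1)))
        s2 = np.linalg.inv(graph_laplacian(a2) + eta * np.eye(len(a2)))
        mid = (np.linalg.inv(s1) + np.linalg.inv(s2)) / 2.0
        return (np.linalg.det(mid) ** -0.5
                / (np.linalg.det(s1) ** 0.25 * np.linalg.det(s2) ** 0.25))

    # Two 4-vertex graphs: a path and a cycle.
    path = np.array([[0, 1, 0, 0],
                     [1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [0, 0, 1, 0]], dtype=float)
    cycle = np.array([[0, 1, 0, 1],
                      [1, 0, 1, 0],
                      [0, 1, 0, 1],
                      [1, 0, 1, 0]], dtype=float)
    print(flg_like_kernel(path, path))    # 1.0 for identical graphs
    print(flg_like_kernel(path, cycle))   # below 1.0 for differing graphs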

Advisor: Risi Kondor

Committee Members: Risi Kondor, Eric Jonas, and Yuxin Chen


