[CS] Ruoxi Jiang Candidacy Exam/July 24th, 2024

This is an announcement of Ruoxi Jiang's Candidacy Exam.
===============================================
Candidate: Ruoxi Jiang

Date: Wednesday, July 24th, 2024

Time: 11am CT

Location: JCL 298

Title: Contrastive representations in dynamical systems

Abstract: In this talk, focusing on dynamical systems, I will discuss how to learn physically meaningful representations, along with their interpretations and applications in understanding and predicting the behavior of complex physical systems. Specifically, we will cover two tasks: simulation-based inference for parameter estimation and learning neural operators.

In the absence of an analytic statistical model, modern simulation-based inference (SBI) approaches leverage the ability to simulate complex systems to draw inferences about their underlying processes, with some using flow-based models to enhance their inference capabilities. However, parameter inference for dynamical systems such as weather and climate models remains difficult due to the high-dimensional nature of the data and the complexity of the physical models and simulations. We introduce Embed and Emulate, a new likelihood-free inference method, based on contrastive learning, for estimating arbitrary parameter posteriors. This approach learns a low-dimensional embedding of the data and a corresponding fast emulator in the embedding space, bypassing the need to run expensive simulations or high-dimensional emulators during inference. Theoretically, the symmetric contrastive objectives of Embed and Emulate ensure robust recovery of the normalization constant of the posterior. Our empirical experiments on a realistic multimodal parameter estimation task using the high-dimensional, chaotic Lorenz 96 system demonstrate its effectiveness and accuracy in handling complex dynamical systems.
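(For readers unfamiliar with symmetric contrastive objectives, here is a minimal PyTorch sketch of the kind of loss described above. It reflects my own assumptions about the setup, not the actual Embed and Emulate implementation: matched pairs of a simulated-trajectory embedding and the embedding of the parameters that generated it act as positives, and all other pairings in the batch act as negatives; the encoders, batch construction, and temperature are hypothetical.)

import torch
import torch.nn.functional as F

def symmetric_contrastive_loss(z_data, z_param, temperature=0.1):
    # z_data:  (B, d) embeddings of simulated trajectories
    # z_param: (B, d) embeddings of the parameters that generated them
    z_data = F.normalize(z_data, dim=-1)
    z_param = F.normalize(z_param, dim=-1)
    # (B, B) similarity matrix; the diagonal holds the matched pairs.
    logits = z_data @ z_param.t() / temperature
    targets = torch.arange(z_data.size(0), device=z_data.device)
    # Symmetrize: classify data -> parameters and parameters -> data.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))

(Under this reading, the parameter-side network can play the role of the fast emulator in the embedding space mentioned above, so inference no longer needs to call the expensive simulator.)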

We also demonstrate applications of representation learning in training neural operators, where uncovering patterns and structure in complex systems can lead to more accurate predictions and a deeper understanding of physical phenomena. Neural operators trained to minimize squared-error losses often fail to reproduce statistical or structural properties of the dynamics over longer time horizons and can yield degenerate results. To address this challenge and enable reliable long-horizon forecasts, we propose an alternative framework designed to preserve invariant measures of chaotic attractors, which characterize the time-invariant statistical properties of the dynamics. Specifically, in the multi-environment setting (where each sample trajectory is governed by slightly different dynamics), we show that a contrastive learning framework, which does not require any specialized prior knowledge, can preserve these statistical properties of the dynamics.
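(Again only as an illustrative sketch under my own assumptions, not the proposed framework's actual implementation: one simple way to combine a squared-error forecasting loss with a contrastive term in the multi-environment setting is to treat two disjoint segments of trajectories governed by the same dynamics as a positive pair and segments from other environments as negatives. The operator, encoder, segment shapes, and weighting below are hypothetical.)

import torch
import torch.nn.functional as F

def contrastive_env_loss(z_a, z_b, temperature=0.1):
    # z_a, z_b: (E, d) embeddings of two disjoint trajectory segments per
    # environment; row i of each tensor comes from environment i.
    z_a = F.normalize(z_a, dim=-1)
    z_b = F.normalize(z_b, dim=-1)
    logits = z_a @ z_b.t() / temperature
    targets = torch.arange(z_a.size(0), device=z_a.device)
    # Segments governed by the same dynamics are pulled together; segments
    # from different environments are pushed apart.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))

def training_loss(operator, encoder, x_t, x_next, seg_a, seg_b, weight=1.0):
    # One-step squared-error prediction loss for the neural operator, plus
    # the contrastive regularizer on trajectory-segment embeddings.
    pred_loss = F.mse_loss(operator(x_t), x_next)
    ctr_loss = contrastive_env_loss(encoder(seg_a), encoder(seg_b))
    return pred_loss + weight * ctr_loss

(The intent of such a term is to expose environment-level, time-invariant structure that a pure squared-error objective tends to wash out over long rollouts.)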


Advisor: Rebecca Willett

Committee members: Rebecca Willett, Michael Maire, Yuxin Chen

