[CS] Daniel Grzenda MS Presentation, Aug 25, 2025
cs at mailman.cs.uchicago.edu
Mon Aug 11 14:11:14 CDT 2025
This is an announcement of Daniel Grzenda's MS Presentation
===============================================
Candidate: Daniel Grzenda
Date: Monday, August 25, 2025
Time: 9:00 am CDT
Remote Location: https://uchicago.zoom.us/j/99397184323?pwd=JGUyq5VQcYZPJcDQR6hqbh8aav8Ted.1
Location: JCL 298
Title: Multivariate Functional Approximation Neural Networks (MFANN)
Abstract: We present the Multivariate Functional Approximation Neural Network (MFANN), an architecture that combines the principles of multivariate functional approximation (MFA) with the iterative optimization techniques common in the neural network (NN) literature. MFA is a data modeling, compression, and visualization tool that uses tensor products of B-spline functions to build continuous, differentiable representations of input data. We extend MFA to use stochastic, iterative mini-batch optimization methods, periodically updating the spline-based models instead of numerically solving for the representation. Through an ablative analysis, we study the differences between these B-spline-based networks and traditional neural networks. Our analysis demonstrates that MFANN is less prone to overfitting, generalizing across multiple resolutions of input data while remaining flexible enough to fit complex analytical functions and real-world scientific data. We empirically compare MFANN to conventional MFA and to multilayer perceptrons (MLPs). Our work highlights MFANN as a promising paradigm for advancing the theory and practice of data-driven function approximation with a new class of neural networks.
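For readers unfamiliar with the approach, the sketch below illustrates the core idea in miniature: the control points of a B-spline model are treated as trainable weights and fit by stochastic mini-batch gradient descent rather than obtained from a direct numerical solve. This is a hypothetical one-dimensional illustration written for this announcement, not the presenter's MFANN code; the function names, hyperparameters, and synthetic target are all assumptions, and the actual work uses multivariate tensor-product splines.

import numpy as np

def bspline_basis(x, knots, degree):
    # Evaluate every B-spline basis function of the given degree at the
    # points x via the Cox-de Boor recursion. Returns an array of shape
    # (len(x), len(knots) - degree - 1).
    x = np.asarray(x, dtype=float)
    m = len(knots) - 1
    B = np.zeros((x.size, m))
    for i in range(m):  # degree 0: indicator of each knot span [t_i, t_{i+1})
        B[:, i] = (knots[i] <= x) & (x < knots[i + 1])
    last = max(i for i in range(m) if knots[i] < knots[i + 1])
    B[x == knots[-1], last] = 1.0  # include the right endpoint
    for p in range(1, degree + 1):  # raise the degree one step at a time
        Bp = np.zeros((x.size, m - p))
        for i in range(m - p):
            left = knots[i + p] - knots[i]
            right = knots[i + p + 1] - knots[i + 1]
            if left > 0:
                Bp[:, i] += (x - knots[i]) / left * B[:, i]
            if right > 0:
                Bp[:, i] += (knots[i + p + 1] - x) / right * B[:, i + 1]
        B = Bp
    return B

rng = np.random.default_rng(0)
degree, n_ctrl = 3, 12
# Clamped uniform knot vector on [0, 1]: repeat each endpoint `degree` extra times.
knots = np.concatenate([[0.0] * degree,
                        np.linspace(0.0, 1.0, n_ctrl - degree + 1),
                        [1.0] * degree])

# Synthetic training data (an assumption): noisy samples of a smooth target.
x_train = rng.uniform(0.0, 1.0, 2000)
y_train = np.sin(2 * np.pi * x_train) + 0.05 * rng.normal(size=x_train.size)

coef = np.zeros(n_ctrl)  # control points, treated as the trainable weights
lr, batch = 0.5, 64
for epoch in range(200):
    order = rng.permutation(x_train.size)
    for start in range(0, x_train.size, batch):
        idx = order[start:start + batch]
        Bb = bspline_basis(x_train[idx], knots, degree)
        resid = Bb @ coef - y_train[idx]
        coef -= lr * 2.0 * (Bb.T @ resid) / idx.size  # gradient of the batch MSE

# The fitted model is a continuous, differentiable spline; evaluate it anywhere.
x_test = np.linspace(0.0, 1.0, 5)
print(bspline_basis(x_test, knots, degree) @ coef)

Under these assumptions each update is simply the MSE gradient of a model that is linear in the basis features; the abstract's comparisons against conventional MFA (a direct solve) and MLPs concern how this trade-off behaves in the full multivariate setting.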
Advisors: Ian Foster, Kyle Chard, and Rana Hanocka
Committee Members: Ian Foster, David Lenz, and Kyle Chard