[Theory] UPDATED: [TTIC Talks] 5/5 Talks at TTIC: Johannes Schmidt-Hieber, University of Twente
Brandie Jones
bjones at ttic.edu
Tue May 2 13:02:15 CDT 2023
*When: * Friday, May 5th at *10:30 AM CT*
*Where: *Talk will be given *live, in-person* at
TTIC, 6045 S. Kenwood Avenue
5th Floor, Room 530
*Virtually: * via Panopto (Livestream
<https://www.google.com/url?q=https://uchicago.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id%3De3aaf8be-0719-4b83-a446-aff50147fa10&sa=D&source=calendar&ust=1683402961318621&usg=AOvVaw3SVc5wQfUSy1EORxGDIn-5>)
*Who: *Johannes Schmidt-Hieber, University of Twente
*Title: *A statistical analysis of an image classification problem
*Abstract: *The availability of massive image databases has resulted in the
development of scalable machine learning methods such as convolutional
neural networks (CNNs) for filtering and processing these data. While
recent theoretical work on CNNs focuses on standard nonparametric denoising
problems, the variability in image classification datasets does not
originate from additive noise but from variation in the shape and other
characteristics of the same object across different images. To address this
problem, we consider a simple supervised classification problem for object
detection on grayscale images. While from the function estimation point of
view, every pixel is a variable and large images lead to high-dimensional
function recovery tasks suffering from the curse of dimensionality,
increasing the number of pixels in our image deformation model enhances the
image resolution and makes the object classification problem easier. We
propose and theoretically analyze two different procedures. The first
method estimates the image deformation by support alignment. Under a
minimal separation condition, it is shown that perfect classification is
possible. The second method fits a CNN to the data. We derive a rate for
the misclassification error depending on the sample size and the number of
pixels. Both classifiers are empirically compared on images generated from
the MNIST handwritten digit database. The obtained results corroborate the
theoretical findings. This is joint work with Sophie Langer (Twente).
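The abstract mentions fitting a CNN to MNIST-style grayscale digit images and
measuring the misclassification error on held-out data. As a concrete point of
reference, below is a minimal sketch of such a CNN baseline in PyTorch; the
architecture, hyperparameters, and training loop are illustrative assumptions
only and not the procedure analyzed in the talk.

# Minimal, illustrative CNN baseline for grayscale digit images (MNIST).
# Architecture and hyperparameters are assumptions for illustration only;
# this is not the speaker's analyzed procedure.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)   # 28x28 -> 28x28
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)  # 14x14 -> 14x14
        self.fc = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # -> 16 x 14 x 14
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # -> 32 x 7 x 7
        return self.fc(x.flatten(1))

def train_and_evaluate(epochs: int = 1, batch_size: int = 128):
    tfm = transforms.ToTensor()
    train_set = datasets.MNIST("data", train=True, download=True, transform=tfm)
    test_set = datasets.MNIST("data", train=False, download=True, transform=tfm)
    train_loader = DataLoader(train_set, batch_size=batch_size, shuffle=True)
    test_loader = DataLoader(test_set, batch_size=256)

    model = SmallCNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(epochs):
        for images, labels in train_loader:
            opt.zero_grad()
            loss = F.cross_entropy(model(images), labels)
            loss.backward()
            opt.step()

    # Misclassification error on the held-out test set.
    model.eval()
    errors = 0
    with torch.no_grad():
        for images, labels in test_loader:
            errors += (model(images).argmax(dim=1) != labels).sum().item()
    return errors / len(test_set)

if __name__ == "__main__":
    print("test misclassification error:", train_and_evaluate())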
*Bio:* Johannes Schmidt-Hieber received his master's degree from the
University of Göttingen, Germany, in 2007, and the joint Ph.D. degree from
the University of Göttingen and the University of Bern, Switzerland, in
2010. His Ph.D. degree was followed by two one-year post-doctoral visits at
Vrije Universiteit Amsterdam, The Netherlands, and ENSAE, Paris, France.
From 2014 to 2018, he was an Assistant Professor at the University of
Leiden. Since 2018, he has been a Full Professor at the University of
Twente, The Netherlands. His research interests are in mathematical
statistics, including nonparametric Bayes and statistical theory for deep
neural networks. He serves as an Associate Editor for the Annals of
Statistics, Bernoulli, Electronic Journal of Statistics, and Information
and Inference.
*Host: Nathan Srebro <nati at ttic.edu>*
--
*Brandie Jones*
*Executive Administrative Assistant*
Toyota Technological Institute
6045 S. Kenwood Avenue
Chicago, IL 60637
www.ttic.edu
Working Remotely on Tuesdays