[Colloquium] Special Program in Learning Theory/Machine Learning Summer School Talks Next Week

Katherine Cumming kcumming at tti-c.org
Thu May 12 14:54:33 CDT 2005


TOYOTA TECHNOLOGICAL INSTITUTE TALKS

Guest Speaker (1)


Speaker:  Peter Bickel, UC Berkeley
Speaker's homepage: http://stat-www.berkeley.edu/users/bickel/


Time:  Monday, May 16th at 2:00pm 
Location: International House-University of Chicago


Title: On the borders of Statistics and Computer Science

Abstract: Machine learning in computer science and prediction and
classification in statistics are essentially equivalent fields. I will
try to illustrate the relation between theory and practice in this huge
area by a few examples and results. In particular I will try to address
an apparent puzzle: worst-case analyses, using empirical process theory,
seem to suggest that even for moderate data dimension and reasonable
sample sizes good prediction (supervised learning) should be very
difficult. On the other hand, practice seems to indicate that even when
the number of dimensions is very much higher than the number of
observations, we can often do very well. We also discuss a new method of
dimension estimation and some features of cross validation. 
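The closing remark on cross validation can be made concrete. The sketch below is not from the talk; the data, the ridge estimator, and the fold count are all illustrative choices. It uses k-fold cross-validation to select a regularization level in a small instance of the p >> n regime the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with more dimensions than observations (p >> n), kept
# small so the sketch runs quickly; only 3 coordinates are informative.
n, p = 40, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = 1.0
y = X @ beta + 0.1 * rng.standard_normal(n)

def ridge_fit(X, y, lam):
    """Ridge regression: solve (X'X + lam I) w = X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def kfold_cv_error(X, y, lam, k=5):
    """Mean squared prediction error estimated by k-fold cross-validation."""
    idx = np.arange(len(y))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        w = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((X[fold] @ w - y[fold]) ** 2))
    return float(np.mean(errs))

# Select the regularization level by cross-validation.
lams = [0.01, 0.1, 1.0, 10.0, 100.0]
scores = {lam: kfold_cv_error(X, y, lam) for lam in lams}
best = min(scores, key=scores.get)
print("CV errors:", {k: round(v, 3) for k, v in scores.items()})
print("selected lambda:", best)
```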


Guest Speaker (2)

Speaker:  Ingo Steinwart, Los Alamos National Laboratory
Speaker's homepage:  http://www.c3.lanl.gov/~ingo/

Time:  Monday, May 16th at 3:00pm
Location: International House-University of Chicago



Title: Some Aspects of Learning Rates for SVMs
Abstract: We present some learning rates for support vector machine
classification. In particular we discuss a recently proposed geometric
noise assumption which allows one to bound the approximation error for
Gaussian RKHSs. Furthermore we show how a noise assumption proposed by
Tsybakov can be used to obtain learning rates between $1/ \sqrt n $ and
$1/n$. Finally, we describe the influence of the approximation error on
the overall learning rate.
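The interpolation the abstract mentions, rates improving from $1/\sqrt n$ toward $1/n$ as the noise assumption strengthens, can be sketched numerically. The exponent formula below is an assumed illustrative interpolation indexed by a noise parameter q, not the exact rate from Steinwart's results.

```python
# Hypothetical rate curve n^{-(q+1)/(q+2)}: at q = 0 it reduces to
# n^{-1/2}, and as q grows it approaches the fast rate n^{-1}.
def rate(n, q):
    return n ** (-(q + 1.0) / (q + 2.0))

n = 10_000
for q in (0, 1, 4, 100):
    print(f"q = {q:3d}: rate ~ n^(-{(q + 1) / (q + 2):.3f}) = {rate(n, q):.2e}")
```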



Guest Speaker (3)

Speaker:  Ding-Xuan Zhou, City University of Hong Kong

Speaker's homepage:  http://www6.cityu.edu.hk/ma/people/dxzhou.html

Time: Monday, May 16th at 4:30pm 
Location:  International House-University of Chicago


Title: Learning Variable Covariances via Gradients

Abstract: Learning theory studies the learning of functional relations
from samples. In this talk we are interested in learning function
gradients from their sample values. A least-squares-type learning
algorithm based on Tikhonov regularization in reproducing kernel
Hilbert spaces is proposed. We show with error bounds that the output
function converges to the gradient of the regression function as the
sample size becomes large. Hence variable selection and estimation of
covariation can be expected. An efficient method is provided to reduce
the size of the linear system when the number of variables is much
larger than the sample size. Some applications of our algorithm to gene
expression analysis will be mentioned. This is joint work with Sayan
Mukherjee. 
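As a rough illustration of the idea: the algorithm in the talk fits gradients directly via a regularized least-squares problem, whereas the simpler sketch below (entirely illustrative; data, kernel, and parameters are assumptions) fits the regression function by Tikhonov-regularized least squares in a Gaussian RKHS, i.e. kernel ridge regression, and then differentiates the learned kernel expansion to rank variables by gradient energy.

```python
import numpy as np

rng = np.random.default_rng(1)

# Regression function depending only on the first coordinate: f(x) = x_0^2.
n, p = 60, 5
X = rng.uniform(-1, 1, size=(n, p))
y = X[:, 0] ** 2 + 0.01 * rng.standard_normal(n)

sigma, lam = 0.5, 1e-3

def gaussian_kernel(A, B):
    """Gaussian kernel matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Tikhonov-regularized least squares in the RKHS: by the representer
# theorem, f(x) = sum_i c_i K(x_i, x) with c = (K + lam n I)^{-1} y.
K = gaussian_kernel(X, X)
c = np.linalg.solve(K + lam * n * np.eye(n), y)

def grad_f(x):
    """Gradient of the learned expansion: sum_i c_i (x_i - x) K(x_i, x) / sigma^2."""
    k = np.exp(-((X - x) ** 2).sum(-1) / (2 * sigma ** 2))
    return (c * k) @ (X - x) / sigma ** 2

# Average squared gradient per coordinate flags the relevant variable.
G = np.array([grad_f(x) for x in X])
importance = (G ** 2).mean(0)
print("per-coordinate gradient energy:", np.round(importance, 3))
print("selected variable:", int(importance.argmax()))
```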



Guest Speaker (4)

Speaker:  Boaz Nadler, Yale University

Speaker's homepage:  



Time:  Wednesday, May 18th at 2:00pm
Location:  International House-University of Chicago
	

Title: TBA

Abstract: TBA

Guest Speaker (5)

Speaker:  Sayan Mukherjee, Duke University

Speaker's homepage:  http://www.stat.duke.edu/~sayan/



Time:  Wednesday, May 18th at 3:00pm
Location:  International House-University of Chicago
	

Title: Learning Patterns in Genomic Data: an Application of Statistical
Learning

Abstract: TBA


Guest Speaker (6)

Speaker:  Vladimir Temlyakov, University of South Carolina

Speaker's homepage:  http://www.math.sc.edu/people/temlyakov.html




Time:  Wednesday, May 18th at 4:30pm
Location:  International House-University of Chicago
	

Title: On Optimal Estimators in Learning Theory


Abstract: TBA

Guest Speaker (7)

Speaker:  Andrea Caponnetto, University of Genoa

Speaker's homepage:
http://www.pascal-network.org/Network/Researchers/150/



Time:  Thursday, May 19th at 3:00pm
Location:  International House-University of Chicago
	

Title: Optimal Rates for Regularized Least-Squares Algorithm in
Semi-Supervised Regression

Abstract: TBA


Guest Speaker (8)

Speaker:  Petra Phillips, Australian National University

Speaker's homepage:
http://www.anu.edu.au/RSISE/teleng/teleng2004/people/students/petra.php


Time:  Thursday, May 19th at 4:30pm
Location:  International House-University of Chicago
	

Title: Data-Dependent Local Complexities for ERM


Abstract: We present data-dependent generalization bounds for a specific
algorithm which is of central importance in learning theory, namely the
Empirical Risk Minimization algorithm (ERM).

New results in Bartlett and Mendelson (2005) show that one can
significantly improve the estimates for the convergence rates for
empirical minimizers by a direct analysis of the ERM algorithm in the
case when the loss class satisfies certain structural assumptions. These
results are based on a local notion of complexity of subsets of
hypothesis functions with identical expected errors. We investigate the
extent to which one can estimate these convergence rates in a
data-dependent manner. We provide an algorithm which computes a
data-dependent upper bound for the expected error of empirical
minimizers in terms of the complexity of data-dependent local subsets.
These subsets consist of functions whose empirical errors fall in a
given range, and they can be determined solely from empirical data. We
then show that the direct estimate in Bartlett and Mendelson (2005),
which is an essentially sharp estimate of the convergence rate for the
ERM algorithm, cannot be recovered universally from empirical data.
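For readers outside learning theory, a minimal toy instance of the ERM algorithm itself (all choices below are illustrative and say nothing about the local-complexity bounds in the talk): over a finite class of threshold classifiers on the line, ERM simply returns the hypothesis with the smallest empirical error.

```python
import numpy as np

rng = np.random.default_rng(2)

# Labels determined by a threshold at 0.25, with 10% label noise.
n = 200
x = rng.uniform(-1, 1, n)
y = np.where(x > 0.25, 1, -1)
flip = rng.random(n) < 0.1
y[flip] *= -1

# Finite hypothesis class: threshold classifiers h_t(x) = sign(x - t).
thresholds = np.linspace(-1, 1, 101)

def empirical_risk(t):
    """Fraction of training points misclassified by h_t."""
    pred = np.where(x > t, 1, -1)
    return np.mean(pred != y)

# ERM: pick the hypothesis minimizing the empirical risk.
risks = np.array([empirical_risk(t) for t in thresholds])
t_erm = thresholds[risks.argmin()]
print(f"ERM threshold: {t_erm:.2f}, empirical risk: {risks.min():.3f}")
```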



-----------------------------------------------------------------

If you have questions, or would like to meet the speaker, please contact
Katherine at 773-834-1994 or kcumming at tti-c.org.   For information on
future TTI-C talks and events, please go to the TTI-C Events page:
http://www.tti-c.org/events.html.  TTI-C (1427 East 60th Street,
Chicago, IL  60637)






