[Colloquium] Talks This Week at TTI-C: 5/23-5/27

Katherine Cumming kcumming at tti-c.org
Tue May 24 08:52:53 CDT 2005


(7) Learning Theory Program/ML Summer School 
Speaker:  Peter Culicover, Ohio State University
Speaker's Homepage: http://www.ling.ohio-state.edu/~culicove/
Time:  Wednesday, May 25th @ 2:00pm
Location:  International House-Assembly Hall
Title:  Adventures with Camile, a Computational Simulation of a
Minimalist Language Learner
Abstract:
TBA
 
(8) Learning Theory Program/ML Summer School 
Speaker: Garrett Mitchener, Duke University
Speaker's Homepage:  http://fds.duke.edu/db/aas/math/faculty/wgm
Time:  Wednesday, May 25th @ 3:00pm
Location:  International House-Assembly Hall
Title:  Game Dynamics with Learning and Evolution of Universal Grammar
Abstract:
TBA
 
(9) Learning Theory Program/ML Summer School 
Speaker: Ed Stabler, UCLA
Speaker's Homepage:
http://www.humnet.ucla.edu/humnet/linguistics/people/stabler/stabler.htm
Time:  Wednesday, May 25th @ 4:30pm
Location:  International House-Assembly Hall
Title: Feasible Language Learning
Abstract:
This talk will consider how some recent models of feasible learning
might apply to human language learning, with attention to how these
results complement traditional linguistic perspectives. 
 
(10) Learning Theory Program/ML Summer School 
Speaker: Cynthia Rudin, Courant Institute/NYU
Speaker's Homepage:  http://www.cns.nyu.edu/~rudin/main.html
Time:  Thursday, May 26th @ 2:00pm
Location:  International House-Assembly Hall
Title:  The Dynamics of AdaBoost
Abstract:
One of the most successful and popular learning algorithms is AdaBoost,
a classification algorithm designed to construct a "strong" classifier
from a "weak" learning algorithm. Just after the development of AdaBoost
nine years ago, scientists derived margin-based generalization bounds to
explain AdaBoost's unexpectedly good performance. Their result predicts
that AdaBoost yields the best possible performance if it always achieves
a "maximum margin" solution. Yet, does AdaBoost achieve a maximum margin
solution? Empirical and theoretical studies conducted since then
conjectured the answer to be "yes".
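The weak-to-strong construction described above can be sketched in a few
lines. The following is a minimal illustrative implementation with
one-dimensional decision stumps as the weak learners; the function names
and dataset are invented for the example and are not the speaker's code:

```python
import numpy as np

def adaboost_stumps(X, y, n_rounds=20):
    """Minimal AdaBoost sketch with axis-aligned decision stumps.
    X: (n, d) features; y: labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # example weights, initially uniform
    ensemble = []                    # list of (alpha, feature, threshold, sign)
    for _ in range(n_rounds):
        best = None
        # exhaustively pick the stump with lowest weighted error
        for j in range(d):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, t, s)
        err, j, t, s = best
        if err >= 0.5:               # no weak learner beats chance; stop
            break
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = s * np.where(X[:, j] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)   # upweight misclassified points
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def adaboost_predict(ensemble, X):
    # "strong" classifier: sign of the alpha-weighted vote of the stumps
    score = sum(a * s * np.where(X[:, j] <= t, 1, -1)
                for a, j, t, s in ensemble)
    return np.sign(score)
```

No single stump can classify a dataset like [-1, -1, +1, +1, -1] over the
points 0..4, but a weighted combination of three stumps can, which is the
"weak to strong" effect the bounds in question try to explain.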

In order to answer this question, we look toward AdaBoost's dynamics. We
simplify AdaBoost to reveal a nonlinear iterated map. We then analyze
the convergence of AdaBoost for cases where cyclic behavior is found;
this cyclic behavior provides the key to answering the question of
whether AdaBoost always maximizes the margin. As it turns out, the
answer is the opposite of what was believed to be true!

In this talk, I will introduce AdaBoost, describe our analysis of
AdaBoost when viewed as a dynamical system, briefly mention a new
boosting algorithm which always maximizes the margin with a fast
convergence rate, and if time permits, I will reveal a surprising new
result about AdaBoost and the problem of bipartite ranking.

Cynthia Rudin: NSF postdoctoral research fellow, BIO-division.
 
(11) Learning Theory Program/ML Summer School 
Speaker:  John Goldsmith, University of Chicago
Speaker's Homepage: http://humanities.uchicago.edu/faculty/goldsmith/
Time:  Thursday, May 26 @ 3:10pm
Location:  TTI-C, Conference Room
Title:  Unsupervised Learning of Natural Language Morphology
Abstract:
The words in virtually all natural languages have internal structure,
generally quite complex structure; they are composed of substrings of
letters (or phonemes) called morphemes. Linguists are generally in
agreement as to what the right morphological analysis of words is; we
are concerned with developing a learning system which takes a large
sample of words from an arbitrary human language as its input and which
produces an analysis of the morphological structure of these words as
its output. We discuss our work so far on this problem, and current work
in progress. We will look primarily at English and Swahili, and perhaps
more briefly at a number of other languages. See
http://linguistica.uchicago.edu for more details.
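As a toy illustration of the task described above (this is not
Goldsmith's Linguistica algorithm; the suffix list and the
stem-recurrence heuristic here are invented for the example):

```python
from collections import Counter

def toy_suffix_split(words, suffixes=("ing", "ed", "s")):
    """Toy stem+suffix analysis: propose splitting a word as stem+suffix
    only when the candidate stem recurs across the word sample.
    Real systems induce the suffix inventory from the data itself."""
    stems = Counter()
    for w in words:
        for suf in suffixes:
            if w.endswith(suf) and len(w) > len(suf):
                stems[w[: len(w) - len(suf)]] += 1
    analysis = {}
    for w in words:
        best = (w, "")  # default: the whole word is an unanalyzed stem
        for suf in suffixes:
            if w.endswith(suf) and len(w) > len(suf):
                stem = w[: len(w) - len(suf)]
                if stems[stem] >= 2:   # stem recurs with >= 2 suffixes
                    best = (stem, suf)
                    break
        analysis[w] = best
    return analysis
```

On a sample like ["jumping", "jumped", "jumps", "walking", "walked",
"cat"], the recurring stems "jump" and "walk" get segmented while "cat"
is left whole; the hard part of the real problem is doing this with no
pre-supplied suffix list, for an arbitrary language.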
 
(12) Learning Theory Program/ML Summer School
Speaker:  Tin Kam Ho, Bell Labs, Lucent Technologies
Speaker's Homepage: http://cm.bell-labs.com/cm/cs/who/tkh/index.html
Time:  Friday, May 27 @ 1:00pm
Location:  TTI-C, Conference Room
Title:  Interactive Pattern Discovery with Mirage
Abstract:
Mirage is a Java-based graphical tool for open-ended pattern discovery
that combines human and machine capabilities for correlating observed or
simulated data from multiple perspectives and at different depths of
analysis. Mirage was developed to address the practical needs in
studying the rich context beyond the core classification tasks in many
real-world learning problems. Through highly flexible visual displays
and intuitive exploratory operations, it enables domain experts to
exercise their judgement at various stages in pattern analysis, and
assists analysts in obtaining insights into the data geometry and making
critical methodological choices.

I will show the tool's applications in analyzing photonics simulations,
evaluating the performance of telecommunication systems, and in astronomy.
I will also discuss directions for future research on how to organize
the tool so that it lays a solid foundation for meeting diverse needs
and enabling continued growth.

For a demonstration of the Mirage software, see
<http://cm.bell-labs.com/who/tkh/mirage> 
-----------------------------------------------------------------------------
If you have questions, or would like to meet the speaker, please contact
Katherine at 773-834-1994 or kcumming at tti-c.org. For information on
future TTI-C talks or events, please go to the TTI-C Events
<http://ttic.uchicago.edu/events/events_dyn.php>  page. TTI-C:  1427
East 60th Street (University Press Building 2F), Chicago, IL  60637
 

