[Colloquium] TTI-C Talk: Raquel Urtasun, UC Berkeley

Julia MacGlashan macglashan at tti-c.org
Wed Apr 22 11:23:23 CDT 2009


REMINDER

When:             Thursday, April 23rd @ 11:00am (lunch will be provided after the talk)

Where:            6045 S Kenwood Ave, TTI-C Conference Room #526 (5th Floor)

Who:               Raquel Urtasun, UC Berkeley

Title:                Non-parametric models for the analysis of human behavior


Understanding human behavior is an important research endeavor whose success
can have a tremendous positive impact on our everyday lives, with applications
ranging from health, biology and psychology to robotics and the games
industry. The growth of the web, together with falling storage and processing
costs, has recently made available large amounts of data that are key to the
analysis of human behavior.

These data can have very different properties, and the underlying generative
process can be unknown or very difficult to model. Non-parametric models are
well suited to the analysis of human behavior because they make very few
assumptions and let the structure of the problem be determined from the data.
In this talk I will show how Gaussian Processes can be used to model different
aspects of human behavior, including people's motions, the world they live in,
and their preferences.
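
For context, the sketch below is a minimal, self-contained example (not taken
from the talk; the kernel choice, toy data and function names are my own) of
plain Gaussian-process regression with an RBF kernel, the basic building block
behind GP models of motion or preference data.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) covariance between two sets of inputs.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    # Posterior mean and variance of a zero-mean GP at the test inputs.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    alpha = np.linalg.solve(K, y_train)   # the O(N^3) step noted later in the abstract
    mean = K_s.T @ alpha
    v = np.linalg.solve(K, K_s)
    var = rbf_kernel(X_test, X_test) - K_s.T @ v
    return mean, np.diag(var)

# Toy 1-D example: noisy samples of a smooth "motion" trajectory.
X = np.linspace(0, 2 * np.pi, 20)[:, None]
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(20)
mean, var = gp_predict(X, y, np.linspace(0, 2 * np.pi, 100)[:, None])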

When dealing with high-dimensional data, it is desirable to reduce the
dimensionality while preserving the information in the original data
distribution, allowing for more efficient learning and inference. Linear
dimensionality reduction techniques and graph-based methods are popular
approaches, but they can result in poor approximations on complex datasets or
when the manifold assumption is violated, e.g., with sparse, noisy data.
Non-linear latent variable models can recover complex manifolds, but they
suffer from local minima because they rely on optimizing complex, non-convex
functions. Moreover, there is no principled way to choose the dimensionality
of the latent space.

The rank of a matrix is often an efficient way to describe the complexity or
dimensionality of a system. When learning low-dimensional representations, we
would like to encourage the rank of the latent space to be small. In this
talk, I will construct a relaxation of the rank minimization problem and build
a prior over the latent space that encourages sparsity of the singular values,
resulting in low-dimensional representations. In doing so, our method
simultaneously estimates the latent space and its dimensionality in a
continuous fashion.
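
The abstract does not name the specific relaxation; a standard convex
surrogate for rank is the nuclear norm, i.e., the l1 norm of the singular
values. The sketch below (my own illustration, not the speaker's method)
soft-thresholds the singular values of a latent coordinate matrix, so small
singular values are driven exactly to zero and the effective dimensionality is
selected continuously.

import numpy as np

def nuclear_norm(X):
    # Sum of singular values: a convex surrogate for rank(X).
    return np.linalg.svd(X, compute_uv=False).sum()

def shrink_singular_values(X, tau):
    # Proximal operator of tau * ||X||_*: soft-threshold the spectrum.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Toy latent matrix that is approximately rank 2 plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 10)) + 0.05 * rng.normal(size=(50, 10))
X_low = shrink_singular_values(X, tau=1.0)
print(np.linalg.matrix_rank(X_low, tol=1e-6))  # small singular values have been zeroed out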

Gaussian processes scale poorly with the size of the training data: their
computational complexity is O(N^3), where N is the number of examples. In
applications such as predicting user preferences, the amount of data can be
arbitrarily large, e.g., millions of examples in the Netflix dataset. In this
talk I will show how to exploit the inherent sparseness of the data to design
a stochastic gradient descent algorithm that can effectively learn non-linear
representations from very large databases.
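
As a rough illustration of why sparsity matters (the talk's model is
non-linear; this sketch uses the standard linear matrix-factorization
analogue, and all names and data here are invented), stochastic gradient
descent can visit only the observed user-item ratings, so each epoch costs
time proportional to the number of observations rather than to users times
items.

import random
import numpy as np

def sgd_matrix_factorization(ratings, n_users, n_items, rank=2,
                             lr=0.01, reg=0.05, epochs=50, seed=0):
    # ratings: list of (user, item, value) triples -- the sparse observations.
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.normal(size=(n_users, rank))
    V = 0.1 * rng.normal(size=(n_items, rank))
    ratings = list(ratings)
    for _ in range(epochs):
        random.shuffle(ratings)
        for u, i, r in ratings:
            u_old = U[u].copy()
            err = r - u_old @ V[i]
            U[u] += lr * (err * V[i] - reg * u_old)   # gradient step on the user factor
            V[i] += lr * (err * u_old - reg * V[i])   # gradient step on the item factor
    return U, V

# Toy example: 3 users, 4 items, a handful of observed ratings.
obs = [(0, 0, 5.0), (0, 2, 3.0), (1, 1, 4.0), (2, 3, 2.0), (2, 0, 4.5)]
U, V = sgd_matrix_factorization(obs, n_users=3, n_items=4)
print(U @ V.T)  # predicted ratings for every user-item pair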

Bio:
Raquel Urtasun is currently a Postdoctoral Research Scientist at UC Berkeley
EECS & ICSI, working with Prof. Trevor Darrell. Her main research areas are
computer vision, machine learning and computer graphics. From 2006 to 2008 she
was a postdoctoral associate at MIT-CSAIL. She earned her PhD at EPFL
(Switzerland) in 2006 under the supervision of Prof. Pascal Fua, with a thesis
on Motion Models for Robust 3D Human Body Tracking.

Contact:          Greg Shakhnarovich, TTI-C		greg at tti-c.org
834-2572





