[Colloquium] Reminder: today's talk by Adam Klivans

Margery Ishmael marge at cs.uchicago.edu
Mon Feb 2 08:27:23 CST 2004


---------------------------------------------------------------------------

DEPARTMENT OF COMPUTER SCIENCE - TALK

Date: Monday, February 2, 2004
Time: 2:30 p.m.
Place: Ryerson 251

----------------------------------------------------------------------------

Speaker:  ADAM KLIVANS, Harvard University

Url:  http://www.eecs.harvard.edu/~klivans/

Title:  Learning Halfspaces: New Algorithms and Applications

Abstract:

Learning a halfspace is one of the oldest and most fundamental tasks
in machine learning.  It is the essential underlying algorithm for many
systems used by practitioners in fields such as artificial intelligence,
statistics, and computational biology to classify data. In this talk we
will give several new approaches for learning halfspaces and describe
surprising algorithmic applications:

1) We show that learning halfspaces in {\it high dimensions}
yields fast algorithms for learning Boolean concept classes such as
DNF formulae, one of the major challenges in computational learning
theory. To this end, we show that any $s$-term DNF over $n$ variables
is equivalent to a halfspace in $O(2^{n^{1/3} \log s})$ dimensions. As an
immediate consequence we obtain the fastest known DNF learning
algorithm, which runs in time $2^{O(n^{1/3})}$.
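
As a toy illustration of this style of reduction (this is not the talk's
construction: the example DNF and the fixed degree-2 expansion below are
chosen only for illustration), a DNF becomes a single halfspace once each
input is mapped to its low-degree monomials:

```python
from itertools import combinations

def monomial_features(x, degree):
    """Map a 0/1 vector x to all monomials (products of coordinates)
    of degree at most `degree`; the feature space has
    sum_{d<=degree} C(n, d) dimensions."""
    n = len(x)
    feats = [1]  # the constant (degree-0) monomial
    for d in range(1, degree + 1):
        for idx in combinations(range(n), d):
            prod = 1
            for i in idx:
                prod *= x[i]
            feats.append(prod)
    return feats

# Toy DNF over {0,1}^3: f(x) = (x0 AND x1) OR x2.
# In the degree-2 monomial feature space, f is exactly the halfspace
#   sign(x0*x1 + x2 - 1/2),
# since every satisfied term contributes +1 to the sum.
for a in range(8):
    x = [(a >> i) & 1 for i in range(3)]
    f = int((x[0] and x[1]) or x[2])
    h = x[0] * x[1] + x[2] - 0.5
    assert (h > 0) == bool(f)

print(len(monomial_features([0, 1, 1], 2)))  # -> 7 dimensions for n=3, degree 2
```

The talk's result replaces the fixed degree 2 here with degree
$O(n^{1/3} \log s)$, which is where the stated dimension bound comes from.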

2) We give a novel, Fourier-based algorithm for learning a halfspace.
This algorithm has an important advantage over previous work on
learning halfspaces: it generalizes to learning {\it intersections} of
halfspaces.  Intersections of halfspaces are a powerful concept class;
any convex body can be expressed as an intersection of halfspaces, and
little is known about the complexity of learning the intersection of
even two halfspaces.  Our techniques yield the first polynomial-time
algorithm for learning the intersection of a constant number of
halfspaces under the uniform distribution to within any constant error
parameter.
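
In notation (the symbols $w_i$, $\theta_i$, and $k$ here are generic, not
taken from the talk), an intersection of $k$ halfspaces is the concept

    $f(x) = \bigwedge_{i=1}^{k} \mathbf{1}[\, w_i \cdot x \geq \theta_i \,],$

which accepts exactly the points satisfying all $k$ linear constraints;
letting $k$ grow without bound recovers arbitrary convex bodies.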

3) We show that learning intersections of halfspaces where a {\it
   margin} separates the positive data points from the negative data
points can be reduced to learning a single halfspace. As a result, we
give a perceptron-like algorithm for learning the intersection of a
constant number of halfspaces which runs in polynomial time as long as
the margin is at least $1/2^{\sqrt{\log n}}$.  The algorithm itself
is simple and admits an efficient implementation.
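
For reference, the classical perceptron that the above resembles can be
sketched as follows (a generic textbook version, not the talk's algorithm;
the data and variable names are ours). Its mistake bound of $(R/\gamma)^2$,
where $R$ bounds the norms and $\gamma$ is the margin, is what makes a large
margin valuable:

```python
def perceptron(points, labels, max_epochs=100):
    """Classic perceptron: on a misclassified (x, y) with y in {-1, +1},
    update w <- w + y*x.  On data separable with margin gamma and
    max norm R, it makes at most (R/gamma)^2 mistakes."""
    dim = len(points[0])
    w = [0.0] * dim
    for _ in range(max_epochs):
        mistakes = 0
        for x, y in zip(points, labels):
            score = sum(wi * xi for wi, xi in zip(w, x))
            if y * score <= 0:  # wrong side of the halfspace (or on it)
                w = [wi + y * xi for wi, xi in zip(w, x)]
                mistakes += 1
        if mistakes == 0:       # a perfect pass: separator found
            break
    return w

# Toy linearly separable data in the plane.
points = [(2.0, 1.0), (1.0, 3.0), (-2.0, -1.0), (-1.0, -2.0)]
labels = [1, 1, -1, -1]
w = perceptron(points, labels)
assert all((sum(wi * xi for wi, xi in zip(w, x)) > 0) == (y > 0)
           for x, y in zip(points, labels))
print(w)  # -> [2.0, 1.0]
```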

Host: Prof. Lance Fortnow

*Refreshments will follow the talk in Ryerson 255*

People in need of assistance should call 773-834-8977 in advance.



