<html><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space; ">DEPARTMENT OF COMPUTER SCIENCE<br><br>UNIVERSITY OF CHICAGO<br><br>Date: Friday, March 7, 2008<br>Time: 2:30 p.m.<br>Place: Ryerson 251, 1100 E. 58th Street<br><br>--------------------------------------------------<br><br>Speaker:<span class="Apple-tab-span" style="white-space: pre; ">	</span>Philippe Rigollet<br><br>From:<span class="Apple-tab-span" style="white-space: pre; ">	</span><span class="Apple-tab-span" style="white-space: pre; ">	</span>Georgia Institute of Technology<br><br>Web page:<span class="Apple-tab-span" style="white-space: pre; ">	</span><a href="http://www.math.gatech.edu/~rigollet/">http://www.math.gatech.edu/~rigollet/</a><br><br>Title: Stochastic Convex Optimization Using Mirror Averaging Algorithms<br><br>Abstract: Several statistical problems in which the goal is to minimize an unknown convex risk function can be formulated in the general framework of stochastic convex optimization. For example, the problem of model selection, and more generally of aggregation, can be treated with the machinery of stochastic optimization in several settings, including density estimation, regression, and convex classification. We describe a family of general algorithms, called "mirror averaging algorithms," that yield an estimator (or a classifier) attaining optimal rates of model selection in several interesting cases. The theoretical results are presented in the form of exact oracle inequalities similar to those employed in optimization theory. The practical performance of the algorithms is illustrated on several real and artificial examples and compared to standard estimators and classifiers.<br><br>----------------------------------------------<br><br>Host: Partha Niyogi</body></html>