[Theory] TODAY: 1/26 Talks at TTIC: Dravyansh Sharma, Carnegie Mellon

Mary Marre mmarre at ttic.edu
Fri Jan 26 10:10:04 CST 2024


*When:*        Friday, January 26, 2024 at 11:00 am CT


*Where:*       Talk will be given *live, in-person* at

                   TTIC, 6045 S. Kenwood Avenue

                   5th Floor, Room 530


*Virtually:*   via Panopto (livestream
<https://uchicago.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=36342299-8940-4240-815c-b0fd00120def>)

                         *limited access: see info below*


*Who:*         Dravyansh Sharma, Carnegie Mellon


------------------------------
*Title*: Data-driven Algorithm Design and Principled Hyperparameter Tuning
in Machine Learning

*Abstract*: For any new machine learning technique, a large body of
follow-up research is often needed to tune the technique to work well in
each of its numerous application areas, requiring significant scientific
and engineering effort. Moreover, this typically involves unprincipled
approaches to hyperparameter selection, with no guarantee of global
optimality. We develop the recently proposed paradigm of data-driven
algorithm design and show how to tune some core machine learning
algorithms with formal near-optimality guarantees in statistical and
online learning settings.

Given multiple problem instances of a learning problem from some problem
domain, we develop approaches to learn provably well-tuned parameters over
the domain and answer questions about the number of problem samples needed
to learn a well-tuned learning algorithm. More precisely, our approaches
apply to the following diverse scenarios: selecting graph hyperparameters
in semi-supervised learning, setting regularization coefficients in linear
regression, controlling the robustness vs. abstention trade-off in
non-Lipschitz networks, meta-learning common parameters for similar tasks,
and learning adaptively in changing environments. In addition to providing
techniques for tuning fundamental learning algorithms, we also develop
tools applicable to data-driven design more generally.
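To make the setup concrete, here is a minimal sketch (not from the talk;
all names and the grid-search strategy are illustrative assumptions) of the
statistical setting for one of the scenarios above: choosing a ridge
regularization coefficient that works well on average over problem
instances sampled from a common domain, rather than per-instance.

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge regression: w = (X^T X + lam * I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def average_validation_loss(lam, instances):
    # Mean squared validation error of the ridge solution for this lambda,
    # averaged over all problem instances drawn from the domain.
    losses = []
    for X_tr, y_tr, X_val, y_val in instances:
        w = ridge_fit(X_tr, y_tr, lam)
        losses.append(np.mean((X_val @ w - y_val) ** 2))
    return float(np.mean(losses))

def tune_lambda(instances, grid):
    # Pick the coefficient with the lowest average loss over the sampled
    # instances -- an ERM-style estimate of the domain-optimal lambda.
    return min(grid, key=lambda lam: average_validation_loss(lam, instances))

# Simulate a "problem domain": regression instances sharing one signal.
rng = np.random.default_rng(0)
w_true = rng.normal(size=5)
instances = []
for _ in range(20):
    X = rng.normal(size=(30, 5))
    y = X @ w_true + rng.normal(scale=0.5, size=30)
    instances.append((X[:20], y[:20], X[20:], y[20:]))  # train/val split

grid = [0.01, 0.1, 1.0, 10.0]
best = tune_lambda(instances, grid)
```

The sample-complexity questions in the abstract ask, in this framing, how
many instances are needed before the empirically best parameter is provably
near-optimal for the whole domain.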

*Speaker bio*: Dravyansh (Dravy) Sharma is a final-year doctoral student
at Carnegie Mellon University, advised by Nina Balcan. His research
interests include machine learning theory and algorithms, focusing on
provable hyperparameter tuning, adversarial robustness, and beyond
worst-case analysis of algorithms. His recent work develops techniques for
tuning fundamental machine learning algorithms to domain-specific data and
introduces new, powerful robust learning guarantees. He has published
several papers at top venues in the field of machine learning, including
NeurIPS, ICML, COLT, and JMLR, with multiple papers selected for oral
presentation, and has interned with Google Research and Microsoft Research.

*Host:* Avrim Blum <Avrim at ttic.edu>

*Access to this livestream is limited to TTIC / UChicago (click the
Panopto link and sign in to your UChicago account with your CNetID).*
Mary C. Marre
Faculty Administrative Support
*Toyota Technological Institute*
*6045 S. Kenwood Avenue, Rm 517*
*Chicago, IL  60637*
*773-834-1757*
*mmarre at ttic.edu <mmarre at ttic.edu>*

