[Colloquium] [masters-presentation] Willson/MS Presentation/Jun 22, 2020

Margaret Jaffey margaret at cs.uchicago.edu
Mon Jun 8 11:36:00 CDT 2020


This is an announcement of Emily Willson's MS Presentation.

Here is the Zoom link to participate:
https://us02web.zoom.us/j/89756025184?pwd=aUhVaXFSQ2RFOVVxOHBvelAzUG15QT09
The meeting ID is 897 5602 5184, and the password is 8CudHr.

------------------------------------------------------------------------------
Date:  Monday, June 22, 2020

Time:  10:30 AM

Place:  remote via Zoom

M.S. Candidate:  Emily Willson

M.S. Paper Title: Backdoor Attacks on Facial Recognition in the
Physical World

Abstract:
New threats have emerged as neural networks are deployed in
increasingly security-critical settings. So-called “backdoor attacks”
are one of the most ominous. Backdoors are sleeper cells trained into
neural networks that cause the model to misclassify inputs containing
an attacker-chosen trigger. Prior work on backdoor attacks has created
triggers using photo-editing software (“artificial” triggers).
However, artificial triggers cannot be used to fool a neural network
in real time, since an attacker must manually edit each image to
insert the trigger that activates the backdoor. Despite this
limitation, prior work has heralded backdoor attacks as dangerous.

In this paper, we present the first in-depth study of the real world
threat of backdoor attacks that use “physical” triggers rather than
artificial ones. Physical triggers are real life objects that can be
used to control backdoor behaviors. We show that physical triggers of
various shapes and sizes perform successfully as backdoor triggers,
even under adverse conditions such as poor lighting and degraded image
quality. No prior work has seriously considered the possibility of
physical triggers in backdoor attacks or analyzed backdoor robustness
under such conditions.
Additionally, we find that physical backdoor attacks remain highly
successful even when attackers use stealthier methods, such as
poisoning only a small part of the dataset or poisoning the dataset
with artificial versions of physical triggers. Finally, we find that
existing backdoor defenses fail to detect physical triggers. All in
all, our work shows that physical backdoors are indeed real world
threats and highlights the need for further work to mitigate the
danger they pose.
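For readers unfamiliar with the poisoning step that backdoor attacks
rely on, here is a minimal, generic sketch (not the paper's actual
pipeline): a fraction of the training images is stamped with a
trigger patch and relabeled to an attacker-chosen target class. The
function name, the patch shape, and the NumPy array representation
are all illustrative assumptions.

```python
import numpy as np

def poison_dataset(images, labels, target_label, rate=0.1, seed=0):
    """Illustrative backdoor poisoning (not the paper's method):
    stamp a small white square (the 'trigger') onto a random
    fraction of training images and relabel them to the
    attacker-chosen target class."""
    rng = np.random.default_rng(seed)
    images = images.copy()
    labels = labels.copy()
    n = len(images)
    idx = rng.choice(n, size=int(rate * n), replace=False)
    for i in idx:
        images[i, -4:, -4:] = 1.0  # 4x4 trigger patch in the corner
        labels[i] = target_label
    return images, labels, idx

# Toy example: 100 blank grayscale 16x16 "images", all labeled 0.
imgs = np.zeros((100, 16, 16))
lbls = np.zeros(100, dtype=int)
p_imgs, p_lbls, idx = poison_dataset(imgs, lbls, target_label=7)
```

A model trained on such a poisoned set behaves normally on clean
inputs but misclassifies any input carrying the trigger; a physical
trigger replaces the digital patch with a real-world object.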

Emily's advisors are Prof. Ben Zhao and Prof. Heather Zheng.

Login to the Computer Science Department website for details:
 https://newtraell.cs.uchicago.edu/phd/ms_announcements#ewillson

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Margaret P. Jaffey            margaret at cs.uchicago.edu
Department of Computer Science
Student Support Rep (JCL 350)              (773) 702-6011
The University of Chicago      http://www.cs.uchicago.edu
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
