[Colloquium] Pankratov/Dissertation Defense/Apr 27, 2015

Margaret Jaffey margaret at cs.uchicago.edu
Mon Apr 13 10:25:19 CDT 2015



       Department of Computer Science/The University of Chicago

                     *** Dissertation Defense ***


Candidate:  Denis Pankratov

Date:  Monday, April 27, 2015

Time:  10:00 AM

Place:  Ryerson 277

Title: Communication Complexity and Information Complexity

Abstract:
Shannon introduced information theory in 1948. In Shannon's model, the
central question is to minimize the number of bits required to
transmit a message from one party to another over a (possibly noisy)
communication channel. Yao introduced communication complexity in
1979. In Yao's model, the central question is to minimize the amount
of communication required to compute a function of an input
distributed between two parties that communicate over a perfect
channel. Although communication complexity and information theory
both aim to quantify communication, each in its own setting,
communication complexity developed without the influence of
information theory. This changed recently with the introduction of
information complexity. The current definition of
\emph{internal information complexity} is due to Bar-Yossef et al.
(2004), but similar definitions in restricted communication settings
can be traced back to Chakrabarti et al. (2001) and Ablayev (1996).
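
As a point of reference, the standard formulation (our notation, not
quoted from the thesis) measures what the transcript of a protocol
reveals to each party about the other party's input: for a protocol
$\pi$ with transcript $\Pi$, run on inputs $(X, Y)$ drawn from a
distribution $\mu$, the internal information cost is

  $\mathrm{IC}_\mu(\pi) = I(\Pi; X \mid Y) + I(\Pi; Y \mid X)$,

and the internal information complexity of a function $f$ with error
$\epsilon$ is the infimum of this cost over all protocols computing
$f$ with error at most $\epsilon$.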

Information complexity enabled the use of information-theoretic tools
in communication complexity theory. Prior to the results presented in
this thesis, information complexity was mainly used for proving lower
bounds and direct-sum theorems in the setting of communication
complexity. We present three results that demonstrate new connections
between information complexity and communication complexity.

In the first contribution we thoroughly study the information
complexity of the smallest nontrivial two-party function: the
$\mathrm{AND}$ function. While computing the communication complexity
of $\mathrm{AND}$ is trivial, computing its exact information
complexity presents a major technical challenge. In overcoming this
challenge, we reveal that information complexity gives rise to rich
geometrical structures. Our analysis of information complexity relies
on new analytic techniques and new characterizations of communication
protocols. We also uncover a connection of information complexity to
the theory of elliptic partial differential equations. Once we compute
the exact information complexity of $\mathrm{AND}$, we can compute the
\emph{exact communication complexity} of several related functions on
$n$-bit inputs with some additional technical work. For instance, we
show that the communication complexity of the disjointness function on
$n$-bit inputs is $C_{\mathrm{DISJ}} n + o(n)$, where
$C_{\mathrm{DISJ}}\approx 0.4827$. This level of precision, i.e.,
revealing the actual coefficient in front of $n$, is unprecedented in
communication complexity. Previous combinatorial and algebraic
techniques could only prove bounds of the form $\Theta(n)$.
Interestingly, this level of precision is typical in the area of
information theory, so our result demonstrates that this meta-property
of precise bounds carries over to information complexity and in
certain cases even to communication complexity. Our result not only
strengthens the lower bound on the communication complexity of
disjointness by making it precise, but also shows that
information complexity provides the exact upper bound on communication
complexity. In fact, this result is more general and applies to a
whole class of communication problems.
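
Schematically, and with the maximization domain stated as we read the
result (the notation is ours), the disjointness bound takes the form

  $R(\mathrm{DISJ}_n) = C_{\mathrm{DISJ}}\, n \pm o(n)$, where
  $C_{\mathrm{DISJ}} = \max_{\mu:\, \mu(1,1)=0} \mathrm{IC}_\mu(\mathrm{AND}, 0) \approx 0.4827$;

that is, the optimal coefficient in front of $n$ is the worst-case
information complexity of a single $\mathrm{AND}$ coordinate, taken
over input distributions that never place both inputs at $1$.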

In the second contribution, we use self-reduction methods to prove
strong lower bounds on the information complexity of two of the most
studied functions in the communication complexity literature: Gap
Hamming Distance ($\mathrm{GHD}$) and Inner Product mod 2
($\mathrm{IP}$). In our first result we affirm the conjecture that the
information complexity of $\mathrm{GHD}$ is linear even under the
\emph{uniform distribution}. This strengthens the $\Omega(n)$ bound
shown by Kerenidis et al. (2012) and answers an open problem by
Chakrabarti et al. (2012). We also prove that the information
complexity of $\mathrm{IP}$ is arbitrarily close to the trivial upper
bound $n$ as the permitted error tends to zero, again strengthening
the $\Omega(n)$ lower bound proved by Braverman and Weinstein (2011).
More importantly, our proofs demonstrate that self-reducibility makes
the connection between information complexity and communication
complexity lower bounds a two-way connection. Whereas numerous results
in the past used information complexity techniques to derive new
communication complexity lower bounds, we explore a generic way in
which communication complexity lower bounds imply information
complexity lower bounds \emph{in a black-box manner}.
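
One way to state the two bounds formally (the quantifiers here are
our reading of the abstract, not a quotation): under the uniform
input distribution $U$, $\mathrm{IC}_U(\mathrm{GHD}_n, \epsilon) = \Omega(n)$
for some constant error $\epsilon > 0$; and for every $\delta > 0$
there is an $\epsilon > 0$ such that
$\mathrm{IC}(\mathrm{IP}_n, \epsilon) \ge (1 - \delta)\, n$.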

In the third contribution we consider the roles that private and
public randomness play in the definition of information complexity. In
communication complexity, private randomness can be trivially
simulated by public randomness. Moreover, the communication cost of
simulating public randomness with private randomness is well
understood due to Newman's theorem (1991). In information complexity,
the roles of public and private randomness are reversed: public
randomness can be trivially simulated by private randomness. However,
the information cost of simulating private randomness with public
randomness is not understood. We show that protocols that use only
public randomness admit a rather strong form of compression. In particular,
efficient simulation of private randomness by public randomness would
imply a version of a direct sum theorem in the setting of
communication complexity. This establishes yet another connection
between the two areas.
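
For comparison, Newman's theorem in its usual form quantifies the
reverse simulation on the communication side: a public-coin protocol
on $n$-bit inputs with communication $c$ and error $\epsilon$ can be
converted into a private-coin protocol with communication
$c + O(\log n + \log(1/\delta))$ and error $\epsilon + \delta$.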

The first and second contributions are the result of collaboration
with Braverman, Garg, and Weinstein. The third contribution is my work
alone.

Denis's advisor is Prof. Laszlo Babai.

Log in to the Computer Science Department website for details,
including a draft copy of the dissertation:

 https://www.cs.uchicago.edu/phd/phd_announcements#pankratov

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Margaret P. Jaffey            margaret at cs.uchicago.edu
Department of Computer Science
Student Support Rep (Ry 156)               (773) 702-6011
The University of Chicago      http://www.cs.uchicago.edu
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

