Title: Computational Entropy and Information Leakage
Authors: Benjamin Fuller and Leonid Reyzin
Date: January 7, 2011
Abstract:
We investigate how information leakage reduces computational entropy
of a random variable X. Recall that HILL and metric computational
entropy are parameterized by quality (how distinguishable is X from a
variable Z that has true entropy) and quantity (how much true entropy
is there in Z). We prove an intuitively natural result: conditioning
on an event of probability p reduces the quality of metric entropy by
a factor of p and the quantity of metric entropy by log 1/p (note
that this means the reduction in quantity and quality is the same,
because the quantity of entropy is measured on a logarithmic
scale). Our
result improves previous bounds of Dziembowski and Pietrzak (FOCS
2008), where the loss in the quantity of entropy was related to its
original quality. The use of metric entropy tightens the result of
Reingold et al. (FOCS 2008) and makes it easy to measure entropy
even after conditioning on several events. Further, we simplify
dealing with information leakage by investigating conditional metric
entropy. We show that, conditioned on leakage of L bits, metric
entropy gets reduced by a factor of 2^L in quality and by L in
quantity. Our formulation allows us to state a "chain rule" for
leakage on computational entropy. We show that conditioning on L bits
of leakage
reduces conditional metric entropy by L bits. This is the same loss as
leaking from unconditional metric entropy. This result makes it easy
to measure entropy even after several rounds of information leakage.
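The abstract's quantitative claims can be summarized symbolically. This is a hedged sketch only: the notation H^Metric_{eps,s}, the entropy threshold k, and the (slightly reduced) circuit size s' are assumptions based on standard metric-entropy definitions, not quoted from the paper's formal statements.

```latex
% Sketch of the two quantitative claims (notation is assumed, not quoted).
% Conditioning on an event E of probability p: quality degrades by a
% factor of p (distinguishing advantage eps grows to eps/p), quantity
% drops by log(1/p):
\[
  H^{\mathrm{Metric}}_{\varepsilon,\,s}(X) \ge k
  \;\Longrightarrow\;
  H^{\mathrm{Metric}}_{\varepsilon/p,\,s'}(X \mid E) \ge k - \log\tfrac{1}{p},
  \qquad \Pr[E] = p.
\]
% Leakage of L bits Y: quality degrades by a factor of 2^L, quantity by L:
\[
  H^{\mathrm{Metric}}_{\varepsilon,\,s}(X) \ge k
  \;\Longrightarrow\;
  H^{\mathrm{Metric}}_{\varepsilon \cdot 2^{L},\,s'}(X \mid Y) \ge k - L,
  \qquad Y \in \{0,1\}^{L}.
\]
```

Note how the two statements match: since quantity is measured on a logarithmic scale, a factor-of-2^L quality loss and an L-bit quantity loss are "the same" loss, which is what makes the chain rule composable across several rounds of leakage.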