8.2 Calibration to the Global Sensitivity


Let $\mathcal{D}$ be the domain of possible data sets. Let $f: \mathcal{D} \to \mathbb{R}^k$ be a query function that maps data sets to vectors of real numbers. Here we seek to come up with a differentially private mechanism $\hat{f}$ of the form $\hat{f}(X) = f(X) + \mathit{Noise}$, where the distribution of the random noise is independent of the actual data set $X$.

The amount of noise required depends on the variability of the function $f$ between neighbor data sets (data sets that differ in one record). The greater the $l_1$-sensitivity, the greater the amount of noise that will be required to mask the effect of any single individual record in the response of the query.

Definition 8.4 ($l_1$-sensitivity) The $l_1$-sensitivity of a function $f: \mathcal{D} \to \mathbb{R}^k$ is
$$\Delta f = \max_{\substack{x, y \in \mathcal{D} \\ d(x, y) = 1}} \|f(x) - f(y)\|_1.$$
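As an illustration of Definition 8.4, the sketch below (with hypothetical queries of our own, not taken from the text) checks that two common queries never vary by more than their claimed $l_1$-sensitivity between neighbor data sets:

```python
# Hypothetical example queries with known l1-sensitivity.
# A counting query changes by at most 1 when one record changes,
# so Delta f = 1. A bounded-sum query over values clipped to
# [lo, hi] changes by at most (hi - lo), so Delta f = hi - lo.

def count_query(data, threshold):
    """Number of records strictly above the threshold."""
    return sum(1 for v in data if v > threshold)

def bounded_sum_query(data, lo, hi):
    """Sum of records after clipping each value to [lo, hi]."""
    return sum(min(max(v, lo), hi) for v in data)

# Neighbor data sets: they differ in exactly one record.
x = [3, 7, 12, 5]
y = [3, 7, 12, 50]  # last record replaced

# |f(x) - f(y)| never exceeds the global sensitivity.
assert abs(count_query(x, 10) - count_query(y, 10)) <= 1
assert abs(bounded_sum_query(x, 0, 20) - bounded_sum_query(y, 0, 20)) <= 20
```

The clipping in the bounded-sum query is what makes its sensitivity finite; without a bound on the record values, $\Delta f$ would be unbounded.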

The Laplace mechanism

Random noise with Laplace distribution is commonly used to attain $\epsilon$-differential privacy. The density of the Laplace distribution with mean $\mu$ and scale $b$, $\mathrm{Lap}(\mu, b)$, is
$$\mathrm{Lap}_{\mu,b}(x) = \frac{1}{2b} \exp\left(-\frac{|x - \mu|}{b}\right).$$


Theorem 8.5 (The Laplace mechanism) Let $f: \mathcal{D} \to \mathbb{R}^k$ be a function mapping data sets to vectors of real numbers. The Laplace mechanism
$$M_L(x, f, \epsilon) = f(x) + (N_1, \ldots, N_k),$$
where $N_i \sim \mathrm{Lap}(0, \Delta f / \epsilon)$ are independent random variables, gives $\epsilon$-differential privacy.
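A minimal sketch of Theorem 8.5 in Python (function and parameter names are our own; a production implementation would also need to guard against floating-point side channels, which this sketch ignores):

```python
import random

def laplace_noise(scale, rng):
    # The difference of two i.i.d. Exponential(1/scale) variables
    # follows the Laplace(0, scale) distribution.
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """M_L(x, f, eps) = f(x) + (N_1, ..., N_k), N_i ~ Lap(0, Delta f / eps)."""
    rng = rng or random.Random()
    scale = sensitivity / epsilon  # Delta f / epsilon, as in Theorem 8.5
    return [a + laplace_noise(scale, rng) for a in true_answer]

# Example: a two-component count query with sensitivity 1 and epsilon 0.5.
noisy = laplace_mechanism([120.0, 45.0], sensitivity=1.0, epsilon=0.5,
                          rng=random.Random(42))
```

Note that smaller $\epsilon$ (stronger privacy) or larger $\Delta f$ both increase the noise scale, and hence the distortion of the response.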

The optimal a.c. mechanism

The Laplace mechanism is the most common choice to obtain $\epsilon$-differential privacy for a given query $f: \mathcal{D} \to \mathbb{R}^k$. However, Laplace noise is not optimal, in the sense that other noise distributions can yield $\epsilon$-differential privacy while having their probability mass more concentrated around zero.

Deciding which one among a pair of random noise distributions, $N_1$ and $N_2$, yields greater utility is a question that may depend on the users' preferences. The goal here is to come up with an optimality notion that is independent of the users' preferences. If $N_1$ can be constructed from $N_2$ by moving some of the probability mass toward zero (but without going beyond zero), then $N_1$ must always be preferred to $N_2$. The reason is that the probability mass of $N_1$ is more concentrated around zero, and thus the distortion introduced by $N_1$ is smaller. A rational user always prefers less distortion and, therefore, prefers $N_1$ to $N_2$.

For a random noise distribution in $\mathbb{R}$, we use the notation $\langle 0, \alpha \rangle$, where $\alpha \in \mathbb{R}$, to denote the interval $[0, \alpha]$ when $\alpha \ge 0$, and the interval $[\alpha, 0]$ when $\alpha \le 0$. If $N_1$ can be constructed from $N_2$ by moving some of the probability mass toward zero, it must be $\Pr(N_1 \in \langle 0, \alpha \rangle) \ge \Pr(N_2 \in \langle 0, \alpha \rangle)$ for any $\alpha \in \mathbb{R}$. Thus we define:

Definition 8.6 Let $N_1$ and $N_2$ be two random distributions on $\mathbb{R}$. We say that $N_1$ is smaller than $N_2$, denoted by $N_1 \le N_2$, if $\Pr(N_1 \in \langle 0, \alpha \rangle) \ge \Pr(N_2 \in \langle 0, \alpha \rangle)$ for any $\alpha \in \mathbb{R}$. We say that $N_1$ is strictly smaller than $N_2$, denoted by $N_1 < N_2$, if some of the previous inequalities are strict.
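This ordering can be checked numerically. For Laplace noise, the mass in $\langle 0, \alpha \rangle$ has a closed form, and a Laplace noise with smaller scale is smaller in the sense of the definition (a sketch with scales chosen purely for illustration):

```python
import math

def lap_mass_near_zero(alpha, b):
    # Pr(Lap(0, b) in <0, alpha>) = (1 - exp(-|alpha|/b)) / 2,
    # by symmetry of the Laplace density around 0.
    return (1.0 - math.exp(-abs(alpha) / b)) / 2.0

# Lap(0, 1) concentrates more mass around zero than Lap(0, 2),
# so Lap(0, 1) <= Lap(0, 2) under Definition 8.6.
for i in range(-100, 101):
    alpha = i * 0.1
    assert lap_mass_near_zero(alpha, 1.0) >= lap_mass_near_zero(alpha, 2.0)
```

The inequality is strict for every $\alpha \ne 0$, so $\mathrm{Lap}(0,1)$ is in fact strictly smaller than $\mathrm{Lap}(0,2)$.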

The previous definition deals only with univariate noise distributions. The concept of optimal multivariate noise can also be defined. However, dealing with multiple dimensions makes things more complex. Here we restrict the discussion to univariate random noises. See [92] for more details on the multivariate case.

We use the previous order relationship to define the concept of optimal random noise. A noise is optimal within a class if there is no other noise in the class that is strictly smaller.

Definition 8.7 A random noise distribution $N_1$ is optimal within a class $C$ of random noise distributions if $N_1$ is minimal within $C$; in other words, there is no other random noise $N_2 \in C$ such that $N_2 < N_1$.

The concept of optimality is relative to a specific class $C$ of random noise distributions.

The goal is to determine the optimal noise for a query function $f$ that takes values in $\mathbb{R}$. To this end, we first need to determine the class of random noises $C$ that provide differential privacy for $f$. Indeed, $C$ can be directly defined as the class of noise distributions that satisfy the requirements of differential privacy. However, such a definition is not very useful in the construction of the optimal noise. Here we seek to characterize the noise distributions that give $\epsilon$-differential privacy in terms of the density function.

Proposition 8.8 Let $N$ be an a.c. random noise with values in $\mathbb{R}$. Let $f_N$ be the density function of $N$. For a query function $f: \mathcal{D} \to \mathbb{R}$, the mechanism $f + N$ gives $\epsilon$-differential privacy if
$$f_N(x) \le \exp(\epsilon) f_N(x + \Delta f) \qquad (8.1)$$
for all $x \in \mathbb{R}$ that are continuity points of $f_N$ and such that $x + \Delta f$ is also a continuity point.
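For instance, the Laplace density with scale $\Delta f / \epsilon$ satisfies Inequality (8.1), with equality for $x \ge 0$. The numeric sketch below (parameter values chosen for illustration) checks the inequality on a grid of points:

```python
import math

def laplace_density(x, b):
    return math.exp(-abs(x) / b) / (2.0 * b)

epsilon, delta_f = 0.8, 2.0
b = delta_f / epsilon  # Laplace scale from Theorem 8.5

# Check f_N(x) <= exp(epsilon) * f_N(x + Delta f) on a grid.
for i in range(-1000, 1001):
    x = i * 0.01
    lhs = laplace_density(x, b)
    rhs = math.exp(epsilon) * laplace_density(x + delta_f, b)
    assert lhs <= rhs + 1e-12  # small tolerance for floating point
```

The ratio $f_N(x)/f_N(x + \Delta f)$ equals $\exp((|x + \Delta f| - |x|)/b) \le \exp(\Delta f / b) = \exp(\epsilon)$, which is why the check passes everywhere.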

Now we show that the Laplace distribution is not optimal. The basic idea is to concentrate the probability mass around 0 as much as possible. This can only be done to a certain extent, because Inequality (8.1) limits our capability to do so.

In the construction of the distribution we will split the domain of $f_N$ into intervals of the form $[i \Delta f, (i+1) \Delta f]$ where $i \in \mathbb{Z}$. For each interval we will redistribute the probability mass that $f_N$ assigns to that interval. The new density function will take only two values (see Figure 8.1): $\max_{x \in [i \Delta f, (i+1) \Delta f]} f_N(x)$ at the portion of the interval closer to zero and $\min_{x \in [i \Delta f, (i+1) \Delta f]} f_N(x)$ at the portion of the interval farther from zero. The result is an absolutely continuous distribution where the probability mass has clearly been moved toward zero. It can be checked that this distribution satisfies Inequality (8.1).
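The sketch below (our own illustration, for $\mathrm{Lap}(0,1)$ noise and $\Delta f = 1$) computes, on each interval $[i \Delta f, (i+1) \Delta f]$ of the positive half-line, the two values of the new density and the split point that preserves the interval's probability mass:

```python
import math

b, delta = 1.0, 1.0  # Laplace(0, 1) noise and interval width Delta f = 1

def lap(x):
    return math.exp(-abs(x) / b) / (2.0 * b)

def staircase_piece(i):
    """Two-valued replacement of lap on [i*delta, (i+1)*delta], i >= 0.

    hi = max of lap on the interval (placed near zero), lo = min
    (placed far from zero); t is the width of the hi part, chosen so
    that the probability mass of the interval is preserved.
    """
    hi, lo = lap(i * delta), lap((i + 1) * delta)
    # Mass of Lap(0, b) in [i*delta, (i+1)*delta] on the positive axis.
    mass = 0.5 * (math.exp(-i * delta / b) - math.exp(-(i + 1) * delta / b))
    t = (mass - lo * delta) / (hi - lo)  # solves hi*t + lo*(delta - t) = mass
    return hi, lo, t

for i in range(5):
    hi, lo, t = staircase_piece(i)
    mass = 0.5 * (math.exp(-i * delta / b) - math.exp(-(i + 1) * delta / b))
    assert 0.0 <= t <= delta                              # split stays inside
    assert abs(hi * t + lo * (delta - t) - mass) < 1e-12  # mass preserved
```

Because each interval keeps its total mass but shifts it toward its near-zero end, the resulting distribution is smaller than $\mathrm{Lap}(0,1)$ in the sense of Definition 8.6.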

The process used to show that the Laplace distribution is not optimal for $\epsilon$-differential privacy can be generalized to show that the same construction is possible independently of the initial noise distribution $N$.

Theorem 8.9 Let $N$ be an a.c. random noise with zero mean that provides $\epsilon$-differential privacy to a query function $f$. Then there exists a random noise $N'$ with density of the form
$$f_{N'}(x) = \begin{cases} M \exp(-\epsilon i) & x \in [-d - (i+1) \Delta f, \; -d - i \Delta f], \; i \in \mathbb{N} \\ M & x \in [-d, 0] \\ M & x \in [0, d] \\ M \exp(-\epsilon i) & x \in [d + i \Delta f, \; d + (i+1) \Delta f], \; i \in \mathbb{N} \end{cases}$$
that provides $\epsilon$-differential privacy to $f$ and satisfies $N' \le N$.

Now it only remains to show that the distributions constructed in Theorem 8.9 are indeed optimal. This is done by checking that, for such distributions, it is not possible to move the probability mass toward 0 any more. That is, if we try to move more probability mass toward 0, $\epsilon$-differential privacy stops being satisfied.

Theorem 8.10 Let $N$ be a random noise distribution with density function $f_N$ of the form specified in Theorem 8.9. Then $N$ is optimal at providing $\epsilon$-differential privacy.


Figure 8.1: Construction of an optimal distribution based on the Laplace(0,1) distribution.

The discrete Laplace mechanism

The previous mechanisms (based on the addition of a random noise with values in $\mathbb{R}$) are capable of providing differential privacy to query functions with values in $\mathbb{Z}$. However, for such query functions, the use of a noise distribution with support over $\mathbb{Z}$ is a better option. The discrete version of the Laplace distribution is defined as:

Definition 8.11 (Discrete Laplace distribution [46]) A random variable $N$ follows the discrete Laplace distribution with parameter $\alpha \in (0, 1)$, denoted by $DL(\alpha)$, if for all $i \in \mathbb{Z}$
$$\Pr(N = i) = \frac{1 - \alpha}{1 + \alpha} \, \alpha^{|i|}.$$

Like the Laplace distribution, the discrete Laplace distribution can be used to attain $\epsilon$-differential privacy. For this purpose, the parameter $\alpha$ must be adjusted to the desired level of differential privacy and to the global sensitivity of the query.

Theorem 8.12 (The discrete Laplace mechanism) Let $f: \mathcal{D} \to \mathbb{Z}^k$ be a function mapping data sets to vectors of integers. The discrete Laplace mechanism
$$M_{DL}(x, f, \epsilon) = f(x) + (N_1, \ldots, N_k),$$
where $N_i \sim DL(\exp(-\epsilon / \Delta f))$ are independent random variables, gives $\epsilon$-differential privacy.
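A sketch of Theorem 8.12 (function names are our own). A $DL(\alpha)$ variable can be sampled as the difference of two i.i.d. geometric variables with success probability $1 - \alpha$:

```python
import math
import random

def sample_discrete_laplace(alpha, rng):
    """Sample DL(alpha) as G1 - G2, where G1, G2 are i.i.d. geometric
    on {0, 1, ...} with success probability 1 - alpha; the difference
    has pmf ((1 - alpha) / (1 + alpha)) * alpha**|i|."""
    def geometric():
        k = 0  # count failures before the first success
        while rng.random() < alpha:
            k += 1
        return k
    return geometric() - geometric()

def discrete_laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """M_DL(x, f, eps) = f(x) + (N_1, ..., N_k), N_i ~ DL(exp(-eps/Df))."""
    rng = rng or random.Random()
    alpha = math.exp(-epsilon / sensitivity)
    return [a + sample_discrete_laplace(alpha, rng) for a in true_answer]

# Example: an integer count query with sensitivity 1 and epsilon 1.
noisy = discrete_laplace_mechanism([42, 17], sensitivity=1, epsilon=1.0,
                                   rng=random.Random(7))
```

Unlike the continuous Laplace mechanism, the noisy responses here remain integers, which is the practical advantage for integer-valued queries noted above.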
