Privacy Constrained Information Processing


Abstract

This paper studies communication scenarios where the transmitter and the receiver have different objectives due to privacy concerns, in the context of a variation of the strategic information transfer (SIT) model of Sobel and Crawford. We first formulate the problem as the minimization of a common distortion by the transmitter and the receiver, subject to a privacy constraint at the transmitter. We show the equivalence of this formulation to a Stackelberg equilibrium of the SIT problem. Assuming an entropy based privacy measure, a quadratic distortion measure, and jointly Gaussian variables, we characterize the Stackelberg equilibrium. Next, we consider asymptotically optimal compression at the transmitter, which inherently provides some level of privacy, and study the equilibrium conditions. We finally analyze the impact of the presence of an average power constrained Gaussian communication channel between the transmitter and the receiver on the equilibrium conditions.


1 Introduction

This paper studies communication scenarios where the transmitter and the receiver have different objectives due to privacy concerns of the transmitter. Consider, for example, the communication between a transmitter and a receiver, where the common objective of both agents is to minimize some objective function. However, the transmitter has an additional objective: convey as little (accurate) information as possible about some privacy-related information (correlated with the transmitted message), since the reconstruction at the receiver is reported into databases visible to other parties (government agencies, police, etc.). Obviously, the receiver is oblivious to this objective, i.e., privacy is not a common goal. Then, what kind of transmitter and receiver mappings (encoders and decoders) yield equilibrium conditions? How do compression at the transmitter or the presence of a noisy channel impact such equilibria?

Such problems, where a better-informed transmitter communicates with a receiver who makes the ultimate decision concerning both agents, have been considered in the economics literature under the name of "cheap talk" or strategic information transfer (SIT), see e.g., [1, 2] and the references therein. The SIT problem [1] involves settings where the private information, available only to the transmitter, affects the transmitter's utility function. The receiver's utility does not depend on this private information and is thus different from that of the transmitter. The objective of the agents, the transmitter and the receiver, is to maximize their respective utility functions. One of the main results of [1] is that all Nash equilibrium points can be achieved by a quantizer as a transmitter strategy. Here, motivated by conventional communication systems design, we analyze the Stackelberg equilibrium [3], where the receiver knows the encoding mappings and optimizes its decoding function accordingly. This fundamental difference between the two problem settings enables, in the current case, the use of Shannon-theoretic arguments to study the fundamental limits of compression and communication in such strategic settings. In [4], the set of Stackelberg equilibria was studied for estimation with biased sensors. Here, we study communication and compression with privacy constraints in the same context.

Figure 1: The problem setting

Privacy considerations have recently gained renewed interest, see e.g., [5, 6, 7] and the references therein. In [8, 9], Yamamoto studied a compression problem similar to the one considered here: find an encoder such that there exists a decoder that guarantees a distortion no larger than $D_1$ when measured with a distortion measure $d_1$, while at the same time the distortion cannot be made smaller than $D_2$, when measured with a second measure $d_2$, in conjunction with any other decoder. In [7], Yamamoto's result was extended to some special cases to analyze the privacy-utility tradeoff in databases.

In this paper, we explicitly study the equilibrium conditions under the transmitter's privacy constraints. The contributions of this paper are:

  • We first formulate the problem, which involves the minimization of a global objective by the encoder and the decoder, subject to a privacy constraint measured by a different function.

  • Assuming an entropy based privacy measure and a quadratic distortion measure, we characterize the achievable distortion-privacy region, with or without compression at the transmitter.

  • We study the impact of the presence of an average power constrained Gaussian communication channel on the privacy-distortion trade-off.

2 Preliminaries

2.1 Notation

Let $\mathbb{R}$ and $\mathbb{R}^+$ denote the respective sets of real numbers and positive real numbers. Let $\mathbb{E}[\cdot]$ denote the expectation operator. The Gaussian density with mean $\mu$ and variance $\sigma^2$ is denoted as $N(\mu, \sigma^2)$. All logarithms in the paper are natural logarithms and may in general be complex valued, and the integrals are, in general, Lebesgue integrals. Let us define $\Gamma$ to denote the set of Borel measurable, square integrable functions $\{f: \mathbb{R} \to \mathbb{R}\}$. For information theoretic quantities, we use standard notation as, for example, in [10]. We let $h(X)$ denote the entropy of a discrete random variable $X$ (or its differential entropy if $X$ is continuous), and $I(X;Y)$ the mutual information between the random variables $X$ and $Y$.

2.2 Setting-1: Simple Equilibrium

We consider the general communication system whose block diagram is shown in Figure 1. The source $X$ and the private information $\theta$ are mapped into the transmitter output $U$, which is fully determined by the conditional distribution $p(u|x,\theta)$. For the sake of brevity, and with a slight abuse of notation, we refer to this as a stochastic mapping $g: \mathbb{R}^2 \to \mathbb{R}$ so that

$U = g(X, \theta)$   (1)

holds almost everywhere in $X$ and $\theta$. Let the set of all such mappings be denoted by $\Gamma_S$ (which has a one-to-one correspondence to the set of all the conditional distributions $p(u|x,\theta)$ that construct the transmitter output $U$).

The receiver produces an estimate $\hat{X}$ of the source through a mapping $h \in \Gamma$ as $\hat{X} = h(U)$. An inspector observes the estimate of the receiver and aims to learn the private information $\theta$, i.e., to minimize its mean squared estimation error $\mathbb{E}[(\theta - \hat{\theta})^2]$, where $\hat{\theta}$ is its estimate based on $\hat{X}$. Note that the joint statistics of the random variables are common knowledge. The common objective of the transmitter and the receiver is to minimize the end-to-end distortion, measured by a given distortion measure $d(\cdot, \cdot)$ as

$D = \mathbb{E}[d(X, \hat{X})]$   (2)

over the mappings $g \in \Gamma_S$ and $h \in \Gamma$, subject to a privacy constraint

$E \geq E_0$   (3)

imposed over only the encoding mapping (the decoder is oblivious to the privacy objective). Here, the encoder aims to minimize $D$ in collaboration with the decoder (the classical communication problem). The encoder has another objective, however: to maximize privacy, measured by, say, $E = h(\theta|\hat{X})$, or to guarantee that this privacy is not less than a given threshold $E_0$. Note that the decoder has no interest in finding out this information, or in satisfying or not satisfying this constraint. This subtle difference, i.e., the fact that there is a mismatch between the objectives of the decoder and those of the encoder, motivates us to consider this problem in a game theoretic setting (the SIT problem). In game theoretic terms, we consider a constrained Stackelberg game where only one of the players (the encoder) is concerned with the constraint in (3). Here, the encoder knows that the decoder will act to minimize the global cost $D$. Hence, Player 1 (the leader) is the encoder, and it knows that Player 2 (the follower, the decoder) acts to minimize $D$. The leader (the encoder) acts to minimize $D$ subject to (3), knowing the decoder's objective. In the following, we present this optimization problem formally:

Problem 1

Find $g^* \in \Gamma_S$ which minimizes

$D(g, h^*(g))$

subject to

$E(g, h^*(g)) \geq E_0,$

where

$h^*(g) = \arg\min_{h \in \Gamma} D(g, h).$

In this paper, we specialize to quadratic Gaussian settings, i.e., the source and the private information are jointly Gaussian, the distortion measure is mean squared error, and the privacy measure is conditional entropy. Particularly, this setting implies that $[X\ \theta]^T \sim N(0, R_{X\theta})$, where $R_{X\theta} = \begin{bmatrix} \sigma_X^2 & \rho\sigma_X\sigma_\theta \\ \rho\sigma_X\sigma_\theta & \sigma_\theta^2 \end{bmatrix}$; without loss of any generality we take $\sigma_X = \sigma_\theta = 1$ and $\rho \geq 0$ (the sign of $\rho$ can be absorbed into $\theta$), and naturally also $\rho \leq 1$. Further, the distortion and privacy measures are given as follows:

$D = \mathbb{E}[(X - \hat{X})^2]$   (4)

and

$E = h(\theta \mid \hat{X}),$   (5)

which results in conditional entropy as the privacy measure $E$. The following lemma is a simple consequence of the fact that the Gaussian distribution maximizes entropy subject to covariance constraints (see [11]). This lemma will be used to convert the equilibrium conditions related to the compression and communication problems to a control theoretic framework (an optimization problem involving second order statistics).

Lemma 1

At equilibrium, $X$, $\theta$, $U$, and $\hat{X}$ are jointly Gaussian.

Lemma 1 ensures the optimality of a linear decoder, and hence $h(\theta|\hat{X}) = h(\theta|U)$, since $\hat{X}$ is an invertible (linear) function of $U$. Note that the invertibility of the decoding mapping, and hence this simplification in the privacy constraint, is a direct consequence of the Stackelberg equilibrium. The Nash equilibrium variant of the same problem, studied in [1] without a privacy constraint, does not yield $h(\theta|\hat{X}) = h(\theta|U)$, since the decoding mapping at equilibrium is not invertible (quantizer based). Lemma 1, and the fact that maximizing $h(\theta|U)$ is equivalent to maximizing the estimation error $\mathbb{E}[(\theta - \mathbb{E}[\theta|U])^2]$ for jointly Gaussian $(\theta, U)$, enable the following reformulation of Problem 1:
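To make the reduction explicit, the following identity (a standard Gaussian fact written out here for convenience; the shorthand $P_0$ for the MMSE level induced by the threshold $E_0$ is ours) converts the entropy constraint into a constraint on a second order statistic:

```latex
% For jointly Gaussian (theta, U), conditional entropy is a monotone
% function of the MMSE, so the privacy constraint becomes a variance bound:
h(\theta \mid \hat{X}) = h(\theta \mid U)
  = \tfrac{1}{2}\ln\!\bigl(2\pi e\,\operatorname{Var}(\theta \mid U)\bigr) \ge E_0
\quad\Longleftrightarrow\quad
\operatorname{Var}(\theta \mid U) \ge P_0 := \frac{e^{2E_0}}{2\pi e}.
```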

Problem 2

Find $U$, where $(X, \theta, U)$ are jointly Gaussian, that minimizes

$D = \mathbb{E}[(X - \mathbb{E}[X|U])^2]$

subject to

$\mathbb{E}[(\theta - \mathbb{E}[\theta|U])^2] \geq P_0.$

In this paper, we show the existence of such an equilibrium, and its essential uniqueness.¹
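Before turning to the closed form solution, here is a small numerical illustration of Problem 2 (a minimal sketch; the parametrization follows Lemma 1, but the variable names and the example values of $\rho$ and $P_0$ are our own choices):

```python
import numpy as np

# Sketch: brute-force Problem 2 for unit-variance X, theta with correlation rho.
# By Lemma 1, it suffices to search linear strategies U = X + beta*theta + W,
# with W ~ N(0, var_w) independent of X and theta.
rho, P0 = 0.8, 0.6                      # example correlation and MMSE privacy level

def dist_and_privacy(beta, var_w, rho):
    var_u = 1 + beta**2 + 2 * beta * rho + var_w   # E[U^2]
    D = 1 - (1 + beta * rho) ** 2 / var_u          # E[(X - E[X|U])^2]
    P = 1 - (rho + beta) ** 2 / var_u              # Var(theta | U)
    return D, P

best_D, best_arg = np.inf, None
for beta in np.linspace(-3, 3, 2401):
    for var_w in np.linspace(0.0, 3.0, 601):
        D, P = dist_and_privacy(beta, var_w, rho)
        if P >= P0 and D < best_D:                 # feasible and better
            best_D, best_arg = D, (beta, var_w)

beta_star, var_w_star = best_arg
print(f"D* = {best_D:.4f} at beta = {beta_star:.4f}, var_w = {var_w_star:.3f}")
# Consistent with Remark 1 below: the search returns var_w = 0 (adding noise
# is suboptimal), and beta matches the closed form of Theorem 2.
```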

2.3 Setting-2: Compression

Next, we consider the compression of the source subject to privacy constraints, and analyze this problem from an information theoretic perspective.

Formally, we consider an i.i.d. source $\{X_i\}$ and a private information sequence $\{\theta_i\}$ to be compressed to indices $m \in \{1, \ldots, 2^{nR}\}$ through an encoding function $f_n: \mathbb{R}^n \times \mathbb{R}^n \to \{1, \ldots, 2^{nR}\}$. The receiver applies a decoding function $g_n: \{1, \ldots, 2^{nR}\} \to \mathbb{R}^n$ to generate the reconstruction sequence $\hat{X}^n$. Due to the strategic aspect of the problem, we have one distortion measure and one privacy measure. Similar to the previous setting, we assume that the distortion is measured by the MSE

$D_n = \frac{1}{n}\sum_{i=1}^{n} \mathbb{E}[(X_i - \hat{X}_i)^2]$   (6)

and privacy is measured by the conditional entropy

$E_n = \frac{1}{n}\, h(\theta^n \mid \hat{X}^n).$   (7)

A triple $(R, D, E)$ is called achievable if for every $\varepsilon > 0$ and sufficiently large $n$, there exists a block code $(f_n, g_n)$ of rate at most $R + \varepsilon$ such that $D_n \leq D + \varepsilon$ and $E_n \geq E - \varepsilon$.

The set of achievable rate-distortion-privacy triples is denoted here as $\mathcal{R}$. The following theorem, whose proof directly follows from the arguments in [8] for a general distortion measure and conditional entropy, characterizes the achievable region $\mathcal{R}$, i.e., it converts the problem from an $n$-dimensional optimization to a single-letter one.

Theorem 1

$\mathcal{R}$ is the convex hull of triples $(R, D, E)$ for

$R \geq I(X, \theta; U)$

for a conditional distribution $p(u|x,\theta)$ and a deterministic decoding function $\phi$, with $\hat{X} = \phi(U)$, which satisfy

$\mathbb{E}[(X - \hat{X})^2] \leq D, \qquad h(\theta \mid \hat{X}) \geq E.$

2.4 Setting-3: Communication over Noisy Channel

Finally, we consider an additive Gaussian noise channel between the transmitter and the receiver. This problem setting is shown in Figure 1, where the receiver observes $Y = U + N$, where $N \sim N(0, \sigma_N^2)$ is zero-mean Gaussian and distributed independent of $X$ and $\theta$, and the transmitter output is subject to an average power constraint $\mathbb{E}[U^2] \leq P_T$. Again, we focus on the entropy based privacy measure, the quadratic distortion measure, and Gaussian variables. The problem can then be reformulated as:

Problem 3

Find $g \in \Gamma_S$ and $h \in \Gamma$ that minimize

$D = \mathbb{E}[(X - \hat{X})^2]$

subject to

$h(\theta \mid \hat{X}) \geq E_0 \quad \text{and} \quad \mathbb{E}[U^2] \leq P_T,$

where $\hat{X} = h(Y)$ and $Y = g(X, \theta) + N$.

3 Main Results

3.1 Simple Equilibrium

Note that Lemma 1 does not provide the exact form of the function $g$, although it implies that $U = \alpha X + \beta\theta + W$ for some $\alpha, \beta \in \mathbb{R}$ and a Gaussian $W$ independent of $X$ and $\theta$. The following observation involves the two extreme cases of this problem, i.e., the endpoints of the $D$-$E$ curve.

Lemma 2

At maximum privacy, where $E_0 = h(\theta)$, and at minimum privacy, where $E_0 = h(\theta|X)$, the equilibrium is achieved at $U = X + \beta\theta$ for some $\beta \in \mathbb{R}$. In other words, at the end points, there is no need to have the noise term $W$.

Proof:

At minimum privacy, obviously the optimal transmitter strategy is $U = X$, which results in $D = 0$. The only way to achieve maximum privacy, $E_0 = h(\theta)$, is to render the transmitter output independent of $\theta$. Since the variables are jointly Gaussian, this can be achieved by simply transmitting the prediction error $U = X - \rho\theta$, where $\rho$ is the MMSE prediction coefficient of $X$ from $\theta$; indeed, $\mathbb{E}[\theta U] = \mathbb{E}[\theta X] - \rho\,\mathbb{E}[\theta^2] = \rho - \rho = 0$. After prediction, the privacy constraint is satisfied, and adding noise only increases $D$; hence $U = X - \rho\theta$ is the optimal transmitter strategy. In the following, we obtain auxiliary functional properties of $D$ and $E$ as functions of the encoding mapping, or equivalently of $p(u|x,\theta)$, and of the $D$-$E$ curve.

Lemma 3

$D$ and $E$ are concave functions of the conditional distribution $p(u|x,\theta)$, and $E$ is an increasing, concave function of $D$.

Proof:

Let $U_i$ be the random variables achieving $D_i = D(E_i)$, characterized by the conditional distributions $p_i(u|x,\theta)$, for $i = 1, 2$.

For $\lambda \in [0, 1]$ we define the mixture $p_\lambda(u|x,\theta) = \lambda p_1(u|x,\theta) + (1-\lambda) p_2(u|x,\theta)$, with output $U_\lambda$.

Then, for any decoding function $h$, the distortion is linear in the conditional distribution, so

$D(p_\lambda) = \min_{h \in \Gamma}\{\lambda\,\mathbb{E}_{p_1}[(X - h(U))^2] + (1-\lambda)\,\mathbb{E}_{p_2}[(X - h(U))^2]\} \geq \lambda D(p_1) + (1-\lambda) D(p_2),$   (8)

which shows the concavity of $D$ in $p(u|x,\theta)$. Following similar steps, we obtain the concavity of $E$ in $p(u|x,\theta)$, i.e., we have

$E(p_\lambda) \geq \lambda E(p_1) + (1-\lambda) E(p_2).$   (9)

Note that $D(E_0)$ is non-decreasing: as expressed in Problem 2, $D(E_0)$ is a minimization over a constraint set; as $E_0$ increases, the minimization is performed over a smaller set, hence $D(E_0)$ is non-decreasing.

Toward showing the concavity of $E$ in $D$, we first note that one can show the concavity of $\mathbb{E}[(\theta - \hat{\theta})^2]$ in $p(u|x,\theta)$, where $\hat{\theta}$ is the inspector's estimate of $\theta$, following similar steps to the preceding analysis. Then, let $U_i$ be the random variables achieving $E_i = E(D_i)$, characterized by $p_i(u|x,\theta)$ for $i = 1, 2$. We need to show

$E(\lambda D_1 + (1-\lambda) D_2) \geq \lambda E(D_1) + (1-\lambda) E(D_2)$   (10)

for all $\lambda \in [0, 1]$. Using the mixture $p_\lambda$ as above,

$E(\lambda D_1 + (1-\lambda) D_2) \geq E(p_\lambda)$   (11)
$\geq \lambda E(p_1) + (1-\lambda) E(p_2)$   (12)
$= \lambda E(D_1) + (1-\lambda) E(D_2),$   (13)

where (12) follows from (9), and the last step is due to the fact that $D$ is linear in $p(u|x,\theta)$ for a fixed decoding function.

Since both the privacy and distortion measures are continuous, we only needed to show concavity; monotonicity is then a consequence of concavity and continuity.
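The concavity claim can be sanity-checked numerically once the second order expressions of the next subsection are available; the following sketch (ours, with an arbitrary example value of $\rho$) sweeps the noise-free strategies $U = X + \beta\theta$ and tests the shape of the resulting curve:

```python
import numpy as np

# Sketch (ours): trace the D-E curve with noise-free strategies U = X + beta*theta,
# then check numerically that E is an increasing, concave function of D.
rho = 0.8
betas = np.linspace(-rho, 0.0, 400)        # beta = -rho: max privacy; beta = 0: D = 0
var_u = 1 + betas**2 + 2 * betas * rho     # E[U^2]
D = 1 - (1 + betas * rho) ** 2 / var_u     # distortion E[(X - E[X|U])^2]
P = 1 - (rho + betas) ** 2 / var_u         # Var(theta | U)
E = 0.5 * np.log(2 * np.pi * np.e * P)     # privacy h(theta | U), in nats

order = np.argsort(D)
D, E = D[order], E[order]
slopes = np.diff(E) / np.diff(D)
print("E increasing in D:", bool(np.all(np.diff(E) > 0)))
print("slopes non-increasing (concave):", bool(np.all(np.diff(slopes) < 1e-6)))
```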

Lemma 2 describes the equilibrium conditions at the end points. The following theorem provides the exact characterization of this equilibrium over the entire region.

Theorem 2

For the quadratic Gaussian setting with the entropy based privacy constraint, the (essentially) unique equilibrium is achieved by $U = X + \beta\theta$ and $\hat{X} = \delta U$, where $\beta$ and $\delta$ are constants given as:

$\beta = -\rho + \sqrt{\frac{(1-\rho^2)(1-P_0)}{P_0}}$   (14)

$\delta = \frac{1 + \beta\rho}{1 + \beta^2 + 2\beta\rho}$   (15)

with $P_0 = (2\pi e)^{-1} e^{2E_0}$.
Remark 1

An interesting aspect of the solution is that adding independent noise is strictly suboptimal in achieving the privacy-distortion trade-off.

Proof:

First, we note that the optimal decoder mapping is

$h^*(u) = \mathbb{E}[X \mid U = u]$   (16)

regardless of the choice of the encoder's policy $g$. Hence, the problem simplifies to an optimization over the encoding mapping $g$.

Noting that at equilibrium $X$, $\theta$, and $U$ are jointly Gaussian, without loss of generality we take $U = X + \beta\theta + W$, where $W \sim N(0, \sigma_W^2)$ is independent of $X$ and $\theta$. In the following, we find the values of $\beta$ and $\sigma_W^2$ at equilibrium. First, let us express $D$ and the inspector's error $P = \mathbb{E}[(\theta - \mathbb{E}[\theta|U])^2]$ using standard estimation techniques:

$\mathbb{E}[U^2] = 1 + \beta^2 + 2\beta\rho + \sigma_W^2$   (17)
$\mathbb{E}[XU] = 1 + \beta\rho, \qquad \mathbb{E}[\theta U] = \rho + \beta$   (18)
$D = 1 - \frac{(1+\beta\rho)^2}{1 + \beta^2 + 2\beta\rho + \sigma_W^2}$   (19)
$P = 1 - \frac{(\rho+\beta)^2}{1 + \beta^2 + 2\beta\rho + \sigma_W^2}$   (20)
Figure 2: The $D$-$E$ curve

Problem 1 can now be converted to an unconstrained minimization [12] of the Lagrangian cost:

$J = D - \lambda P$   (21)

for $\lambda \geq 0$, where varying $\lambda$ provides solutions at different levels of the privacy constraint $E_0$. A set of necessary conditions for optimality can be obtained by applying the K.K.T. conditions, one of which is that $\lambda$ is the slope of the curve:

$\lambda = \frac{\partial D}{\partial P}$   (22)

at the optimal values of $\beta$ and $\sigma_W^2$. Let us expand the derivative of $J$ with respect to the noise variance,

$\frac{\partial J}{\partial \sigma_W^2} = \frac{(1+\beta\rho)^2 - \lambda(\rho+\beta)^2}{(1+\beta^2+2\beta\rho+\sigma_W^2)^2}.$

The value of $\sigma_W^2$ at the optimum depends on the sign of this derivative, i.e., if $\partial J/\partial \sigma_W^2 > 0$, then $\sigma_W^2 = 0$. In the following, we show that $\partial J/\partial \sigma_W^2 > 0$ for all admissible $\lambda$, and hence $\sigma_W^2 = 0$.

First, we note that the minimum value of the slope is reached at the maximum allowed distortion, as depicted in Figure 2. From Lemma 2, for $E_0 = h(\theta)$ the solution $U = X - \rho\theta$ implies that $\sigma_W^2 = 0$; plugging these values into the derivative above, we obtain $(1+\beta\rho)^2 - \lambda(\rho+\beta)^2 = (1-\rho^2)^2 > 0$. Following similar steps for $E_0 = h(\theta|X)$, we obtain the same conclusion. For the intermediate values of $\lambda$ and $\beta$, we have

$(1+\beta\rho)^2 - \lambda(\rho+\beta)^2 > 0.$   (23)

Note that, by the Cauchy-Schwarz inequality together with the bound on the slope $\lambda$ established at the endpoints, we conclude that (23) holds. Hence $\partial J/\partial \sigma_W^2 > 0$ for all values of $\lambda$, and the optimal noise variance is $\sigma_W^2 = 0$. Toward obtaining $\beta$, plug $\sigma_W^2 = 0$ into (20) to obtain:

$1 - \frac{(\rho+\beta)^2}{1 + \beta^2 + 2\beta\rho} = P_0.$

The solution to this second order equation is simply

$\beta = -\rho \pm \sqrt{\frac{(1-\rho^2)(1-P_0)}{P_0}}$   (24)

Both of these solutions (corresponding to $\pm$) satisfy the privacy constraint with equality, and the following one achieves the lower $D$, and hence is the optimal solution:

$\beta = -\rho + \sqrt{\frac{(1-\rho^2)(1-P_0)}{P_0}}$   (25)

The optimal decoding mapping is

$\hat{X} = \delta U, \qquad \delta = \frac{1+\beta\rho}{1+\beta^2+2\beta\rho}.$   (26)
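The closed form is easy to verify numerically; the sketch below (our code; $P_0$ is again the MMSE level $e^{2E_0}/(2\pi e)$, and the example values are arbitrary) evaluates (25)-(26) and checks the Lemma 2 endpoints:

```python
import numpy as np

# Sketch (ours): evaluate the Theorem 2 equilibrium (25)-(26) for rho >= 0 and
# check that the privacy constraint is met with equality at each level P0.
rho = 0.8

def equilibrium(rho, P0):
    beta = -rho + np.sqrt((1 - rho**2) * (1 - P0) / P0)   # (25)
    var_u = 1 + beta**2 + 2 * beta * rho                  # sigma_W = 0 at equilibrium
    delta = (1 + beta * rho) / var_u                      # decoder gain, (26)
    D = 1 - (1 + beta * rho) ** 2 / var_u
    P = 1 - (rho + beta) ** 2 / var_u                     # Var(theta | U)
    return beta, delta, D, P

for P0 in (1 - rho**2, 0.6, 0.8, 1.0):                    # min privacy ... max privacy
    beta, delta, D, P = equilibrium(rho, P0)
    print(f"P0 = {P0:.3f}: beta = {beta:+.4f}, delta = {delta:.4f}, "
          f"D = {D:.4f}, tight = {abs(P - P0) < 1e-9}")
# Lemma 2 endpoints: P0 = 1 - rho^2 gives beta = 0 (U = X, D = 0);
# P0 = 1 gives beta = -rho (U = X - rho*theta, D = rho^2).
```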

3.2 Problem-2

Next, we consider the compression of a jointly Gaussian source-private information pair with the entropy based privacy measure and MSE distortion. First, we observe that all equilibrium points are achieved by a jointly Gaussian $(X, \theta, U)$, and hence $U$ can be written as $U = X + \beta\theta + W$ for some $\beta \in \mathbb{R}$, where $W \sim N(0, \sigma_W^2)$ is Gaussian and independent of $X$ and $\theta$.

The proof of this statement follows from the well-known property of the Gaussian distribution achieving maximum entropy under a variance constraint and the steps in the proof of Theorem 1. Hence, the test channel achieving the rate-distortion function adds independent Gaussian noise (the forward test channel interpretation of Gaussian rate-distortion also holds in this privacy constrained setting). Note that the privacy constraint is always active in the simple equilibrium setting, and the equilibrium is at the boundary of the constraint set, i.e., we find the optimal $\beta$ by setting the privacy constraint to equality. In the compression case, compression itself inherently provides some level of privacy (as is evident from the forward channel interpretation of the RD function). Hence, the privacy constraint may not be active in the compression case, which yields $\beta = 0$. The following theorem characterizes the optimal rate-distortion-privacy trade-off.

Theorem 3

For a given $E_0$, the space of achievable $(R, D)$ pairs is given as

$R \geq \frac{1}{2}\log\frac{1 + \beta^2 + 2\beta\rho + \sigma_W^2}{\sigma_W^2}$   (27)

$D \geq 1 - \frac{(1+\beta\rho)^2}{1 + \beta^2 + 2\beta\rho + \sigma_W^2}$   (28)

where

$\beta = -\rho + \sqrt{\frac{(1-P_0)(1-\rho^2+\sigma_W^2)}{P_0}}$   (29)

when the privacy constraint is active and $\beta = 0$ otherwise, over the noise parameter $\sigma_W^2 \geq 0$, and

$P_0 = (2\pi e)^{-1} e^{2E_0}$

as a function of $E_0$.

Proof:

We have $U = X + \beta\theta + W$ for some $\beta \in \mathbb{R}$, where $W$ is zero-mean Gaussian with variance $\sigma_W^2$ and independent of $X$ and $\theta$. This representation yields, by standard estimation theoretic techniques, the following characterization of $(R, D, E)$ in terms of $(\beta, \sigma_W^2)$:

$R = \frac{1}{2}\log\frac{1 + \beta^2 + 2\beta\rho + \sigma_W^2}{\sigma_W^2}$   (30)
$D = 1 - \frac{(1+\beta\rho)^2}{1 + \beta^2 + 2\beta\rho + \sigma_W^2}$   (31)
$E = \frac{1}{2}\log\left(2\pi e\left(1 - \frac{(\rho+\beta)^2}{1 + \beta^2 + 2\beta\rho + \sigma_W^2}\right)\right)$   (32)

Following steps similar to the ones in the proof of Theorem 2, we can express $\beta$ in terms of $\sigma_W^2$, when the privacy constraint is active, as:

$\beta = -\rho \pm \sqrt{\frac{(1-P_0)(1-\rho^2+\sigma_W^2)}{P_0}}.$

From these solutions, the following achieves the lower value of $D$:

$\beta = -\rho + \sqrt{\frac{(1-P_0)(1-\rho^2+\sigma_W^2)}{P_0}}.$

When the privacy constraint is already satisfied, $\beta = 0$.
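The single-letter expressions can be explored numerically; the following sketch (ours; names and example values are arbitrary) evaluates (30)-(32) at $\beta = 0$ and illustrates the inherent privacy of compression, where coarser quantization alone already increases $h(\theta \mid \hat{X})$:

```python
import numpy as np

# Sketch (ours): single-letter rate/distortion/privacy of the Gaussian forward
# test channel U = X + beta*theta + W, W ~ N(0, var_w), per (30)-(32).
rho = 0.8

def triple(beta, var_w, rho):
    var_u = 1 + beta**2 + 2 * beta * rho + var_w
    R = 0.5 * np.log(var_u / var_w)                       # I(X, theta; U), nats
    D = 1 - (1 + beta * rho) ** 2 / var_u                 # MSE of hat{X} = E[X|U]
    P = 1 - (rho + beta) ** 2 / var_u                     # Var(theta | U)
    E = 0.5 * np.log(2 * np.pi * np.e * P)                # h(theta | U), nats
    return R, D, E

# Pure compression (beta = 0): coarser quantization (larger var_w) leaks less
# about theta, so E grows even though no privacy constraint is imposed at all.
for var_w in (0.05, 0.25, 1.0):
    R, D, E = triple(0.0, var_w, rho)
    print(f"var_w = {var_w:.2f}: R = {R:.3f} nats, D = {D:.3f}, E = {E:.3f} nats")
```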

3.3 Problem-3

We next focus on the noisy communication setting, i.e., we assume there is an additive white Gaussian noise channel between the transmitter and the receiver, as shown in Figure 1. The following theorem provides the encoding and decoding mappings at the equilibrium.

Theorem 4

For the quadratic Gaussian communication setting with the entropy based privacy constraint, the (essentially) unique equilibrium is achieved by

$U = \alpha(X + \beta\theta)$

and $\hat{X} = \delta Y$, where $\alpha$, $\beta$, and $\delta$ are constants given as:

$\alpha = \sqrt{\frac{P_T}{1 + \beta^2 + 2\beta\rho}}$   (33)

and

$\beta = -\rho + \sqrt{\frac{\kappa(1-\rho^2)}{1-\kappa}}, \qquad \kappa = \frac{(1-P_0)(P_T+\sigma_N^2)}{P_T},$   (34)

and

$\delta = \frac{\alpha(1+\beta\rho)}{P_T+\sigma_N^2}.$

Proof:

First, we observe that linear mappings are optimal, due to the well-known optimality of linear mappings in this setting (without the privacy constraint) [13], and the fact that the jointly Gaussian distribution maximizes entropy with fixed second-order statistics. Next, we assume, without any loss of generality, that $U = \alpha(X + \beta\theta)$ for some $\alpha, \beta \in \mathbb{R}$. Then, we have $\mathbb{E}[U^2] = \alpha^2(1+\beta^2+2\beta\rho)$ and $\mathbb{E}[Y^2] = \mathbb{E}[U^2] + \sigma_N^2$. Hence,

$D = 1 - \frac{\alpha^2(1+\beta\rho)^2}{\alpha^2(1+\beta^2+2\beta\rho)+\sigma_N^2}$   (35)
$P = 1 - \frac{\alpha^2(\rho+\beta)^2}{\alpha^2(1+\beta^2+2\beta\rho)+\sigma_N^2}$   (36)

Note that since the power constraint gives $\alpha^2(1+\beta^2+2\beta\rho) = P_T$, we can re-express these as

$D = 1 - \frac{P_T(1+\beta\rho)^2}{(1+\beta^2+2\beta\rho)(P_T+\sigma_N^2)}$   (37)
$P = 1 - \frac{P_T(\rho+\beta)^2}{(1+\beta^2+2\beta\rho)(P_T+\sigma_N^2)}$   (38)
$E = \frac{1}{2}\log(2\pi e P)$   (39)

Note that the power constraint is always active, and when the privacy constraint is active, we have:

$\beta = -\rho \pm \sqrt{\frac{\kappa(1-\rho^2)}{1-\kappa}},$

with $\kappa$ as defined in (34). From these solutions, the following achieves the lower $D$:

$\beta = -\rho + \sqrt{\frac{\kappa(1-\rho^2)}{1-\kappa}}.$

When the privacy constraint is already satisfied, $\beta = 0$.
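As with Theorem 2, the channel equilibrium is straightforward to check numerically; in the sketch below (ours; $\kappa$ is the shorthand from (34) and the example parameters are arbitrary, chosen so that the privacy constraint is active):

```python
import numpy as np

# Sketch (ours): Theorem 4 equilibrium for the power-constrained AWGN setting,
# U = alpha*(X + beta*theta), Y = U + N, hat{X} = delta*Y.
rho, P_T, var_n = 0.8, 1.0, 0.5     # correlation, power budget, noise variance
P0 = 0.7                            # target Var(theta | hat{X}); active constraint here

kappa = (1 - P0) * (P_T + var_n) / P_T                      # shorthand from (34)
beta = -rho + np.sqrt(kappa * (1 - rho**2) / (1 - kappa))   # root with the lower D
alpha = np.sqrt(P_T / (1 + beta**2 + 2 * beta * rho))       # meets E[U^2] = P_T
delta = alpha * (1 + beta * rho) / (P_T + var_n)            # MMSE decoder gain

D = 1 - alpha**2 * (1 + beta * rho) ** 2 / (P_T + var_n)
P = 1 - alpha**2 * (rho + beta) ** 2 / (P_T + var_n)
print(f"beta = {beta:+.4f}, alpha = {alpha:.4f}, delta = {delta:.4f}, D = {D:.4f}")
print(f"privacy met with equality: {abs(P - P0) < 1e-9}")
```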

4 Conclusion

In this paper, we have addressed some fundamental problems associated with strategic communication in the presence of privacy constraints. Although the compression and communication problems are inherently information theoretic, for the entropy based privacy measure, MSE distortion, and jointly Gaussian source and private information, the problem admits a control theoretic representation (optimization over second order statistics). We have explicitly characterized the equilibrium conditions for compression and communication under privacy constraints. Rather surprisingly, the simple equilibrium solution (without compression) does not require the addition of independent noise to satisfy the privacy constraints, as opposed to the common folklore in such problems. Some future directions include using the results presented in this paper for decentralized stochastic control problems (see [14]) with privacy constraints, extending the approach to vector and network settings, and finally investigating the implications in economics.

Footnotes

  1. The optimal transmitter and receiver mappings are not strictly unique, in the sense that multiple trivially "equivalent" mappings can be used to obtain the same MSE and privacy costs. For example, the transmitter can apply any invertible mapping to $U$, and the receiver applies its inverse prior to decoding. To account for such trivial, essentially identical solutions, we use the term "essentially unique".

References

  1. V. Crawford and J. Sobel, “Strategic information transmission,” Econometrica: Journal of the Econometric Society, pp. 1431–1451, 1982.
  2. M. Battaglini, “Multiple referrals and multidimensional cheap talk,” Econometrica, vol. 70, no. 4, pp. 1379–1401, 2002.
  3. T. Başar and G. Olsder, Dynamic Noncooperative Game Theory, Society for Industrial Mathematics (SIAM) Series in Classics in Applied Mathematics, 1999.
  4. F. Farokhi, A. Teixeira, and C. Langbort, “Estimation with strategic sensors,” arXiv:1402.4031, 2014.
  5. C. Dwork, “Differential privacy,” in Encyclopedia of Cryptography and Security, pp. 338–340. Springer, 2011.
  6. F. McSherry and K. Talwar, “Mechanism design via differential privacy,” in Proc. 48th Annual IEEE Symposium on Foundations of Computer Science (FOCS), 2007, pp. 94–103.
  7. L. Sankar, R. Rajagopalan, and H. V. Poor, “Utility-privacy tradeoffs in databases: An information-theoretic approach,” IEEE Transactions on Information Forensics and Security, vol. 8, no. 6, pp. 838–852, 2013.
  8. H. Yamamoto, “A rate-distortion problem for a communication system with a secondary decoder to be hindered,” IEEE Transactions on Information Theory, vol. 34, no. 4, pp. 835–842, 1988.
  9. H. Yamamoto, “A source coding problem for sources with additional outputs to keep secret from the receiver or wiretappers (corresp.),” IEEE Transactions on Information Theory, vol. 29, no. 6, pp. 918–923, Nov. 1983.
  10. T. Cover and J. Thomas, Elements of Information Theory, John Wiley & Sons, 2012.
  11. S. N. Diggavi and T. M. Cover, “The worst additive noise under a covariance constraint,” IEEE Transactions on Information Theory, vol. 47, no. 7, pp. 3072–3081, Nov 2001.
  12. S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.
  13. T. Goblick, Jr., “Theoretical limitations on the transmission of data from analog sources,” IEEE Transactions on Information Theory, vol. 11, no. 4, pp. 558–567, 1965.
  14. R. Bansal and T. Başar, “Stochastic teams with nonclassical information revisited: When is an affine law optimal?,” IEEE Transactions on Automatic Control, vol. 32, no. 6, pp. 554–559, 1987.