Strong Secrecy and Stealth for Broadcast Channels with Confidential Messages
This paper extends the weak secrecy results of Liu et al. for broadcast channels with two confidential messages to strong secrecy. Our results are based on an extension of the techniques developed by Hou and Kramer for bounding the Kullback-Leibler divergence in the context of resolvability and effective secrecy.
Information-theoretic security, resolvability, strong secrecy, broadcast channel.
Based on the pioneering work of Shannon, Wyner determined the secrecy capacity of a class of wiretap channels. Wyner's work was generalized by Csiszár and Körner to the non-degraded broadcast channel with a single confidential message for one user and a common message intended for both users. The confidential message has to be kept secret from the other user, while both users decode the common message. A variant of Csiszár and Körner's model with two confidential messages and no common message was first studied by Liu et al. for a discrete memoryless channel and later for the Gaussian case. Inner and outer bounds on the secrecy region are known for this setting.
The secrecy criterion used in [2, 3, 4] is the normalized mutual information between the message and the channel output at the user regarded as the eavesdropper. This criterion, called weak secrecy, provides only limited security against eavesdropping attacks, as was shown by Maurer. An unnormalized notion of secrecy, called strong secrecy, was proposed subsequently, and large parts of the earlier work were extended from weak to strong secrecy. In this paper we extend the inner bound of Liu et al. from weak to strong secrecy. Along the way we extend the method of Hou and Kramer, based on the even stronger notion of effective secrecy, to this scenario. More precisely, we prove that, in addition to strong secrecy, stealthy communication (i.e., hiding the very presence of meaningful communication) is possible.
We use capital letters for random variables (RVs). If $X$ is a RV, then $x$ is used to refer to an observation of $X$. Sets are denoted by calligraphic letters, e.g. $\mathcal{X}$, and $\mathcal{P}(\mathcal{X})$ stands for the set of probability distributions defined on the (finite) set $\mathcal{X}$. For the RVs $X, Y, Z$ with values in $\mathcal{X}, \mathcal{Y}, \mathcal{Z}$ we write $X - Y - Z$ if they form a Markov chain. The symbol $X^n$ stands for the sequence $(X_1, \dots, X_n)$. The mutual information between the RVs $X$ and $Y$ is denoted by $I(X;Y)$, while $H(X)$ and $H(X|Y)$ are the entropy of $X$ and the conditional entropy of $X$ given $Y$, respectively. The probability mass function of $X$ is $P_X(x)$, or $P(x)$ in short, while the probability of an event $A$ is denoted by $\Pr(A)$. The Kullback-Leibler divergence of two probability distributions $P$ and $Q$ defined on the set $\mathcal{X}$ is given by
$$D(P \,\|\, Q) = \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}. \quad (1)$$
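As a numerical companion to the divergence just defined, the following Python sketch (function name of our own choosing) computes $D(P\|Q)$ for pmfs on a finite set, with the usual conventions for zero-probability terms:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) in bits for pmfs p, q given as
    sequences over the same finite set.

    Conventions: terms with p_i = 0 contribute nothing; if p_i > 0 while
    q_i = 0, absolute continuity fails and the divergence is infinite.
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            total += pi * math.log2(pi / qi)
    return total
```

For example, `kl_divergence([0.5, 0.5], [0.25, 0.75])` evaluates to $1 - \tfrac{1}{2}\log_2 3 \approx 0.2075$ bits; the divergence is non-negative and vanishes exactly when the two distributions coincide.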
II System Model
We consider a broadcast scenario consisting of a sender (S) and two receivers. We assume that all channels are discrete memoryless with finite input alphabet $\mathcal{X}$ and finite output alphabets $\mathcal{Y}_1$ and $\mathcal{Y}_2$. The conditional probability distribution governing the discrete memoryless broadcast channel (DMBCC) is given by
$$W^n(y_1^n, y_2^n \mid x^n) = \prod_{i=1}^{n} W(y_{1i}, y_{2i} \mid x_i),$$
where $x^n \in \mathcal{X}^n$, $y_1^n \in \mathcal{Y}_1^n$, and $y_2^n \in \mathcal{Y}_2^n$.
The stochastic encoder at the sender is defined to be
$$E : \mathcal{M}_1 \times \mathcal{M}_2 \to \mathcal{P}(\mathcal{X}^n), \quad (3)$$
where $\mathcal{M}_1$ and $\mathcal{M}_2$ are the message sets for receiver 1 and receiver 2, respectively. The decoder at the $i$-th node, $i \in \{1, 2\}$, is defined as
$$\phi_i : \mathcal{Y}_i^n \to \mathcal{M}_i.$$
Definition 1 (Strong Secrecy)
For every $\epsilon > 0$ there is a non-negative integer $N(\epsilon)$ such that for all $n \geq N(\epsilon)$,
$$I(M_1; Y_2^n) \leq \epsilon \quad \text{and} \quad I(M_2; Y_1^n) \leq \epsilon,$$
where the RVs $M_1$ and $M_2$ are distributed uniformly over $\mathcal{M}_1$ and $\mathcal{M}_2$, respectively, and the mutual information values are computed with respect to the distribution
$$P(m_1, m_2, x^n, y_1^n, y_2^n) = \frac{1}{|\mathcal{M}_1| |\mathcal{M}_2|}\, E(x^n \mid m_1, m_2)\, W^n(y_1^n, y_2^n \mid x^n),$$
with the stochastic encoder given in (3).
The probability of error at each node is
$$P_{e,i} = \Pr\big(\hat{M}_i \neq M_i\big), \quad i \in \{1, 2\},$$
where $\hat{M}_i$ denotes the output of decoder $i$.
The above definition imposes a strong security condition. More details are given in Section III-C, where we show that an even stronger notion of secrecy, based on stealth, can be achieved.
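To make the leakage quantity in the definition concrete, here is a small single-letter illustration (a toy example of our own, not the paper's coding scheme): a uniform one-bit message observed by the eavesdropping receiver through a binary symmetric channel with crossover probability `delta`. The leakage $I(M;Z)$ equals $1 - h(\delta)$ bits and vanishes at $\delta = 1/2$:

```python
import math
from itertools import product

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as a dict {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(
        p * math.log2(p / (px[x] * py[y]))
        for (x, y), p in joint.items() if p > 0
    )

def leakage_bsc(delta):
    """I(M; Z) for a uniform bit M seen through a BSC(delta)."""
    joint = {
        (m, z): 0.5 * (delta if m != z else 1 - delta)
        for m, z in product((0, 1), repeat=2)
    }
    return mutual_information(joint)
```

The strong secrecy criterion asks for the analogous $n$-letter quantity to stay below $\epsilon$, unnormalized, for the actual code; in this toy model `leakage_bsc(0.5)` is exactly `0.0` while `leakage_bsc(0.0)` is `1.0`.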
III Strong Secrecy for Broadcast Channels
In this section we present an achievable secure rate region under the strong secrecy criterion for a DMBCC. The following theorem summarizes our results:
The rate pair $(R_1, R_2)$ is achievable, in the sense of Definition 2, for the DMBCC with confidential messages under the strong secrecy criterion, if
$$R_1 \leq I(V_1; Y_1) - I(V_1; V_2) - I(V_1; Y_2 \mid V_2),$$
$$R_2 \leq I(V_2; Y_2) - I(V_1; V_2) - I(V_2; Y_1 \mid V_1),$$
where the information quantities are computed with respect to probability distributions such that the Markov chain condition $(V_1, V_2) - X - (Y_1, Y_2)$ holds and $X$ and $(Y_1, Y_2)$ are connected via the given broadcast channel. The auxiliary RVs $V_1$ and $V_2$ take values in finite sets $\mathcal{V}_1$ and $\mathcal{V}_2$. The cardinalities of $\mathcal{V}_1$ and $\mathcal{V}_2$ can be bounded by the Ahlswede-Körner technique; we shall provide this in the journal version of this manuscript.
The proof consists of two parts, achievability and secrecy, and unfolds in the following subsections. The achievability proof is based on techniques developed in [11, 12] and extends the secrecy techniques developed for the wiretap channel to the present setting.
III-A Coding Scheme
For each $m_i \in \mathcal{M}_i$ and $l_i \in \mathcal{L}_i$, $i \in \{1, 2\}$, we draw independently sequences $v_i^n(m_i, l_i)$ according to
$$P(v_i^n) = \prod_{j=1}^{n} P_{V_i}(v_{ij}).$$
Let $\mathcal{L}_i = [1 : 2^{n R_i'}]$, $i \in \{1, 2\}$. For $m_1$ and $m_2$ find a pair $(l_1, l_2)$ such that
$$\big(v_1^n(m_1, l_1), v_2^n(m_2, l_2)\big) \in \mathcal{T}_\epsilon^n(P_{V_1 V_2}).$$
If there is no such pair, choose $(l_1, l_2) = (1, 1)$. Then select a sequence $x^n$ with
$$\big(v_1^n(m_1, l_1), v_2^n(m_2, l_2), x^n\big) \in \mathcal{T}_\epsilon^n(P_{V_1 V_2 X}).$$
To transmit $(m_1, m_2)$, select uniformly at random a pair $(l_1, l_2)$ and send the corresponding $x^n$.
Upon receiving $y_i^n$, decoder $i$ declares that $(m_i, l_i)$ was sent if it is the unique pair such that
$$\big(v_i^n(m_i, l_i), y_i^n\big) \in \mathcal{T}_\epsilon^n(P_{V_i Y_i}).$$
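The typicality search in the encoding step can be sketched numerically. The following Python snippet (function names and the relative-typicality threshold are our own illustrative choices) computes joint types and scans a pair of codebooks for a jointly typical index pair, falling back to a fixed pair when none exists:

```python
from itertools import product

def joint_type(v1, v2):
    """Empirical distribution (joint type) of a pair of equal-length sequences."""
    n = len(v1)
    counts = {}
    for a, b in zip(v1, v2):
        counts[(a, b)] = counts.get((a, b), 0) + 1
    return {k: c / n for k, c in counts.items()}

def jointly_typical(v1, v2, p_joint, eps):
    """Robust typicality: |type(a,b) - p(a,b)| <= eps * p(a,b) for all (a,b),
    and no empirical mass outside the support of p_joint."""
    t = joint_type(v1, v2)
    if any(p_joint.get(ab, 0.0) == 0.0 for ab in t):
        return False
    return all(abs(t.get(ab, 0.0) - p) <= eps * p for ab, p in p_joint.items())

def find_index_pair(cb1, cb2, p_joint, eps, fallback=(1, 1)):
    """Scan all (l1, l2) and return the first jointly typical index pair;
    if none exists, return the fixed fallback pair, mirroring the scheme above."""
    for l1, l2 in product(range(len(cb1)), range(len(cb2))):
        if jointly_typical(cb1[l1], cb2[l2], p_joint, eps):
            return l1, l2
    return fallback
```

The mutual covering lemma invoked in the error analysis guarantees that, with enough indices per message, this search succeeds with probability exponentially close to one.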
III-B Error Analysis
The error analysis is carried out using standard arguments. Using the mutual covering lemma, the packing lemma, and the properties of typical sequences, we obtain
From (9) we have
and by symmetry,
The bounds on $I(M_1; Y_2^n)$ and $I(M_2; Y_1^n)$ are derived in the following subsection.
III-C Strong Secrecy Criterion
Before we continue, we define a deterministic selection function $f$ on message pairs $(m_1, m_2)$ that returns an index pair $(l_1, l_2)$ for which the corresponding codewords are jointly typical. In the case that there are many such pairs, we choose one arbitrarily. However, if there is no such pair, the function returns $(1, 1)$. From now on, we denote the codeword pairs simply based on the selection function $f$.
Our aim is to show that, for all sufficiently large $n$,
$$D\big(P_{M_1 M_2 Y_2^n} \,\big\|\, P_{M_1 M_2} Q_{Y_2^n}\big) \leq \epsilon, \quad (13)$$
where the left-hand side is known as the effective secrecy, $\epsilon$ is arbitrarily small, and $P_{M_1 M_2 Y_2^n}$ is the joint distribution induced by the code. Further, $Q_{Y_2^n}$ is the distribution of $Y_2^n$ while no meaningful message is transmitted, and finally $P_{M_1 M_2}$ is the distribution of the message pair. We can further expand (13) as
$$D\big(P_{M_1 M_2 Y_2^n} \,\big\|\, P_{M_1 M_2} Q_{Y_2^n}\big) = I(M_1 M_2; Y_2^n) + D\big(P_{Y_2^n} \,\big\|\, Q_{Y_2^n}\big),$$
where we used that $P_{M_1 M_2 Y_2^n} = P_{M_1 M_2}\, P_{Y_2^n \mid M_1 M_2}$.
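The split of the effective secrecy divergence into a secrecy part and a stealth part can be checked numerically. The sketch below (a toy single-letter check with made-up numbers, not the paper's $n$-letter quantities) verifies the identity $D(P_{MZ} \| P_M Q_Z) = I(M;Z) + D(P_Z \| Q_Z)$:

```python
import math

def kl(p, q):
    """D(P||Q) in bits for pmfs given as equal-length sequences."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def effective_secrecy_terms(p_joint, q_z):
    """Return (D(P_MZ || P_M x Q_Z), I(M;Z), D(P_Z || Q_Z)) for a joint pmf
    p_joint[m][z] and a reference output pmf q_z."""
    p_m = [sum(row) for row in p_joint]
    p_z = [sum(row[z] for row in p_joint) for z in range(len(q_z))]
    lhs = sum(
        p * math.log2(p / (p_m[m] * q_z[z]))
        for m, row in enumerate(p_joint)
        for z, p in enumerate(row) if p > 0
    )
    mi = sum(
        p * math.log2(p / (p_m[m] * p_z[z]))
        for m, row in enumerate(p_joint)
        for z, p in enumerate(row) if p > 0
    )
    return lhs, mi, kl(p_z, q_z)
```

The identity makes explicit that driving the effective secrecy to zero simultaneously enforces strong secrecy (the mutual information term) and stealth (the divergence from the no-communication output statistics).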
Based on the chain rule for informational divergence we have
$$D\big(P_{M_1 M_2 Y_2^n} \,\big\|\, P_{M_1 M_2} Q_{Y_2^n}\big) = D\big(P_{M_1 M_2} \,\big\|\, P_{M_1 M_2}\big) + D\big(P_{Y_2^n \mid M_1 M_2} \,\big\|\, Q_{Y_2^n} \,\big|\, P_{M_1 M_2}\big),$$
where, according to the definition of $D(\cdot \| \cdot)$ in (1), the first term equals zero; thus we proceed with bounding the conditional divergence term. The following lemma provides an upper bound that is useful for the rest of the proof.
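The chain-rule step can be sanity-checked numerically. The following sketch (illustrative helper names, arbitrary example distributions) confirms that $D(P_{XY}\|Q_{XY}) = D(P_X\|Q_X) + \sum_x P_X(x)\, D(P_{Y|X=x}\|Q_{Y|X=x})$, so the first term indeed vanishes whenever the two $X$-marginals agree, as they do for the message distribution above:

```python
import math

def kl(p, q):
    """D(P||Q) in bits; assumes supp(P) is contained in supp(Q)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chain_rule_sides(p_x, p_y_x, q_x, q_y_x):
    """Both sides of the divergence chain rule for the joints
    p_x[i] * p_y_x[i][j] and q_x[i] * q_y_x[i][j]."""
    joint_p = [px * p for px, row in zip(p_x, p_y_x) for p in row]
    joint_q = [qx * q for qx, row in zip(q_x, q_y_x) for q in row]
    lhs = kl(joint_p, joint_q)
    rhs = kl(p_x, q_x) + sum(
        px * kl(rp, rq) for px, rp, rq in zip(p_x, p_y_x, q_y_x)
    )
    return lhs, rhs
```

With equal $X$-marginals the marginal term is $D(P_X\|P_X) = 0$, which is exactly why only the conditional divergence remains to be bounded.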
For probability distributions $P$ and $Q$ defined on a finite set $\mathcal{X}$, with $\operatorname{supp}(P) \subseteq \operatorname{supp}(Q)$, we have
$$D(P \,\|\, Q) \leq \log \max_{x \in \operatorname{supp}(P)} \frac{P(x)}{Q(x)}.$$
Taking the expectation with respect to the random codebook yields
for all sufficiently large $n$, where $\mathbb{1}\{\cdot\}$ is an indicator function and we apply Lemma 1 in (18). The last inequality comes from the mutual covering lemma, where $c > 0$ is a constant independent of $n$. With (16) and (19), therefore, we have for all sufficiently large $n$
Hence, in the following we focus on bounding the first term on the right-hand side of (20).
We take the expectation over the remaining codebook random variables; we obtain
where, in (21), we used Jensen's inequality applied to part of the expectation over the codewords. A detailed proof of this derivation is deferred to the journal version of this work.
Before proceeding with the bounds on the informational divergence, we introduce the following lemma.
Let be a probability distribution on . For and distribution
it holds for that
with and .
According to Lemma 2.6 in the cited reference we have, for all sequences under consideration,
where the first quantity denotes the empirical distribution, or type, of the sequence pair; the second is the empirical distribution generated by the single sequence; and the last is the joint distribution computed with respect to these empirical distributions.
Uniform continuity of the entropy (Lemma 2.7 in the cited reference) and the facts above yield
where, with . Moreover,
with and .
Inequality (27) can be seen as follows: in the first step we show the desired inclusion. To this end, we assume the contrary. Then either the first condition holds, and the inclusion follows automatically, or the second condition holds, and then the definition of typical sequences again implies the inclusion. Given this inclusion, applying once more the uniform continuity of the entropy along with the properties of typical sequences, we obtain (27).
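The uniform-continuity step can be illustrated numerically. The sketch below checks the standard form of the bound, $|H(P) - H(Q)| \leq \theta \log_2(|\mathcal{X}|/\theta)$ for $\theta = \|P - Q\|_1 \leq 1/2$ (we assume this is the variant of the continuity lemma being invoked; the exact constant is immaterial for the argument):

```python
import math

def entropy(p):
    """Shannon entropy H(P) in bits for a pmf given as a sequence."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def continuity_bound(p, q):
    """theta * log2(|X| / theta) with theta = ||P - Q||_1; valid for theta <= 1/2."""
    theta = sum(abs(pi - qi) for pi, qi in zip(p, q))
    return 0.0 if theta == 0 else theta * math.log2(len(p) / theta)
```

In the proof, this is what turns closeness of the joint type to the true distribution into closeness of the corresponding entropies, and hence of the information quantities.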
To proceed with the proof we need to show,
This inequality can be derived as follows,
The expression (22) can be split up as follows: (Recall that the expectation is taken over those codewords with