On the Sum Capacity of the Discrete Memoryless Interference Channel with One-Sided Weak Interference and Mixed Interference

\authorblockNFangfang Zhu and Biao Chen \authorblockASyracuse University
Department of EECS
Syracuse, NY 13244
Email: {fazhu, bichen}@syr.edu
Abstract

The sum capacity of a class of discrete memoryless interference channels is determined. This class of channels is defined analogously to the Gaussian Z-interference channel with weak interference; as a result, the sum capacity is achieved by letting the transceiver pair subject to the interference communicate at a rate such that its message can be decoded at the unintended receiver using single-user detection. Moreover, this class of discrete memoryless interference channels is equivalent in capacity region to certain discrete degraded interference channels. This equivalence allows the construction of a capacity outer-bound using the capacity region of associated degraded broadcast channels. The same technique is then used to determine the sum capacity of the discrete memoryless interference channel with mixed interference. The above results allow one to determine the sum capacities or capacity regions of several new discrete memoryless interference channels.

I Introduction

The interference channel (IC) models the situation where transmitters communicate with their intended receivers while generating interference to unintended receivers. Despite decades of intense research, the capacity region of the IC remains unknown except for a few special cases. These include interference channels with strong and very strong interference [1, 2, 3, 4, 5]; classes of deterministic and semi-deterministic ICs [6, 7]; and classes of discrete degraded ICs [8, 9].

Parallel capacity results exist for the discrete memoryless IC (DMIC) and the Gaussian IC (GIC). Carleial first obtained the capacity region of the GIC with very strong interference [1]. This result was subsequently extended by Sato [2] to DMICs with very strong interference. Note that the definition of the DMIC with very strong interference can actually be broadened to be more consistent with its Gaussian counterpart [10]. Sato [3] and Han and Kobayashi [4] independently established in 1981 the capacity region of the GIC with strong interference, where the capacity region is the same as that of a compound multiple access channel. In [3], Sato also conjectured the corresponding conditions for DMICs under strong interference, which was eventually proved by Costa and El Gamal [5] in 1987.

While the capacity region of the general GIC remains unknown, there has been recent progress in characterizing the sum capacity of certain GICs, including GICs with one-sided weak interference [11], noisy interference [12, 13, 14], and mixed interference [13]. This paper derives parallel sum capacity results for DMICs with one-sided weak and mixed interference. Our definitions of one-sided, weak, and mixed interference are motivated by properties of the corresponding Gaussian channels. Some of these definitions are intimately related to those introduced in [15], which studies the capacity region of the discrete memoryless Z channel.

The rest of the paper is organized as follows. Section II presents the channel model and relevant previous results. Section III defines DMICs with one-sided weak interference and derives their sum capacity. For ease of presentation, we refer to DMICs with one-sided interference as DMZICs (i.e., discrete memoryless Z-interference channels). The equivalence between the DMZIC with weak interference and the discrete memoryless degraded interference channel (DMDIC) is established, which allows one to construct a capacity outer-bound for the DMZIC using the capacity region of the associated degraded broadcast channel. Section IV defines DMICs with mixed interference and derives the sum capacity for this class of channels. Section V concludes the paper.

II Preliminaries

II-A Discrete Memoryless Interference Channels

A DMIC is specified by its input alphabets $\mathcal{X}_1$ and $\mathcal{X}_2$, output alphabets $\mathcal{Y}_1$ and $\mathcal{Y}_2$, and the channel transition matrices:

$p(y_1 | x_1, x_2)$,   (1)
$p(y_2 | x_1, x_2)$.   (2)

The DMIC is said to be memoryless if

$p(y_1^n, y_2^n | x_1^n, x_2^n) = \prod_{i=1}^{n} p(y_{1i}, y_{2i} | x_{1i}, x_{2i})$.   (3)

An $(M_1, M_2, n)$ code for a DMIC with independent information consists of two message sets $\mathcal{M}_1 = \{1, \dots, M_1\}$ and $\mathcal{M}_2 = \{1, \dots, M_2\}$ for senders 1 and 2 respectively, two encoding functions:

$f_1: \mathcal{M}_1 \rightarrow \mathcal{X}_1^n, \quad f_2: \mathcal{M}_2 \rightarrow \mathcal{X}_2^n,$

two decoding functions:

$g_1: \mathcal{Y}_1^n \rightarrow \mathcal{M}_1, \quad g_2: \mathcal{Y}_2^n \rightarrow \mathcal{M}_2,$

and the average probabilities of error:

$P_{e,i}^{(n)} = \Pr\{g_i(Y_i^n) \neq W_i\}, \quad i = 1, 2.$

A rate pair $(R_1, R_2)$ is said to be achievable for the DMIC if and only if there exists a sequence of $(2^{nR_1}, 2^{nR_2}, n)$ codes such that $P_{e,1}^{(n)}, P_{e,2}^{(n)} \rightarrow 0$ as $n \rightarrow \infty$. The capacity region of a DMIC is defined as the closure of the set of all achievable rate pairs.
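As a concrete illustration of the definitions above (with a hypothetical binary channel of our own choosing, not one from the paper), a finite DMIC can be represented by a joint transition array, from which the marginal transition matrices in (1) and (2) are obtained by summing out the other output:

```python
import numpy as np

# Hypothetical binary DMIC: p_joint[x1, x2, y1, y2] = p(y1, y2 | x1, x2).
# Y1 observes X1 XOR X2 through a BSC(0.1); Y2 observes X2 through a BSC(0.2).
p_joint = np.zeros((2, 2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        for y1 in range(2):
            for y2 in range(2):
                p1 = 0.9 if y1 == (x1 ^ x2) else 0.1
                p2 = 0.8 if y2 == x2 else 0.2
                p_joint[x1, x2, y1, y2] = p1 * p2

# Marginal transition matrices, as in (1) and (2).
p_y1 = p_joint.sum(axis=3)   # p(y1 | x1, x2), indexed [x1, x2, y1]
p_y2 = p_joint.sum(axis=2)   # p(y2 | x1, x2), indexed [x1, x2, y2]

# Every conditional distribution must sum to one.
assert np.allclose(p_joint.sum(axis=(2, 3)), 1.0)
print(p_y1[0, 1], p_y2[0, 1])
```

For this toy channel the second marginal does not depend on $x_1$, which anticipates the one-sided channels defined in Section III.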

II-B Existing Results for GICs

The received signals of a GIC in its standard form are

$Y_1 = X_1 + \sqrt{a}\, X_2 + Z_1,$   (4)
$Y_2 = \sqrt{b}\, X_1 + X_2 + Z_2,$   (5)

where $a$ and $b$ are the channel coefficients corresponding to the interference links, $X_i$ and $Y_i$ are the transmitted and received signals, the channel input sequence $X_{i1}, X_{i2}, \dots, X_{in}$ is subject to the power constraint $\sum_{t=1}^{n} E[X_{it}^2] \le n P_i$, $i = 1, 2$, and $Z_1$, $Z_2$ are Gaussian noises with zero mean and unit variance, independent of $X_1$, $X_2$.

Sason in [11] proved that the sum capacity for GICs with one-sided weak interference ($b = 0$ and $a \le 1$) is achieved by Gaussian inputs with the interfered receiver treating the interference as noise.

Motahari and Khandani in [13] established the sum capacity for GICs with mixed interference ($a \le 1$ and $b \ge 1$).
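For reference (this is our own sketch, not part of the paper's text), the two Gaussian sum-capacity expressions can be written explicitly in the standard form $Y_1 = X_1 + \sqrt{a}\,X_2 + Z_1$, $Y_2 = \sqrt{b}\,X_1 + X_2 + Z_2$ with unit-variance noise; the function names and parameterization below are our own:

```python
import numpy as np

def C(x):
    """Gaussian capacity function C(x) = (1/2) log2(1 + x)."""
    return 0.5 * np.log2(1.0 + x)

def zic_weak_sum_capacity(P1, P2, a):
    """Sum capacity of the Gaussian ZIC with one-sided weak interference
    (b = 0, 0 <= a <= 1), per Sason [11]: receiver 1 treats the interference
    from user 2 as noise, and receiver 2 is interference-free."""
    assert 0.0 <= a <= 1.0
    return C(P1 / (1.0 + a * P2)) + C(P2)

def mixed_sum_capacity(P1, P2, a, b):
    """Sum capacity of the GIC with mixed interference (a <= 1 <= b), per
    Motahari and Khandani [13]: the minimum of the treat-interference-as-noise
    sum rate and the multiple-access bound at receiver 2."""
    assert 0.0 <= a <= 1.0 <= b
    tin = C(P1 / (1.0 + a * P2)) + C(P2)
    mac = C(b * P1 + P2)
    return min(tin, mac)

print(zic_weak_sum_capacity(1.0, 1.0, 0.5), mixed_sum_capacity(1.0, 1.0, 0.5, 2.0))
```

For $a = 0$ the ZIC expression reduces to two interference-free point-to-point capacities, as expected.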

We attempt to extend these results to DMICs with appropriately defined one-sided weak interference and mixed interference.

II-C Useful Properties of Markov Chains

The following properties of Markov chains are useful throughout the paper:

  • Decomposition: $X \rightarrow Y \rightarrow (Z, W)$ implies $X \rightarrow Y \rightarrow Z$;

  • Weak Union: $X \rightarrow Y \rightarrow (Z, W)$ implies $X \rightarrow (Y, W) \rightarrow Z$;

  • Contraction: $X \rightarrow Y \rightarrow Z$ and $X \rightarrow (Y, Z) \rightarrow W$ imply $X \rightarrow Y \rightarrow (Z, W)$.

III The DMZIC with Weak Interference

III-A Discrete Memoryless Z-Interference Channel

Definition 1

For the DMIC defined in Section II-A, if

$p(y_2 | x_1, x_2) = p(y_2 | x_2)$   (6)

for all $x_1 \in \mathcal{X}_1$, $x_2 \in \mathcal{X}_2$, $y_2 \in \mathcal{Y}_2$, or equivalently,

$X_1 \rightarrow X_2 \rightarrow Y_2$   (7)

forms a Markov chain, this DMIC is said to have one-sided interference.

We refer to such a DMIC simply as a DMZIC. The definition is a natural extension of that of the Gaussian ZIC, in which $X_2$ causes interference at receiver 1. From the definition, it follows that $X_1$ and $Y_2$ are independent for all input distributions $p(x_1)p(x_2)$.

To define the DMZIC with weak interference, we first revisit a property of the Gaussian ZIC with weak interference. Similar to that established in [15], it is straightforward to show that a Gaussian ZIC with weak interference is equivalent in its capacity region to a degraded Gaussian ZIC satisfying the Markov chain

$X_2 \rightarrow (X_1, Y_2) \rightarrow Y_1.$   (8)

This is referred to in [15] as the degraded Gaussian Z channel of type-I. This motivates us to define the DMZIC with weak interference as follows.

Definition 2

A DMZIC is said to have weak interference if the channel transition probability factorizes as

$p(y_1 | x_1, x_2) = \sum_{y_2} p(y_2 | x_2)\, p'(y_1 | x_1, y_2)$   (9)

for some $p'(y_1 | x_1, y_2)$, or, equivalently, the channel is stochastically degraded.

In the absence of receiver cooperation, a stochastically degraded interference channel is equivalent in its capacity region to a physically degraded interference channel. As such, we will assume in the following that the channel is physically degraded, i.e., the DMZIC admits the Markov chain $X_2 \rightarrow (X_1, Y_2) \rightarrow Y_1$. As a consequence, the following inequality holds:

$I(X_2; Y_1 | X_1) \le I(X_2; Y_2)$   (10)

for all input distributions $p(x_1)p(x_2)$.
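To make Definition 2 concrete, the following sketch (the binary alphabets and component distributions are hypothetical, not from the paper) composes a channel $p(y_2|x_2)$ with a channel $p'(y_1|x_1,y_2)$ as in the weak-interference factorization, and verifies numerically that the composed channel is one-sided:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical components over binary alphabets.
p_y2_given_x2 = np.array([[0.9, 0.1],   # p(y2 | x2 = 0)
                          [0.2, 0.8]])  # p(y2 | x2 = 1)
# p'(y1 | x1, y2): a random stochastic matrix for each (x1, y2) pair.
p_y1_given_x1y2 = rng.dirichlet(np.ones(2), size=(2, 2))  # indexed [x1, y2, y1]

# Compose per Definition 2: p(y1, y2 | x1, x2) = p(y2 | x2) p'(y1 | x1, y2).
p_joint = np.zeros((2, 2, 2, 2))  # indexed [x1, x2, y1, y2]
for x1 in range(2):
    for x2 in range(2):
        for y1 in range(2):
            for y2 in range(2):
                p_joint[x1, x2, y1, y2] = (p_y2_given_x2[x2, y2]
                                           * p_y1_given_x1y2[x1, y2, y1])

# One-sided: p(y2 | x1, x2) must not depend on x1, cf. (6).
p_y2 = p_joint.sum(axis=2)          # marginalize out y1 -> [x1, x2, y2]
assert np.allclose(p_y2[0], p_y2[1])
# Sanity: each conditional distribution sums to one.
assert np.allclose(p_joint.sum(axis=(2, 3)), 1.0)
print("weak-interference factorization yields a one-sided channel")
```

Any channel built this way is, by construction, a DMZIC with weak interference in the sense of Definition 2.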

The channel transition probability becomes

The above definition of weak interference leads to the following sum capacity result.

Theorem 1

The sum capacity of a DMZIC with weak interference as defined above is

$C_{\text{sum}} = \max_{p(x_1)p(x_2)} \left\{ I(X_1; Y_1) + I(X_2; Y_2) \right\}.$   (11)
Proof:

This sum rate is achieved by two receivers decoding their own messages while treating any interference, if present, as noise.

For the converse,

where, for every $i \in \{1, \dots, n\}$, the successive steps are justified as follows: the first follows from Fano's inequality; the second from the chain rule and the definition of mutual information; the third because conditioning reduces entropy and because $Y_2^n$ is independent of all other random variables given $X_2^n$; the fourth from the memoryless property of the channel and the fact that $Y_{2i}$ is independent of all other random variables given $X_{2i}$; and the fifth from the degradedness Markov chain $X_{2i} \rightarrow (X_{1i}, Y_{2i}) \rightarrow Y_{1i}$, extended to the block variables as detailed next. The easiest way to verify the required block Markov chain is through an independence graph. Alternatively, we first note that, given $X_{1i}$ and $Y_{2i}$, the output $Y_{1i}$ is independent of all remaining random variables, which gives the single-letter Markov chain above. By the weak union property, the conditioning set can be enlarged to include the past outputs. Together with the Markov chain implied by the independence of $X_1^n$ and $X_2^n$, the contraction property merges the two chains; applying the weak union property and then the decomposition property yields the desired block Markov chain. Moreover, since $X_1^n$ and $X_2^n$ are independent, $I(X_{2i}; Y_{2i}) = I(X_{2i}; Y_{2i} | X_{1i})$, and hence the penultimate inequality follows from (10). Finally, introducing a time-sharing random variable $Q$, uniformly distributed over $\{1, \dots, n\}$ and independent of all other variables, yields the single-letter expression in (11), which completes the converse.

Remark: The Markov chain (8) is a sufficient, but not necessary, condition for the mutual information condition

$I(X_2; Y_1 | X_1) \le I(X_2; Y_2)$   (12)

to hold for all product input distributions on $\mathcal{X}_1 \times \mathcal{X}_2$. One can find examples in which the mutual information condition holds but the Markov chain is not valid. This is different from the Gaussian case: there, it can be shown that the weak-interference coefficient condition in a Gaussian ZIC is both necessary and sufficient for (12) to hold. It is not yet known whether condition (12) alone is sufficient for the sum capacity result (11) to hold for the DMZIC.

III-B Capacity Outer-Bound for the DMZIC with Weak Interference

For Gaussian ZICs with weak interference, Sato [2] obtained an outer-bound using the capacity region of a related Gaussian broadcast channel, constructed by exploiting the equivalence in capacity between a Gaussian ZIC with weak interference and a degraded GIC. The same technique can be used to obtain a capacity outer-bound for the DMZIC with weak interference, i.e., one satisfying the Markov chain $X_2 \rightarrow (X_1, Y_2) \rightarrow Y_1$. Specifically, for any such DMZIC with weak interference, one can find an equivalent (in capacity region) DMDIC whose capacity region is bounded by that of an associated degraded broadcast channel.

Theorem 2

For a DMZIC that satisfies the Markov chain $X_2 \rightarrow (X_1, Y_2) \rightarrow Y_1$, the capacity region is outer-bounded by

$\bigcup \left\{ (R_1, R_2) :\ R_1 \le I(U; Y_1),\ R_2 \le I(X_1, X_2; X_1, Y_2 \mid U) \right\},$

where $U \rightarrow (X_1, X_2) \rightarrow (Y_1, Y_2)$ forms a Markov chain and the cardinality of the auxiliary alphabet can be bounded in the usual way.

Proof:

Suppose that the DMZIC with weak interference has inputs $X_1$, $X_2$ and outputs $Y_1$, $Y_2$, respectively. Let us denote by $X_1'$, $X_2'$ and $Y_1'$, $Y_2'$ the inputs and outputs of another DMIC. Set $X_1' = X_1$, $X_2' = X_2$, and $Y_1' = Y_1$, but define $Y_2'$ to be a bijection of $X_1$ and $Y_2$. As the Markov chain $X_2 \rightarrow (X_1, Y_2) \rightarrow Y_1$ holds, the DMIC specified by the input pair $(X_1', X_2')$ and the output pair $(Y_1', Y_2')$ is indeed a DMDIC.

The proof that this DMDIC has the same capacity region as the specified DMZIC, and hence is outer-bounded by the associated broadcast channel, follows in exactly the same fashion as Costa's proof for the Gaussian case [16], and is hence omitted here.

Remark: The output $Y_2'$ need not be a bijective function of $X_1$ and $Y_2$; depending on the transition probability, other choices of $Y_2'$ can be constructed. However, the associated broadcast channels would have the same capacity region. This will become clear in the following example.

III-C Numerical Example

Example 1

Let the channel transition probability be given by the factorization $p(y_2 | x_2)\, p(y_1 | x_1, y_2)$, with the component distributions specified in Table I.

TABLE I: Channel Transition Probabilities

By Theorem 1, the sum capacity is obtained by evaluating (11) for this channel.

Moreover, one can construct $Y_2'$ as a bijection of $X_1$ and $Y_2$; the transition probability $p(y_2' | x_1, x_2)$ of the resulting DMDIC is then given in Table II.

TABLE II: Transition Probabilities of the Constructed DMDIC

Using Theorem 2, the capacity region of the DMZIC is outer-bounded by that of the associated degraded broadcast channel. Taking a different bijection function to construct $Y_2'$ leads to the same outer-bound. If the auxiliary variable is fixed appropriately, the boundary of the bound can be expressed in terms of the conditional entropy function defined by Witsenhausen and Wyner [17]. Fig. 1 depicts the resulting outer-bound; it significantly improves upon the straightforward outer-bound shown for comparison.

Fig. 1: Comparison of the outer-bounds.

IV The DMIC with Mixed Interference

Definition 3

A DMIC is said to have mixed interference if it satisfies the Markov chain

$X_2 \rightarrow (X_1, Y_2) \rightarrow Y_1$   (13)

and

$I(X_1; Y_1 | X_2) \le I(X_1; Y_2 | X_2)$   (14)

for all possible product distributions on $\mathcal{X}_1 \times \mathcal{X}_2$.

This definition is motivated by the GIC with mixed interference, which can be shown to be equivalent in capacity region to a degraded GIC satisfying (13), constructed by adding an appropriately scaled independent zero-mean Gaussian noise to form the degraded output. The sum capacity for the GIC with mixed interference was established in [13]. We obtain a parallel result for the DMIC with mixed interference as defined above.

Theorem 3

The sum capacity of the DMIC with mixed interference satisfying the two conditions (13) and (14) is

$C_{\text{sum}} = \max_{p(x_1)p(x_2)} \min\left\{ I(X_1, X_2; Y_2),\ I(X_1; Y_1) + I(X_2; Y_2 | X_1) \right\}.$   (15)
Proof:

In order to achieve this sum rate, user 1 transmits its message at a rate such that both receivers can decode it by treating the signal from user 2 as noise; user 2 then transmits at the interference-free rate, since receiver 2 is able to subtract the interference from user 1.
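Under this scheme, and assuming the standard single-letter rates for decoding with interference treated as noise (time sharing omitted for brevity), the achievable rates combine into the expression of Theorem 3:

```latex
R_1 \le \min\{\, I(X_1;Y_1),\ I(X_1;Y_2) \,\},
\qquad
R_2 \le I(X_2;Y_2 \mid X_1).
% Summing, and using the chain rule
% I(X_1;Y_2) + I(X_2;Y_2 | X_1) = I(X_1,X_2;Y_2):
R_1 + R_2 \le \min\bigl\{\, I(X_1;Y_1) + I(X_2;Y_2 \mid X_1),\ I(X_1,X_2;Y_2) \,\bigr\}.
```

The chain-rule identity holds with equality here because the inputs are independent, so the two constraints collapse exactly into the minimum of the two sum-rate terms.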

For the converse, we prove the following two sum rate bounds separately:

$R_1 + R_2 \le I(X_1, X_2; Y_2),$   (16)
$R_1 + R_2 \le I(X_1; Y_1) + I(X_2; Y_2 | X_1).$   (17)

For (16), the derivation follows the same steps as in Costa and El Gamal's result [5]. For (17), we use techniques similar to those used to establish the sum capacity of the DMZIC with weak interference in Section III. First, notice that (13) implies

$I(X_2; Y_1 | X_1, U) \le I(X_2; Y_2 | X_1, U)$   (18)

for any $U$ whose joint distribution with $(X_1, X_2, Y_1, Y_2)$ is of the form

$p(u)\, p(x_1 | u)\, p(x_2 | u)\, p(y_1, y_2 | x_1, x_2).$   (19)

Then, (17) follows from a chain of inequalities whose successive steps hold because of the independence of $X_1^n$ and $X_2^n$; the fact that conditioning reduces entropy and the chain rule; (18); and the memoryless property of the channel together with (19). Finally, (16) and (17) yield the two terms in the minimum of (15), respectively.

We give the following example where the obtained sum capacity helps determine the capacity region of a DMIC.

Example 2

Consider a deterministic channel with finite input and output alphabets. Notice that this channel does not satisfy the condition of the deterministic interference channel in [6]. Obviously, the Markov chain (13) holds. Moreover, condition (14) can be verified directly for all possible input product distributions on $\mathcal{X}_1 \times \mathcal{X}_2$. Therefore, this is a DMIC with mixed interference, and its sum capacity follows by applying Theorem 3. Given that the two single-user rate points are trivially achievable, the above sum capacity determines the entire capacity region for this DMIC.

V Conclusion

In this paper, we derived the sum capacity for DMZICs with weak interference, where weak interference is defined using a Markov condition. Similar techniques were then applied to derive the sum capacity for the DMIC with mixed interference. Both results are analogous to the sum capacity results for the corresponding Gaussian channels, both in the capacity expressions and in the encoding schemes that achieve them.

The weak interference condition is defined using a Markov chain, as opposed to using the mutual information inequality. While the Markov-chain definition appears to be somewhat restrictive, it is not known whether the definition using the mutual information condition would lead to the same sum capacity result.

References

  • [1] A. B. Carleial, “A case where interference does not reduce capacity,” IEEE Trans. Inf. Theory, vol. 21, no. 5, pp. 569-570, Sep. 1975.
  • [2] H. Sato, “On the capacity region of a discrete two-user channel for strong interference,” IEEE Trans. Inf. Theory, vol. 24, no. 3, pp. 377-379, May 1978.
  • [3] H. Sato, “The capacity of the Gaussian interference channel under strong interference,” IEEE Trans. Inf. Theory, vol. 27, pp. 786-788, Nov. 1981.
  • [4] T. S. Han and K. Kobayashi, “A new achievable rate region for the interference channel,” IEEE Trans. Inf. Theory, vol. 27, pp. 49-60, Jan. 1981.
  • [5] M. H. M. Costa and A. El Gamal, “The capacity region of the discrete memoryless interference channel with strong interference,” IEEE Trans. Inf. Theory, vol. 33, pp. 710-711, Sep. 1987.
  • [6] A. El Gamal and M. H. M. Costa, “The capacity region of a class of deterministic interference channels,” IEEE Trans. Inf. Theory, vol. 28, no. 2, pp. 343-346, Mar. 1982.
  • [7] H. F. Chong and M. Motani, “The capacity region of a class of semideterministic interference channels,” IEEE Trans. Inf. Theory, vol. 55, no.2 ,pp. 598-603, Feb. 2009.
  • [8] R. Benzel, “The capacity region of a class of discrete additive degraded interference channels,” IEEE Trans. Inf. Theory, vol. 25, no. 2, pp. 228-231, Mar. 1979.
  • [9] N. Liu and S. Ulukus, “The capacity region of a class of discrete degraded interference channels,” IEEE Trans. Inf. Theory, vol. 54, no. 9, pp. 4372-4378, Sep. 2008.
  • [10] J. Xu, H. Chen and B. Chen, “New observation on interference channels under strong/very strong interference,” in Proc. IEEE Global Communications Conference (Globecom’2010), Miami, FL, December 2010.
  • [11] I. Sason, “On achievable rate regions for the Gaussian interference channels,” IEEE Trans. Inf. Theory, vol. 50, no. 6, pp. 1345-1356, Jun. 2004.
  • [12] X. Shang, G. Kramer and B. Chen, “A new outer bound and the noisy-interference sum-rate capacity for Gaussian interference channels,” IEEE Trans. Inf. Theory, vol. 55, no. 2, pp. 689-699, Feb. 2009.
  • [13] A. S. Motahari and A. K. Khandani, “Capacity bounds for the Gaussian interference channel,” IEEE Trans. Inf. Theory, vol. 55, no. 2, pp. 620-643, Feb. 2009.
  • [14] V. S. Annapureddy and V. V. Veeravalli, “Gaussian interference networks: sum capacity in the low-interference regime and new outer bounds on the capacity region,” IEEE Trans. Inf. Theory, vol. 55, no. 7, pp. 3032-3050, Jul. 2009.
  • [15] H. F. Chong, M. Motani, and H. K. Garg, “Capacity theorems for the “Z” channel,” IEEE Trans. Inf. Theory, vol. 53, no. 4, pp. 1348-1365, Apr. 2007.
  • [16] M. H. M. Costa, “On the Gaussian interference channel,” IEEE Trans. Inf. Theory, vol. IT-31, no. 5, pp. 607-615, Sep. 1985.
  • [17] H. S. Witsenhausen and A. D. Wyner, “A conditional entropy bound for a pair of discrete random variables,” IEEE Trans. Inf. Theory, vol. 21, no. 5, pp. 493-501, Sep. 1975.