On the Sum Capacity of the Discrete Memoryless Interference Channel with One-Sided Weak Interference and Mixed Interference
Abstract
The sum capacity of a class of discrete memoryless interference channels is determined. This class of channels is defined analogously to the Gaussian Z-interference channel with weak interference; as a result, the sum capacity is achieved by letting the transceiver pair subject to the interference communicate at a rate such that its message can be decoded at the unintended receiver using single-user detection. Moreover, this class of discrete memoryless interference channels is equivalent in capacity region to certain discrete degraded interference channels. This allows the construction of a capacity outer bound using the capacity region of associated degraded broadcast channels. The same technique is then used to determine the sum capacity of the discrete memoryless interference channel with mixed interference. The above results allow one to determine sum capacities or capacity regions of several new discrete memoryless interference channels.
I Introduction
The interference channel (IC) models the situation where transmitters communicate with their intended receivers while generating interference to unintended receivers. Despite decades of intense research, the capacity region of the IC remains unknown except for a few special cases. These include interference channels with strong and very strong interference [1, 2, 3, 4, 5]; classes of deterministic and semi-deterministic ICs [6, 7]; and classes of discrete degraded ICs [8, 9].
Parallel capacity results exist for the discrete memoryless IC (DMIC) and the Gaussian IC (GIC). Carleial first obtained the capacity region of the GIC with very strong interference [1]. This result was subsequently extended by Sato [2] to DMICs with very strong interference. Note that the definition of the DMIC with very strong interference can actually be broadened to be more consistent with its Gaussian counterpart [10]. Sato [3] and Han and Kobayashi [4] independently established in 1981 the capacity region of the GIC with strong interference, which coincides with that of a compound multiple access channel. In [3], Sato also conjectured the corresponding conditions for DMICs with strong interference, which were eventually proved by Costa and El Gamal [5] in 1987.
While the capacity region of the general GIC remains unknown, there has been recent progress in characterizing the sum capacity of certain GICs, including GICs with one-sided weak interference [11], noisy interference [12, 13, 14], and mixed interference [13]. This paper derives parallel sum capacity results for DMICs with one-sided weak and mixed interference. Our definitions of one-sided, weak, and mixed interference are motivated by properties of the corresponding Gaussian channels. Some of these definitions are intimately related to those introduced in [15], which studies the capacity region of the discrete memoryless Z channel.
The rest of the paper is organized as follows. Section II presents the channel model and relevant previous results. Section III defines DMICs with one-sided weak interference and derives their sum capacities. We refer to those DMICs with one-sided interference as DMZICs (i.e., discrete memoryless Z-interference channels) for ease of presentation. The equivalence between the DMZIC with weak interference and the discrete memoryless degraded interference channel (DMDIC) is established, which allows one to construct a capacity outer bound for the DMZIC using the capacity region of the associated degraded broadcast channel. Section IV defines DMICs with mixed interference and derives the sum capacity for this class of channels. Section V concludes this paper.
II Preliminaries
II-A Discrete Memoryless Interference Channels
A DMIC is specified by its input alphabets $\mathcal{X}_1$ and $\mathcal{X}_2$, output alphabets $\mathcal{Y}_1$ and $\mathcal{Y}_2$, and the channel transition matrices:
(1) $p(y_1|x_1,x_2)$
(2) $p(y_2|x_1,x_2)$
The DMIC is said to be memoryless if
(3) $p(y_1^n, y_2^n | x_1^n, x_2^n) = \prod_{i=1}^{n} p(y_{1i}, y_{2i} | x_{1i}, x_{2i}).$
An $(n, 2^{nR_1}, 2^{nR_2})$ code for a DMIC with independent information consists of two message sets $\mathcal{M}_1 = \{1, \ldots, 2^{nR_1}\}$ and $\mathcal{M}_2 = \{1, \ldots, 2^{nR_2}\}$ for senders 1 and 2, respectively, two encoding functions:
$f_1: \mathcal{M}_1 \to \mathcal{X}_1^n, \quad f_2: \mathcal{M}_2 \to \mathcal{X}_2^n,$
two decoding functions:
$g_1: \mathcal{Y}_1^n \to \mathcal{M}_1, \quad g_2: \mathcal{Y}_2^n \to \mathcal{M}_2,$
and the average probabilities of error:
$P_{e,i}^{(n)} = \Pr\{g_i(Y_i^n) \neq M_i\}, \quad i = 1, 2.$
A rate pair $(R_1, R_2)$ is said to be achievable for the DMIC if and only if there exists a sequence of $(n, 2^{nR_1}, 2^{nR_2})$ codes such that $P_{e,1}^{(n)}, P_{e,2}^{(n)} \to 0$ as $n \to \infty$. The capacity region of a DMIC is defined as the closure of the set of all achievable rate pairs.
II-B Existing Results for GICs
The received signals of a GIC in its standard form are
(4) $Y_1 = X_1 + aX_2 + Z_1,$
(5) $Y_2 = bX_1 + X_2 + Z_2,$
where $a$ and $b$ are the channel coefficients corresponding to the interference links, $X_i$ and $Y_i$, $i = 1, 2$, are the transmitted and received signals, the channel input sequences are subject to the power constraints $\sum_{j=1}^{n} E[X_{ij}^2] \le nP_i$, and $Z_1$ and $Z_2$ are Gaussian noises with zero mean and unit variance, independent of $X_1$ and $X_2$.
Sason in [11] proved that the sum capacity for GICs with one-sided weak interference ($a = 0$ and $0 < b < 1$) is $C(P_1) + C\left(P_2/(1 + b^2 P_1)\right)$, where $C(x) = \frac{1}{2}\log_2(1+x)$.
Motahari and Khandani in [13] established that the sum capacity for GICs with mixed interference ($a \le 1$ and $b \ge 1$) is $C(P_2) + \min\left\{ C\left(P_1/(1 + a^2 P_2)\right),\; C\left(b^2 P_1/(1 + P_2)\right) \right\}$.
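As a quick numerical sketch, the two Gaussian sum-capacity expressions above can be evaluated directly. The snippet below assumes the standard form (4)-(5), the convention that receiver 1 is interference-free in the one-sided case, and purely illustrative (hypothetical) powers and link gains:

```python
import math

def C(x):
    """Gaussian capacity function C(x) = 0.5*log2(1+x), in bits per channel use."""
    return 0.5 * math.log2(1 + x)

P1, P2 = 4.0, 6.0          # illustrative power constraints

# One-sided weak interference (a = 0, 0 < b < 1): receiver 1 is clean,
# receiver 2 treats the cross link from user 1 as noise.
b = 0.5
sum_weak = C(P1) + C(P2 / (1 + b**2 * P1))

# Mixed interference (a <= 1 <= b): user 1's message is decoded at both
# receivers with the other signal treated as noise; user 2 is interference-free
# at receiver 2 after its interference is subtracted.
a, b = 0.5, 1.5
sum_mixed = C(P2) + min(C(P1 / (1 + a**2 * P2)), C(b**2 * P1 / (1 + P2)))

print(f"one-sided weak sum capacity: {sum_weak:.3f} bits/use")
print(f"mixed-interference sum capacity: {sum_mixed:.3f} bits/use")
```

The numbers themselves carry no significance; the sketch only makes the structure of the two expressions concrete.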
We attempt to extend these results to DMICs with appropriately defined onesided weak interference and mixed interference.
II-C Useful Properties of Markov Chains
The following properties of Markov chains are useful throughout the paper:

Decomposition: $X \to Y \to (Z, W)$ implies $X \to Y \to Z$ and $X \to Y \to W$;

Weak Union: $X \to Y \to (Z, W)$ implies $X \to (Y, W) \to Z$ and $X \to (Y, Z) \to W$;

Contraction: $X \to Y \to Z$ and $X \to (Y, Z) \to W$ imply $X \to Y \to (Z, W)$.
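These properties can also be checked numerically. A minimal sketch, assuming binary alphabets and a randomly generated joint distribution constructed to satisfy the premise chain $X \to Y \to (Z, W)$, verifies the premise and the chains given by decomposition and weak union (contraction can be checked in the same fashion):

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Joint pmf p(x, y, z, w) built to satisfy X -> Y -> (Z, W):
# p(x, y, z, w) = p(x) p(y|x) p(z, w|y), with binary alphabets throughout.
px = rng.dirichlet([1, 1])
py_x = rng.dirichlet([1, 1], size=2)          # indexed by x
pzw_y = rng.dirichlet([1, 1, 1, 1], size=2)   # indexed by y; columns are (z, w)

p = np.zeros((2, 2, 2, 2))
for x, y, z, w in itertools.product(range(2), repeat=4):
    p[x, y, z, w] = px[x] * py_x[x, y] * pzw_y[y, 2 * z + w]

def cond_mi(a_ax, b_ax, c_ax):
    """I(A; B | C) in bits; A, B, C are tuples of axes of the joint pmf p."""
    def marg(keep):
        keep = tuple(sorted(keep))
        return p.sum(axis=tuple(ax for ax in range(4) if ax not in keep)), keep
    m_abc, k_abc = marg(a_ax + b_ax + c_ax)
    m_ac, k_ac = marg(a_ax + c_ax)
    m_bc, k_bc = marg(b_ax + c_ax)
    m_c, k_c = marg(c_ax)
    val = 0.0
    for idx in itertools.product(range(2), repeat=4):
        pr = p[idx]
        if pr == 0:
            continue
        pick = lambda m, k: float(m[tuple(idx[ax] for ax in k)])
        val += pr * np.log2(pick(m_abc, k_abc) * pick(m_c, k_c)
                            / (pick(m_ac, k_ac) * pick(m_bc, k_bc)))
    return val

premise = cond_mi((0,), (2, 3), (1,))     # I(X; Z,W | Y) = 0 by construction
decomp = cond_mi((0,), (2,), (1,))        # Decomposition: I(X; Z | Y) = 0
weak_union = cond_mi((0,), (2,), (1, 3))  # Weak union:    I(X; Z | Y,W) = 0
print(premise, decomp, weak_union)
```

A Markov chain $A \to B \to C$ is equivalent to the conditional mutual information $I(A; C | B)$ vanishing, which is what the three computed quantities confirm.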
III The DMZIC with Weak Interference
III-A Discrete Memoryless Z-Interference Channel
Definition 1
For the DMIC defined in Section II-A, if
(6) $p(y_1|x_1, x_2) = p(y_1|x_1)$
for all $x_1 \in \mathcal{X}_1$, $x_2 \in \mathcal{X}_2$, and $y_1 \in \mathcal{Y}_1$, or, equivalently,
(7) $X_2 \to X_1 \to Y_1$
forms a Markov chain, this DMIC is said to have one-sided interference.
We refer to such a DMIC simply as a DMZIC. The definition is a natural extension of that of the Gaussian ZIC, in which $X_1$ causes interference on $Y_2$. From the definition, it follows that $Y_1$ and $X_2$ are independent for any input distribution $p(x_1)p(x_2)$.
To define the DMZIC with weak interference, we first revisit some properties of the Gaussian ZIC with weak interference. Similar to what is established in [15], it is straightforward to show that a Gaussian ZIC with weak interference is equivalent in its capacity region to a degraded Gaussian ZIC satisfying the Markov chain
(8) $X_1 \to (X_2, Y_1) \to Y_2.$
This is referred to in [15] as the degraded Gaussian Z channel of type I. This motivates us to define the DMZIC with weak interference as follows.
Definition 2
A DMZIC is said to have weak interference if the channel transition probability factorizes as
(9) $p(y_2|x_1, x_2) = \sum_{y_1'} p(y_1'|x_1)\, p'(y_2|y_1', x_2)$
for some $p'(y_2|y_1', x_2)$, or, equivalently, if the channel is stochastically degraded.
In the absence of receiver cooperation, a stochastically degraded interference channel is equivalent in its capacity region to a physically degraded interference channel. As such, we will assume in the following that the channel is physically degraded, i.e., that the DMZIC admits the Markov chain $X_1 \to (X_2, Y_1) \to Y_2$. As a consequence, the following inequality holds:
(10) $I(X_1; Y_1) \ge I(X_1; Y_2|X_2)$
for all input distributions $p(x_1)p(x_2)$.
The channel transition probability then becomes $p(y_1, y_2|x_1, x_2) = p(y_1|x_1)\, p(y_2|y_1, x_2)$.
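As a minimal numerical sketch, one can build a hypothetical binary DMZIC directly from the factorization in Definition 2 (the transition matrices below are arbitrary, illustrative choices) and verify that the data-processing consequence (10) holds for sampled product inputs:

```python
import numpy as np

rng = np.random.default_rng(1)

def mi(px, pygx):
    """I(X; Y) in bits for input pmf px and channel matrix pygx[x, y]."""
    pxy = px[:, None] * pygx
    py = pxy.sum(axis=0)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py[None, :])[mask])).sum())

# Hypothetical binary DMZIC with weak interference, built from the
# factorization p(y2|x1,x2) = sum_{y1'} p(y1'|x1) p'(y2|y1',x2).
p_y1_x1 = np.array([[0.9, 0.1],
                    [0.1, 0.9]])                  # BSC(0.1) from X1 to Y1
p_y2_y1x2 = np.array([[[0.8, 0.2], [0.3, 0.7]],
                      [[0.6, 0.4], [0.1, 0.9]]])  # p'(y2|y1',x2); axes: y1', x2, y2
p_y2_x1x2 = np.einsum('ab,bcd->acd', p_y1_x1, p_y2_y1x2)  # induced p(y2|x1,x2)

# Degradedness implies I(X1; Y1) >= I(X1; Y2 | X2) for every product input.
gaps = []
for _ in range(100):
    p1, p2 = rng.dirichlet([1, 1]), rng.dirichlet([1, 1])
    lhs = mi(p1, p_y1_x1)                                               # I(X1; Y1)
    rhs = sum(p2[x2] * mi(p1, p_y2_x1x2[:, x2, :]) for x2 in range(2))  # I(X1; Y2 | X2)
    gaps.append(lhs - rhs)
print(f"min gap I(X1;Y1) - I(X1;Y2|X2) over samples: {min(gaps):.6f}")
```

The non-negative gap reflects exactly the inequality (10), which is guaranteed here because the channel was constructed to be degraded.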
The above definition of weak interference leads to the following sum capacity result.
Theorem 1
The sum capacity of a DMZIC with weak interference as defined above is
(11) $C_{\mathrm{sum}} = \max_{p(x_1)p(x_2)} \left[ I(X_1; Y_1) + I(X_2; Y_2) \right].$
This sum rate is achieved by two receivers decoding their own messages while treating any interference, if present, as noise.
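The achievable expression in (11) can be evaluated numerically for a small channel. The sketch below uses a hypothetical binary DMZIC built via the factorization of Definition 2 (so that Theorem 1 applies) and performs a brute-force grid search over product input distributions:

```python
import numpy as np

def mi(px, pygx):
    """I(X; Y) in bits for input pmf px and channel matrix pygx[x, y]."""
    pxy = px[:, None] * pygx
    py = pxy.sum(axis=0)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py[None, :])[mask])).sum())

# Hypothetical binary DMZIC with weak interference (illustrative numbers).
p_y1_x1 = np.array([[0.9, 0.1], [0.1, 0.9]])
p_y2_y1x2 = np.array([[[0.8, 0.2], [0.3, 0.7]],
                      [[0.6, 0.4], [0.1, 0.9]]])
p_y2_x1x2 = np.einsum('ab,bcd->acd', p_y1_x1, p_y2_y1x2)  # axes: x1, x2, y2

best = 0.0
grid = np.linspace(0, 1, 101)
for q1 in grid:                  # q1 = p(X1 = 0)
    p1 = np.array([q1, 1 - q1])
    # Receiver 2 treats X1 as noise: average the channel over p(x1).
    p_y2_x2 = np.einsum('a,abd->bd', p1, p_y2_x1x2)
    r1 = mi(p1, p_y1_x1)
    for q2 in grid:              # q2 = p(X2 = 0)
        p2 = np.array([q2, 1 - q2])
        best = max(best, r1 + mi(p2, p_y2_x2))
print(f"grid-search evaluation of (11): {best:.4f} bits per channel use")
```

Note that $I(X_1; Y_1)$ does not depend on $p(x_2)$, so it is computed once per outer iteration; only $I(X_2; Y_2)$, in which user 1's signal acts as noise, couples the two input distributions.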
For the converse, starting from Fano's inequality, we have
$n(R_1 + R_2 - \epsilon_n) \le I(X_1^n; Y_1^n) + I(X_2^n; Y_2^n) \le \sum_{i=1}^{n} \left[ I(X_{1i}; Y_{1i}) + I(X_{2i}; Y_{2i}) \right],$
where $\epsilon_n \to 0$ as $n \to \infty$. The second inequality is obtained through the following steps: the chain rule and the definition of mutual information expand the two mutual information terms; conditioning reduces entropy, and $Y_{1i}$ is independent of any other random variables given $X_{1i}$; by the memoryless property of the channel, $Y_{2i}$ is independent of any other random variables given $X_{2i}$ and $Y_{1i}$, which, via the weak union property, supplies the required Markov chains. The key remaining Markov chain is most easily verified using an independence graph. Alternatively, one first notes the chain that holds because, given $X_{2i}$ and $Y_{1i}$, $Y_{2i}$ is independent of all other random variables; applies the weak union property; combines the result, through the contraction property, with the chain implied by the independence between $X_1^n$ and $X_2^n$; and then applies the weak union and decomposition properties to arrive at the desired chain. Since $X_1^n$ and $X_2^n$ are independent, the resulting per-letter terms can be compared using (10), and the final step follows from the Markov chain $X_{2i} \to X_{1i} \to Y_{1i}$. At last, by introducing a time-sharing random variable $Q$, one obtains (11).
Remark: The Markov chain (8) is a sufficient, but not necessary, condition for the mutual information condition
(12) $I(X_1; Y_2|X_2) \le I(X_1; Y_1)$
to hold for all product input distributions on $\mathcal{X}_1 \times \mathcal{X}_2$. One can find examples in which the mutual information condition holds but the Markov chain is not valid. This differs from the Gaussian case: it can be shown that an interference link gain not exceeding one in a Gaussian ZIC is a sufficient and necessary condition for (12) to hold. It is yet unknown whether condition (12) alone is sufficient for the sum capacity result (11) to hold for the DMZIC with weak interference.
III-B Capacity Outer Bound for the DMZIC with Weak Interference
For Gaussian ZICs with weak interference, Sato [2] obtained an outer bound using the capacity region of a related Gaussian broadcast channel, constructed by exploiting the equivalence in capacity between a Gaussian ZIC with weak interference and a degraded Gaussian ZIC. The same technique can be used to obtain a capacity outer bound for the DMZIC with weak interference, i.e., one that satisfies the Markov chain $X_1 \to (X_2, Y_1) \to Y_2$. Specifically, for any such DMZIC with weak interference, one can find an equivalent (in capacity region) DMDIC whose capacity region is bounded by that of an associated degraded broadcast channel.
Theorem 2
For a DMZIC that satisfies the Markov chain $X_1 \to (X_2, Y_1) \to Y_2$, the capacity region is outer-bounded by
$\mathcal{R} = \bigcup \left\{ (R_1, R_2) : R_1 \le I(X_1, X_2; \tilde{Y}_1 | U),\ R_2 \le I(U; Y_2) \right\},$
where $U \to (X_1, X_2) \to \tilde{Y}_1 \to Y_2$ forms a Markov chain and $\tilde{Y}_1$ is a bijection of $(Y_1, X_2)$.
Suppose that the DMZIC with weak interference has inputs $X_1$, $X_2$ and outputs $Y_1$, $Y_2$, respectively. Let us denote by $\tilde{X}_1$, $\tilde{X}_2$ and $\tilde{Y}_1$, $\tilde{Y}_2$ the inputs and outputs of another DMIC. Set $\tilde{X}_1 = X_1$, $\tilde{X}_2 = X_2$, and $\tilde{Y}_2 = Y_2$, but define $\tilde{Y}_1$ to be a bijection of $Y_1$ and $X_2$, denoted as $\tilde{Y}_1 = f(Y_1, X_2)$. As the Markov chain $X_1 \to (X_2, Y_1) \to Y_2$ holds, the DMIC specified by the input pair $(\tilde{X}_1, \tilde{X}_2)$ and the output pair $(\tilde{Y}_1, \tilde{Y}_2)$ is indeed a DMDIC.
The proof that this DMDIC has the same capacity region as the specified DMZIC, and hence is outer-bounded by the associated broadcast channel, follows in exactly the same fashion as Costa's proof for the Gaussian case [16] and is therefore omitted here.
Remark: The output $\tilde{Y}_1$ need not necessarily be a bijective function of $Y_1$ and $X_2$; instead, depending on the transition probability $p(y_2|y_1, x_2)$, other choices of $\tilde{Y}_1$ can be constructed. However, the associated broadcast channels would have the same capacity region. This will become clear in the following example.
III-C Numerical Example
Example 1
Consider a DMZIC with weak interference specified by its channel transition probabilities $p(y_1|x_1)$ and $p(y_2|x_1, x_2)$.
By Theorem 1, the sum capacity is obtained by evaluating (11) for this channel. Moreover, one can construct a conditional probability $p'(y_2|y_1, x_2)$ realizing the factorization (9); the resulting $p'(y_2|y_1, x_2)$ is given in Table II.
Using Theorem 2, the capacity region of the DMZIC is outer-bounded by that of the associated degraded broadcast channel (DMDBC). Taking the bijection function to construct $\tilde{Y}_1$ leads to the same outer bound. Fixing the input $X_2$, the boundary of the bound can be expressed in terms of a function defined by Witsenhausen and Wyner [17]. Fig. 1 depicts the new outer bound, which significantly improves upon the previously available bound.
IV The DMIC with Mixed Interference
Definition 3
A DMIC is said to have mixed interference if it satisfies the Markov chain
(13) $X_2 \to (X_1, Y_2) \to Y_1$
and
(14) $I(X_1; Y_1|X_2) \le I(X_1; Y_2|X_2)$
for all possible product distributions on $\mathcal{X}_1 \times \mathcal{X}_2$.
This definition is motivated by the GIC with mixed interference, which can be shown to be equivalent in capacity region to a degraded GIC satisfying (13) by setting $Y_1 = aY_2 + (1 - ab)X_1 + W$, where $W \sim \mathcal{N}(0, 1 - a^2)$ is independent of all other variables and $\mathcal{N}(\mu, \sigma^2)$ denotes the normal distribution with mean $\mu$ and variance $\sigma^2$. The sum capacity for the GIC with mixed interference was established in [13]. We obtain a parallel result for the DMIC with mixed interference as defined above.
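Both defining conditions can be checked numerically for a candidate channel. The sketch below uses a hypothetical binary DMIC in which $Y_2$ observes both inputs through a reliable channel and $Y_1$ is $Y_2$ passed through a BSC(0.2); by construction the chain $X_2 \to (X_1, Y_2) \to Y_1$ of (13) holds (in fact $X_2 \to Y_2 \to Y_1$ given $X_1$), so both conditions are guaranteed and the checks pass:

```python
import numpy as np

rng = np.random.default_rng(2)

def mi(px, pygx):
    """I(X; Y) in bits for input pmf px and channel matrix pygx[x, y]."""
    pxy = px[:, None] * pygx
    py = pxy.sum(axis=0)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py[None, :])[mask])).sum())

def cmi_given_second(pa, pb, p_y_ab):
    """I(A; Y | B) for independent inputs pa, pb and channel p_y_ab[a, b, y]."""
    return sum(pb[b] * mi(pa, p_y_ab[:, b, :]) for b in range(len(pb)))

# Hypothetical mixed-interference DMIC (illustrative numbers).
p_y2_x1x2 = np.array([[[0.95, 0.05], [0.10, 0.90]],
                      [[0.20, 0.80], [0.85, 0.15]]])   # axes: x1, x2, y2
bsc = np.array([[0.8, 0.2], [0.2, 0.8]])               # p(y1 | y2)
p_y1_x1x2 = np.einsum('abc,cd->abd', p_y2_x1x2, bsc)   # axes: x1, x2, y1

for _ in range(100):
    p1, p2 = rng.dirichlet([1, 1]), rng.dirichlet([1, 1])
    # Condition (14): I(X1; Y1 | X2) <= I(X1; Y2 | X2).
    assert cmi_given_second(p1, p2, p_y1_x1x2) <= \
           cmi_given_second(p1, p2, p_y2_x1x2) + 1e-12
    # Data-processing consequence of the chain (13): I(X2; Y1 | X1) <= I(X2; Y2 | X1).
    assert (cmi_given_second(p2, p1, np.swapaxes(p_y1_x1x2, 0, 1))
            <= cmi_given_second(p2, p1, np.swapaxes(p_y2_x1x2, 0, 1)) + 1e-12)
print("both mixed-interference conditions held on all sampled product inputs")
```

For a general candidate channel, sampling product inputs in this way can only refute, never prove, condition (14); here the degraded construction makes both inequalities hold identically.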
Theorem 3
The sum capacity of a DMIC with mixed interference as defined above is
(15) $C_{\mathrm{sum}} = \max_{p(x_1)p(x_2)} \left\{ I(X_2; Y_2|X_1) + \min\left[ I(X_1; Y_1),\, I(X_1; Y_2) \right] \right\}.$
In order to achieve this sum rate, user 1 transmits its message at a rate such that both receivers can decode it by treating the signal from user 2 as noise; user 2 transmits at the interference-free rate, since receiver 2 is able to subtract the interference from user 1.
For the converse, we prove the following two sum-rate bounds separately:
(16) $R_1 + R_2 \le \max_{p(x_1)p(x_2)} \left[ I(X_1; Y_2) + I(X_2; Y_2|X_1) \right],$
(17) $R_1 + R_2 \le \max_{p(x_1)p(x_2)} \left[ I(X_1; Y_1) + I(X_2; Y_2|X_1) \right].$
For (16), the derivation follows the same steps as Costa and El Gamal's result [5]. For (17), we use techniques similar to those for establishing the sum capacity of the DMZIC with weak interference in Section III. First, notice that (13) implies
(18) $I(X_2; Y_1|X_1) \le I(X_2; Y_2|X_1)$
for any $(Y_1, Y_2)$ whose joint distribution with $(X_1, X_2)$ is of the form
(19) $p(x_1, x_2, y_1, y_2) = p(x_1)\, p(x_2)\, p(y_2|x_1, x_2)\, p(y_1|x_1, y_2).$
Then, bounding $n(R_1 + R_2)$ as in Section III, the steps of the derivation are justified by the independence between $X_1^n$ and $X_2^n$, the fact that conditioning reduces entropy, inequality (18), and the memoryless property of the channel together with (19). Finally, (16) and (17) together yield the sum capacity of Theorem 3.
We give the following example where the obtained sum capacity helps determine the capacity region of a DMIC.
Example 2
Consider the following deterministic channel:
where $\mathcal{X}_1$, $\mathcal{X}_2$, $\mathcal{Y}_1$, and $\mathcal{Y}_2$ denote the input and output alphabets. Notice that this channel does not satisfy the condition of the deterministic interference channel in [6]. Obviously, the Markov chain (13) holds. Moreover,
Therefore,
for all possible product input distributions on $\mathcal{X}_1 \times \mathcal{X}_2$. Therefore, this is a DMIC with mixed interference. Applying Theorem 3, we compute the sum capacity to be
Given that the two individual rate constraints are both trivially achievable, the above sum capacity establishes the capacity region of this DMIC.
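The Theorem 3 evaluation can be reproduced numerically for any small deterministic channel. The sketch below is not the channel of Example 2 (whose alphabets are fixed in the paper) but a hypothetical one: $Y_2 = 2X_1 + X_2$ reveals both binary inputs to receiver 2 (a strong link), while $Y_1 = X_1 \wedge X_2$ suffers interference from $X_2$; since $Y_2$ determines $X_2$, $Y_1$ is a function of $(X_1, Y_2)$ and the chain (13) holds:

```python
import numpy as np

def mi(px, pygx):
    """I(X; Y) in bits for input pmf px and channel matrix pygx[x, y]."""
    pxy = px[:, None] * pygx
    py = pxy.sum(axis=0)
    mask = pxy > 0
    return float((pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py[None, :])[mask])).sum())

# Hypothetical deterministic DMIC with mixed interference.
p_y1 = np.zeros((2, 2, 2))   # axes: x1, x2, y1
p_y2 = np.zeros((2, 2, 4))   # axes: x1, x2, y2
for x1 in range(2):
    for x2 in range(2):
        p_y1[x1, x2, x1 & x2] = 1.0
        p_y2[x1, x2, 2 * x1 + x2] = 1.0

# Grid search over product inputs of I(X2;Y2|X1) + min[I(X1;Y1), I(X1;Y2)].
best = 0.0
grid = np.linspace(0, 1, 101)
for s in grid:                 # s = p(X1 = 1)
    p1 = np.array([1 - s, s])
    for t in grid:             # t = p(X2 = 1)
        p2 = np.array([1 - t, t])
        i11 = mi(p1, np.einsum('b,abd->ad', p2, p_y1))   # I(X1; Y1)
        i12 = mi(p1, np.einsum('b,abd->ad', p2, p_y2))   # I(X1; Y2)
        i2c = sum(p1[x1] * mi(p2, p_y2[x1]) for x1 in range(2))  # I(X2; Y2 | X1)
        best = max(best, min(i11, i12) + i2c)
print(f"sum capacity via the Theorem 3 expression (grid search): {best:.4f} bits/use")
```

For this toy channel $I(X_1; Y_2)$ equals $H(X_1)$ (receiver 2 sees $X_1$ perfectly), so the minimum is always taken by the interfered receiver 1, mirroring the mixed-interference intuition.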
V Conclusion
In this paper, we derived the sum capacity of DMZICs with weak interference, where weak interference is defined using a Markov condition. Similar techniques were then applied to derive the sum capacity of the DMIC with mixed interference. Both results are analogous to the sum capacity results for the corresponding Gaussian channels, both in the capacity expressions and in the encoding schemes that achieve them.
The weak interference condition is defined using a Markov chain, as opposed to using a mutual information inequality. While the Markov chain definition appears somewhat more restrictive, it is not known whether the definition using the mutual information condition would lead to the same sum capacity result.
References
 [1] A. B. Carleial, "A case where interference does not reduce capacity," IEEE Trans. Inf. Theory, vol. 21, no. 5, pp. 569-570, Sep. 1975.
 [2] H. Sato, "On the capacity region of a discrete two-user channel for strong interference," IEEE Trans. Inf. Theory, vol. 24, no. 3, pp. 377-379, May 1978.
 [3] H. Sato, "The capacity of the Gaussian interference channel under strong interference," IEEE Trans. Inf. Theory, vol. 27, no. 6, pp. 786-788, Nov. 1981.
 [4] T. S. Han and K. Kobayashi, "A new achievable rate region for the interference channel," IEEE Trans. Inf. Theory, vol. 27, no. 1, pp. 49-60, Jan. 1981.
 [5] M. H. M. Costa and A. El Gamal, "The capacity region of the discrete memoryless interference channel with strong interference," IEEE Trans. Inf. Theory, vol. 33, no. 5, pp. 710-711, Sep. 1987.
 [6] A. El Gamal and M. H. M. Costa, "The capacity region of a class of deterministic interference channels," IEEE Trans. Inf. Theory, vol. 28, no. 2, pp. 343-346, Mar. 1982.
 [7] H. F. Chong and M. Motani, "The capacity region of a class of semideterministic interference channels," IEEE Trans. Inf. Theory, vol. 55, no. 2, pp. 598-603, Feb. 2009.
 [8] R. Benzel, "The capacity region of a class of discrete additive degraded interference channels," IEEE Trans. Inf. Theory, vol. 25, no. 2, pp. 228-231, Mar. 1979.
 [9] N. Liu and S. Ulukus, "The capacity region of a class of discrete degraded interference channels," IEEE Trans. Inf. Theory, vol. 54, no. 9, pp. 4372-4378, Sep. 2008.
 [10] J. Xu, H. Chen, and B. Chen, "New observation on interference channels under strong/very strong interference," in Proc. IEEE Global Communications Conference (Globecom 2010), Miami, FL, Dec. 2010.
 [11] I. Sason, "On achievable rate regions for the Gaussian interference channel," IEEE Trans. Inf. Theory, vol. 50, no. 6, pp. 1345-1356, Jun. 2004.
 [12] X. Shang, G. Kramer, and B. Chen, "A new outer bound and the noisy-interference sum-rate capacity for Gaussian interference channels," IEEE Trans. Inf. Theory, vol. 55, no. 2, pp. 689-699, Feb. 2009.
 [13] A. S. Motahari and A. K. Khandani, "Capacity bounds for the Gaussian interference channel," IEEE Trans. Inf. Theory, vol. 55, no. 2, pp. 620-643, Feb. 2009.
 [14] V. S. Annapureddy and V. V. Veeravalli, "Gaussian interference networks: sum capacity in the low-interference regime and new outer bounds on the capacity region," IEEE Trans. Inf. Theory, vol. 55, no. 7, pp. 3032-3050, Jul. 2009.
 [15] H. F. Chong, M. Motani, and H. K. Garg, "Capacity theorems for the "Z" channel," IEEE Trans. Inf. Theory, vol. 53, no. 4, pp. 1348-1365, Apr. 2007.
 [16] M. H. M. Costa, "On the Gaussian interference channel," IEEE Trans. Inf. Theory, vol. 31, no. 5, pp. 607-615, Sep. 1985.
 [17] H. S. Witsenhausen and A. D. Wyner, "A conditional entropy bound for a pair of discrete random variables," IEEE Trans. Inf. Theory, vol. 21, no. 5, pp. 493-501, Sep. 1975.