Gaussian Channel with Noisy Feedback and Peak Energy Constraint
Abstract
Optimal coding over the additive white Gaussian noise channel under the peak energy constraint is studied when there is noisy feedback over an orthogonal additive white Gaussian noise channel. As shown by Pinsker, under the peak energy constraint, the best error exponent for communicating an $M$-ary message, $M \ge 3$, with noise-free feedback is strictly larger than the one without feedback. This paper extends Pinsker's result and shows that if the noise power $\sigma^2$ in the feedback link is sufficiently small, the best error exponent for communicating an $M$-ary message can be strictly larger than the one without feedback. The proof involves two feedback coding schemes. One is motivated by a two-stage noisy feedback coding scheme of Burnashev and Yamamoto for binary symmetric channels, while the other is a linear noisy feedback coding scheme that extends Pinsker's noise-free feedback coding scheme. When the feedback noise power $\sigma^2$ is sufficiently small, the linear coding scheme outperforms the two-stage (nonlinear) coding scheme, and is asymptotically optimal as $\sigma^2$ tends to zero. By contrast, when $\sigma^2$ is relatively large, the two-stage coding scheme performs better.
I. Introduction and Main Results
We consider a communication problem for an additive white Gaussian noise (AWGN) forward channel with feedback over an orthogonal additive white Gaussian noise backward channel as depicted in Fig. 1.
Suppose that the sender wishes to communicate a message $m \in \{1, \ldots, M\}$ over the (forward) additive white Gaussian noise channel
$$Y_i = X_i + Z_i,$$
where $X_i$, $Y_i$, and $Z_i$ respectively denote the channel input, channel output, and additive Gaussian noise. The sender has causal access to a noisy version of $Y_i$ over the feedback (backward) additive white Gaussian noise channel
$$\tilde Y_i = Y_i + \tilde Z_i,$$
where $\tilde Z_i$ is the Gaussian noise in the backward link. We assume that the forward noise process $\{Z_i\}$ and the backward noise process $\{\tilde Z_i\}$ are independent of each other, and respectively white Gaussian $N(0, 1)$ and $N(0, \sigma^2)$.
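As a concrete sketch of this channel model, the following Python snippet simulates one use of the forward and backward links. The function names, and the unit forward noise variance, are our illustrative assumptions; `feedback_std` plays the role of the backward noise power.

```python
import random

def awgn(x, noise_std, rng):
    # One use of an AWGN channel: output = input + white Gaussian noise.
    return x + rng.gauss(0.0, noise_std)

def channel_round(x, feedback_std, rng):
    # Forward link: Y = X + Z (unit-variance forward noise, an assumption here).
    y = awgn(x, 1.0, rng)
    # Orthogonal backward link: the sender causally observes Y + Z~,
    # where Z~ carries the feedback noise power.
    y_feedback = awgn(y, feedback_std, rng)
    return y, y_feedback

rng = random.Random(0)
y, y_fb = channel_round(1.0, 0.1, rng)
```

Setting `feedback_std = 0` recovers the noise-free feedback special case, in which the sender observes the channel output exactly.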
We define an $(M, n)$ code with the encoding functions $x_i(m, \tilde y^{i-1})$, $i = 1, \ldots, n$, and the decoding function $\hat m(y^n)$. We assume a peak energy constraint
(1) $\sum_{i=1}^{n} x_i^2(m, \tilde Y^{i-1}) \le A$ for every $m \in \{1, \ldots, M\}$.
The probability of error of the code is defined as
where the message $M$ is distributed uniformly over $\{1, \ldots, M\}$ and is independent of the noise processes $\{Z_i\}$ and $\{\tilde Z_i\}$.
As is well known, the capacity of the channel (the supremum of rates such that there exists a sequence of codes with vanishing probability of error) stays the same with or without feedback. Hence, our main focus is the reliability of communication, which is captured by the error exponent of the given code. The error exponent is sensitive to the presence of noise in the feedback link. Schalkwijk and Kailath showed in their celebrated work [1] that noise-free feedback can improve the error exponent dramatically under the expected energy constraint
(2) $\mathsf{E}\Bigl[\sum_{i=1}^{n} X_i^2(m, \tilde Y^{i-1})\Bigr] \le A$ for every $m$.
(in fact, the probability of error decays much faster than exponentially in the block length). Kim, Lapidoth, and Weissman [2] studied the optimal error exponent under the expected energy constraint and noisy feedback, and showed that the error exponent is inversely proportional to the feedback noise power $\sigma^2$ for small $\sigma^2$.
Another important factor that affects the error exponent is the energy constraint on the channel inputs—the peak energy constraint in (1) vs. the expected energy constraint in (2). Wyner [3] showed that the error probability of the Schalkwijk–Kailath coding scheme [1] degrades to an exponential form under the peak energy constraint. In fact, Shepp, Wolf, Wyner, and Ziv [4] showed that for the binary-message case ($M = 2$), the best error exponent under the peak energy constraint is achieved by simple nonfeedback antipodal signaling, regardless of the presence of feedback. This negative result might lead to an impression that under the peak energy constraint, even noise-free feedback does not improve the reliability of communication. Pinsker [5] proved the contrary by showing that the best error exponent for sending an $M$-ary message does not depend on $M$ and, hence, can be strictly larger than the best error exponent without feedback for $M \ge 3$.
In this paper, we show that noisy feedback can improve the reliability of communication under the peak energy constraint, provided that the feedback noise power $\sigma^2$ is sufficiently small. Let $E_M(A, \sigma^2)$ denote the best error exponent over all codes for the AWGN channel with the noisy feedback, where $A$ is the peak energy and $\sigma^2$ is the feedback noise power. Thus, $E_M(A, \infty)$ denotes the best error exponent for communicating an $M$-ary message over the AWGN channel without feedback. Shannon [6] showed that
(3) $E_M(A, \infty) = \dfrac{M}{4(M-1)}\,A$.
This follows by first upper bounding the error exponent with the sphere packing bound and then achieving this upper bound by using a regular simplex code on the sphere of radius $\sqrt{A}$, that is, each codeword satisfies $\|x^n(m)\|^2 = A$ and is at the same Euclidean distance from every other codeword. In particular, for $M = 2$,
$$E_2(A, \infty) = \frac{A}{2}$$
and
$$E_3(A, \infty) = \frac{3A}{8}.$$
At the other extreme, $E_M(A, 0)$ denotes the best error exponent for communicating an $M$-ary message over the AWGN channel with noise-free feedback. Pinsker [5] showed that
$$E_M(A, 0) = \frac{A}{2}$$
for all $M$. In particular,
$$E_3(A, 0) = \frac{A}{2} > \frac{3A}{8} = E_3(A, \infty).$$
Clearly, $E_M(A, \sigma^2)$ is decreasing in $\sigma^2$ and
$$E_M(A, 0) \ge E_M(A, \sigma^2) \ge E_M(A, \infty)$$
for every $A$ and $\sigma^2$.
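Shannon's achievability argument rests on the regular simplex code. A small sketch that constructs such a code and checks the equal-energy, equal-distance properties (the embedding of the $M$ codewords in $\mathbb{R}^M$, rather than $\mathbb{R}^{M-1}$, is a convenience of ours):

```python
import math

def simplex_code(M, A):
    # M codewords of energy A, mutually equidistant: scale the centered
    # standard basis e_i - (1/M, ..., 1/M) onto the sphere of radius sqrt(A).
    scale = math.sqrt(A * M / (M - 1))
    return [[scale * ((1.0 if j == i else 0.0) - 1.0 / M) for j in range(M)]
            for i in range(M)]

def dist2(u, v):
    # Squared Euclidean distance.
    return sum((a - b) ** 2 for a, b in zip(u, v))
```

Each codeword then has squared norm exactly $A$, and every pair of codewords is at squared distance $2AM/(M-1)$, which shrinks toward $2A$ as $M$ grows.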
Is $E_M(A, \sigma^2)$ strictly larger than $E_M(A, \infty)$ (i.e., is noisy feedback better than no feedback)? Does $E_M(A, \sigma^2)$ tend to $E_M(A, 0)$ as $\sigma^2 \to 0$ (i.e., does the performance degrade gracefully with small noise in the feedback link)? What is the optimal feedback coding scheme that achieves $E_M(A, \sigma^2)$? To answer these questions, we establish the following results.
Theorem 1
For ,
where
By comparing the lower bound with (3) and identifying the critical point of the feedback noise power, we obtain the following.
Corollary 1
Thus, if the noise power in the feedback link is sufficiently small, then noisy feedback improves the reliability of communication even under the peak energy constraint. The proof of Theorem 1 is motivated by recent results of Burnashev and Yamamoto in a series of papers [7], [8], where they considered a communication model with a forward binary symmetric channel and a backward binary symmetric channel, and showed that when the crossover probability of the backward channel is sufficiently small, the best error exponent is strictly larger than the one without feedback.
The lower bound in Theorem 1, however, does not approach the noise-free feedback exponent as $\sigma^2 \to 0$; its limit is strictly less than $E_M(A, 0)$. To obtain a better asymptotic behavior for small $\sigma^2$, we establish the following.
Theorem 2
This theorem leads to the following.
Corollary 2
Thus, the lower bound in Theorem 2 is tight as $\sigma^2 \to 0$. The proof of Theorem 2 extends Pinsker's linear noise-free feedback coding scheme [5] to the noisy case.
Fig. 2 compares the two bounds for the case $M = 3$. The linear noisy feedback coding scheme performs better when $\sigma^2$ is sufficiently small, while the two-stage noisy feedback coding scheme performs better when $\sigma^2$ is relatively large.
The rest of the paper is organized as follows. In Section II, we study a two-stage noisy feedback coding scheme motivated by recent results of Burnashev and Yamamoto and establish Theorem 1. In Section III, we extend Pinsker's noise-free linear feedback coding scheme to the noisy feedback case and establish Theorem 2. Section IV concludes the paper.
II. Two-Stage Noisy Feedback Coding Scheme
II-A. Background
It is instructive to first consider a two-stage noise-free feedback coding scheme for $M = 3$. This two-stage scheme has been studied by Schalkwijk and Barron [9] and Yamamoto and Itoh [10] for a general $M$.
Encoding. Fix some $\gamma \in (0, 1)$. For simplicity of notation, assume throughout that $\gamma n$ is an integer. To send message $m$, during the transmission time interval $[1 : \gamma n]$ (namely, stage 1), the encoder uses the simplex signaling:
(4) 
Based on the feedback $y^{\gamma n}$, the encoder then chooses the two most probable message estimates $\hat m_1$ and $\hat m_2$, where
(5) $\hat m_1 = \arg\max_{m} p(m \mid y^{\gamma n})$ and $\hat m_2 = \arg\max_{m \ne \hat m_1} p(m \mid y^{\gamma n})$,
and in case of a tie the one with the smaller index is chosen. Since the channel is Gaussian and the message is uniform, (5) can be written as
$$\hat m_1 = \arg\min_{m} d(x^{\gamma n}(m), y^{\gamma n}), \qquad \hat m_2 = \arg\min_{m \ne \hat m_1} d(x^{\gamma n}(m), y^{\gamma n}),$$
where $d(\cdot, \cdot)$ denotes the Euclidean distance. During the transmission time interval $[\gamma n + 1 : n]$ (stage 2), the encoder uses antipodal signaling for $\{\hat m_1, \hat m_2\}$ if $m \in \{\hat m_1, \hat m_2\}$ and transmits the all-zero sequence otherwise:
Decoding. At the end of stage 1, the decoder chooses the two most probable message estimates $\hat m_1$ and $\hat m_2$ based on $y^{\gamma n}$, exactly as the encoder does. At the end of stage 2, the decoder declares the more likely of $\hat m_1$ and $\hat m_2$ given the stage-2 observation.
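The candidate-selection step shared by the encoder (via the feedback) and the decoder—pick the two codewords nearest to the received signal, breaking ties toward the smaller index—can be sketched as follows (the helper name is ours):

```python
def two_nearest(y, code):
    # Indices of the two most probable messages under Gaussian noise and a
    # uniform prior: the two codewords closest to y in Euclidean distance,
    # with ties broken toward the smaller index.
    d2 = [sum((a - b) ** 2 for a, b in zip(y, c)) for c in code]
    order = sorted(range(len(code)), key=lambda i: (d2[i], i))
    return order[0], order[1]
```

With noise-free feedback, running this rule on the same observation at both ends guarantees that encoder and decoder agree on the candidate pair.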
Analysis of the probability of error. Let $\hat m_1$ and $\hat m_2$ denote the two most probable message estimates at the end of stage 1. The decoder makes an error if and only if one of the following events occurs:
Thus, the probability of error is
By symmetry, we assume without loss of generality that message 1 is sent. For brevity, we do not explicitly condition on this event in probability expressions in the following, whenever it is clear from the context. Referring to Fig. 3, let
we have
where follows since (see [11, Problem ]).
On the other hand, the remaining error probability is determined by the distance between the simplex signals in stage 1 and the distance between the antipodal signals in stage 2 (see Fig. 4). In particular,
Thus,
Therefore, the error exponent of the two-stage feedback coding scheme is lower bounded as
Now choose $\gamma$ to balance the two exponents. Then it can be readily verified that both terms in the minimum are the same and we have
Remark 1
Since the resulting exponent is strictly smaller than the best noise-free feedback exponent, this two-stage noise-free feedback coding scheme is strictly suboptimal.
Remark 2
We need only three transmissions: two for stage 1 and one for stage 2. Thus $\gamma$ actually divides only the total energy $A$, not the block length $n$.
II-B. Two-Stage Noisy Feedback Coding Scheme
Based on the two-stage noise-free feedback coding scheme in the previous subsection and a new idea of signal protection introduced by Burnashev and Yamamoto [7], [8], we present a two-stage noisy feedback coding scheme for $M = 3$. The coding scheme for an arbitrary $M$ is given in Appendix A.
In the two-stage noise-free feedback coding scheme, the encoder and decoder agree on the same set of message estimates $\{\hat m_1, \hat m_2\}$ at the end of stage 1. When there is noise in the feedback link, however, this coordination is not always possible. To solve this problem, we assign a signal protection region $S_j$, $j = 1, 2, 3$, to each signal as depicted in Fig. 5. Let $x^{\gamma n}$ and $y^{\gamma n}$ denote the transmitted and received signals, respectively, and let $\tilde y^{\gamma n}$ denote the feedback sequence at the encoder. The signal protection region $S_j$ for message $j$, $j = 1, 2, 3$, is defined as
(6) 
which means that message $j$ is the most probable and the other two messages are of approximately equal posterior probabilities. Here the margin of the region is a fixed parameter which will be optimized later in the analysis.
Encoding. In stage 1, the encoder uses the same simplex signaling as in the noise-free feedback case (see (4)). Then, based on the noisy feedback $\tilde y^{\gamma n}$, the encoder chooses the two most probable message estimates $\hat m_1$ and $\hat m_2$.
In stage 2, the encoder uses antipodal signaling for $\{\hat m_1, \hat m_2\}$ if $m \in \{\hat m_1, \hat m_2\}$ and transmits the all-zero sequence otherwise.
Decoding. The decoder makes a decision immediately at the end of stage 1 if the received signal lies in one of the signal protection regions, i.e., $y^{\gamma n} \in S_j$ for some $j$, in which case it declares message $j$. Otherwise, it chooses the two most probable message estimates $\hat m_1'$ and $\hat m_2'$ and waits for the transmission in stage 2. At the end of stage 2, the decoder declares the more likely of $\hat m_1'$ and $\hat m_2'$.
Remark 3
The signal protection region corresponds to the case in which the two least probable messages are of approximately equal posterior probabilities.
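One plausible reading of the protection-region test can be sketched as follows; the exact form of (6) involves a margin parameter that is optimized in the analysis, so the membership rule below is our illustrative approximation for $M = 3$, not the paper's definition.

```python
def in_protection_region(y, code, j, margin):
    # Hypothetical membership test: codeword j is strictly the closest to y,
    # and the remaining two codewords are roughly equally far from y
    # (their squared distances differ by at most `margin`).
    d2 = [sum((a - b) ** 2 for a, b in zip(y, c)) for c in code]
    others = [k for k in range(len(code)) if k != j]
    j_closest = all(d2[j] < d2[k] for k in others)
    balanced = abs(d2[others[0]] - d2[others[1]]) <= margin
    return j_closest and balanced
```

Under this reading, the decoder commits early only when the received signal unambiguously favors one message, which is what makes encoder–decoder miscoordination through the noisy feedback link a low-probability event.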
Analysis of the probability of error. Let $(\hat m_1, \hat m_2)$ and $(\hat m_1', \hat m_2')$ denote the pairs of the two most probable message estimates at the encoder and the decoder, respectively. As before, we assume that message 1 is sent. Referring to Fig. 5, let
where .
The decoder makes an error only if one or more of the following events occur:
$\mathcal{E}_1$: decoding error at the end of stage 1,
$\mathcal{E}_2$: miscoordination due to the feedback noise,
$\mathcal{E}_3$: decoding error at the end of stage 2.
Thus, the probability of error is upper bounded as
To simplify the analysis, we introduce a new parameter such that . It can be easily checked that corresponds to and that this constraint guarantees that (see Fig. 6(a)). Hence, for the first term
(7)  
The second term can be upper bounded (see Fig. 6(b)) as
(8)  
Finally, the third term can be upper bounded in exactly the same manner as in the noise-free feedback case:
Therefore, the error exponent of the two-stage noisy feedback coding scheme is lower bounded as
Now let
and
Then it can be readily verified that all three terms in the minimum are the same and we have
(9) 
Note that if ,
and is monotonically increasing over . Thus
This completes the proof of Theorem 1 for the case $M = 3$.
Remark 4
It can be easily checked that the lower bound in (9) is tight and characterizes the exact error exponent of the two-stage noisy feedback coding scheme.
III. Linear Noisy Feedback Coding Scheme
III-A. Background
It is instructive to revisit (a slightly simplified version of) the linear noise-free feedback coding scheme by Pinsker [5], which shows that $E_M(A, 0) \ge A/2$ for all $M$. This lower bound is tight since $E_2(A, 0) = A/2$ [4] and $E_M(A, 0)$ is nonincreasing in $M$.
Encoding. To send message , the encoder transmits
(10) 
Because of the feedback , the encoder can learn the noise . Subsequently it transmits
and afterwards, where will be optimized later and the random time is the largest such that
Decoding. Upon receiving , the decoder estimates by
and declares that is sent if
Remark 5
It can be easily checked that at each time, the encoder transmits the error in the decoder’s current estimate of the initial transmission (up to scaling). Thus, Pinsker’s coding scheme is another instance of the iterative refinement used in the Schalkwijk–Kailath coding scheme [1] for the Gaussian channel and the Horstein coding scheme [12] for the binary symmetric channel.
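The iterative-refinement idea can be sketched for the noise-free feedback case: each round, the encoder (which knows the decoder's current estimate exactly) sends the scaled estimation error, and the decoder's error variance shrinks geometrically. The power allocation and MMSE-style update below are our illustrative assumptions in the Schalkwijk–Kailath spirit, not Pinsker's exact scheme.

```python
import math
import random

def refine(theta, rounds, snr, rng):
    # Noise-free-feedback iterative refinement: the encoder transmits the
    # decoder's current estimation error, scaled to power `snr` relative
    # to the unit-variance forward noise.
    est, var = 0.0, 1.0            # decoder estimate and its error variance
    for _ in range(rounds):
        x = (theta - est) * math.sqrt(snr / var)   # scaled estimation error
        y = x + rng.gauss(0.0, 1.0)                # forward AWGN, unit noise
        est += math.sqrt(var * snr) / (1.0 + snr) * y   # MMSE-style update
        var /= 1.0 + snr           # error variance shrinks by 1/(1 + snr)
    return est
```

After $n$ rounds the error variance is $(1 + \mathrm{snr})^{-n}$, i.e., the error decays doubly exponentially in $n$ under the expected energy constraint—exactly the behavior that, by Wyner's result [3], the peak energy constraint destroys.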
Analysis of the probability of error. For simplicity of notation, we ignore integer rounding issues throughout. We use $\epsilon_n$ to denote a generic sequence of nonnegative numbers that tends to zero as $n \to \infty$. When there are multiple such functions, we denote them all by $\epsilon_n$ with the understanding that each tends to zero as $n \to \infty$. It is easy to see that a decoding error occurs only if the decoder’s final estimate is closer to a wrong codeword than to the transmitted one. The probability of error is thus upper bounded as
The key idea in the analysis is to introduce a “virtual” transmission
(11) 
Let
(12) 
and define the estimate of as
(13) 
Then, it can be easily shown that
Thus we have
Now we upper bound the two terms. For the first term, we have
For the second term, note that for all if and only if , and thus that only if . Therefore,
where the inequality follows from the definition of the random time, and where $\chi^2_k$ denotes a chi-square random variable with $k$ degrees of freedom. By upper bounding the tail probability of the chi-square random variable [13] as
(14) $\mathsf{P}\{\chi^2_k \ge k x\} \le (x e^{1-x})^{k/2}$ for $x > 1$,
we have
where $\epsilon_n$ tends to zero as $n \to \infty$. Therefore, the error exponent of the linear feedback coding scheme is lower bounded as
for any . Now let
which tends to zero as $n \to \infty$. Then the limits of both terms in the minimum are the same. Therefore,
which completes the proof of achievability.
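The chi-square tail bound invoked above can be sanity-checked numerically. One common Chernoff-type form (which may differ in constants from the exact inequality cited from [13]) is $\mathsf{P}\{\chi^2_n \ge nx\} \le (x e^{1-x})^{n/2}$ for $x > 1$; the sketch below compares it against a Monte Carlo estimate.

```python
import math
import random

def chi2_tail_mc(n, x, trials, rng):
    # Monte Carlo estimate of P{chi^2_n >= n*x}: a chi-square random
    # variable with n degrees of freedom is a sum of n squared N(0, 1)'s.
    hits = 0
    for _ in range(trials):
        s = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(n))
        hits += s >= n * x
    return hits / trials

def chi2_tail_bound(n, x):
    # Chernoff-type upper bound (x * e^(1 - x))^(n / 2), valid for x > 1.
    return (x * math.exp(1.0 - x)) ** (n / 2.0)
```

For example, with `n = 10` and `x = 2.0`, the bound evaluates to about 0.22 while the true tail probability is roughly 0.03, so the bound is loose but exponentially decaying in `n`, which is all the achievability argument needs.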
IiiB Linear Noisy Feedback Coding Scheme
Now we formally describe and analyze a linear noisy feedback coding scheme based on Pinsker’s noise-free feedback coding scheme.
Encoding. Fix some . To send message , the encoder transmits
(15) 
Because of the noisy feedback , the encoder can learn . Subsequently it transmits
where will be optimized later and the random time