On the Capacity of Free-Space Optical Intensity Channels


Amos Lapidoth, ETH Zurich, Zurich, Switzerland. Email: lapidoth@isi.ee.ethz.ch
Stefan M. Moser, National Chiao Tung University (NCTU), Hsinchu, Taiwan. Email: stefan.moser@ieee.org
Michèle A. Wigger, ETH Zurich, Zurich, Switzerland. Email: wigger@isi.ee.ethz.ch

Abstract: New upper and lower bounds are presented on the capacity of the free-space optical intensity channel. This channel is characterized by inputs that are nonnegative (representing the transmitted optical intensity) and by outputs that are corrupted by additive white Gaussian noise (because in free space the disturbances arise from many independent sources). Due to battery and safety reasons the inputs are simultaneously constrained in both their average and peak power. For a fixed ratio of the average power to the peak power, the difference between the upper and the lower bounds tends to zero as the average power tends to infinity, and the ratio of the upper and lower bounds tends to one as the average power tends to zero. The case where only an average-power constraint is imposed on the input is treated separately. In this case, the difference of the upper and lower bounds tends to zero as the average power tends to infinity, and their ratio tends to a constant as the power tends to zero.

I. INTRODUCTION

We consider a channel model for short-range optical communication in free space, such as the communication between a remote control and a TV. We assume a channel model based on intensity modulation, where the signal is modulated onto the optical intensity of the emitted light. Thus, the channel input is proportional to the light intensity and is therefore nonnegative. We further assume that the receiver directly measures the incident optical intensity of the incoming signal, i.e., it produces an electrical current at the output which is proportional to the detected intensity. Since in ambient light conditions the received signal is disturbed by a large number of independent sources, we model the noise as Gaussian. Moreover, we assume that the line-of-sight component is dominant and ignore any effects due to multipath propagation like fading or inter-symbol interference.

Optical communication is restricted not only by battery power, but also, for safety reasons, by the maximum allowed peak power. We therefore assume two constraints simultaneously: an average-power constraint E and a maximum allowed peak power A. The situation where only a peak-power constraint is imposed corresponds to E = A. The case of only an average-power constraint is treated separately.

In this work we study the channel capacity of such an optical communication channel and present new upper and lower bounds. The maximum gap between upper and lower bound never exceeds 1 nat once the ratio of the average-power constraint to the peak-power constraint exceeds a small threshold, or when only the average power is constrained but not the peak power. Asymptotically, when the available average and peak power tend to infinity with their ratio held fixed, the upper and lower bounds coincide, i.e., their difference tends to zero. We also present the asymptotic behavior of the channel capacity in the limit when the power tends to zero.

The channel model has been studied before, e.g., in [1], and is described in detail in the following. The received signal is corrupted by additive noise due to strong ambient light that causes high-intensity shot noise in the electrical output signal.
In a first approximation this shot noise can be assumed to be independent of the signal itself, and since the noise is caused by many independent sources it is reasonable to model it as an independent and identically distributed (IID) Gaussian process. Also, without loss of generality, we assume the noise to be zero-mean, since the receiver can always subtract or add any constant signal. Hence, the channel output Y_k at time k, modeling a sample of the electrical output signal, is given by

  Y_k = x_k + Z_k,   (1)

where x_k ≥ 0 denotes the time-k channel input, which represents a sample of the electrical input current that is proportional to the optical intensity and is therefore nonnegative, and where the random process {Z_k}, modeling the additive noise, is given by

  {Z_k} IID ~ N(0, σ²).   (2)

It is important to note that, unlike the input x_k, the output Y_k may be negative, since the noise introduced at the receiver can be negative.
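To make the channel law concrete, here is a minimal simulation sketch in Python. The variable names and numerical values are ours, purely for illustration, and the input law is just one admissible example; it anticipates the peak- and average-power constraints introduced below.

```python
import numpy as np

# Minimal simulation of the intensity channel (1)-(2): nonnegative
# inputs, additive Gaussian noise.  A, E_avg, sigma are illustrative.
rng = np.random.default_rng(0)

A, E_avg, sigma = 2.0, 0.5, 1.0   # peak power, average power, noise std
n = 100_000

# One admissible input law: IID on {0, A} with P[X = A] = E_avg / A,
# which satisfies both the peak and the average power constraint.
x = A * (rng.random(n) < E_avg / A)
y = x + sigma * rng.standard_normal(n)   # Y_k = x_k + Z_k

print(x.min() >= 0.0, x.max() <= A)   # input is nonnegative and peak-limited
print(x.mean())                       # approximately E_avg
print((y < 0).mean())                 # a positive fraction: Y can be negative
```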

Since the optical intensity is proportional to the optical power, in such a system the instantaneous optical power is proportional to the electrical input current [2]. This is in contrast to radio communication, where usually the power is proportional to the square of the input current. Therefore, in addition to the implicit nonnegativity constraint on the input X_k, we assume constraints on both the peak and the average power, i.e.,

  Pr[X_k > A] = 0,   (4)
  E[X_k] ≤ E.   (5)

We shall denote the ratio between the average power and the peak power by α,

  α ≜ E/A,   (6)

where we assume 0 < α ≤ 1. Note that α = 1 corresponds to the case with only a peak-power constraint, while α near zero corresponds to a dominant average-power constraint and only a very weak peak-power constraint.

We denote the capacity of the described channel with peak-power constraint A and average-power constraint E by C(A, E). The capacity is given by [3]

  C(A, E) = sup I(X;Y),   (7)

where I(X;Y) stands for the mutual information between the channel input X and the channel output Y; where, conditional on the input x, the output Y is Gaussian with law N(x, σ²); and where the supremum is over all laws on X satisfying Pr[X ∈ [0, A]] = 1 and E[X] ≤ E. In the case of only an average-power constraint, the capacity is denoted by C(E). It is given as in (7), except that the supremum is taken over all laws on X satisfying X ≥ 0 and E[X] ≤ E.

The derivation of the upper bounds is based on a technique introduced in [4]. There, a dual expression of mutual information is used to show that for any channel law W(·|·) and an arbitrary distribution R(·) over the channel output alphabet, the channel capacity is upper-bounded by

  C ≤ E_{Q*}[ D( W(·|X) ‖ R(·) ) ].   (8)

Here, D(·‖·) stands for the relative entropy [5, Ch. 2], and Q* denotes the capacity-achieving input distribution. For more details about this technique and for a proof of (8), see [4, Sec. V] and [6]. The challenge of using (8) lies in a clever choice of the arbitrary law R(·) that will lead to a good upper bound. Moreover, note that the bound (8) still contains an expectation over the unknown capacity-achieving input distribution Q*. To handle this expectation we will need to resort to some further bounding, e.g., Jensen's inequality [5, Ch. 2.6].

The derivation of the firm lower bounds relies on the entropy power inequality [5, Th. 17.7.3]. The asymptotic results at high power follow directly by evaluating the firm upper and lower bounds. For the low-power regime we introduce an additional lower bound which does not rely on the entropy power inequality. This lower bound is obtained by choosing a binary input distribution, a choice inspired by [7], and by then evaluating the corresponding mutual information. For the cases involving a peak-power constraint we further resort to the results in [8] on the asymptotic expression of mutual information for weak signals.

The results of this paper are partially based on the results in [9] and [6]. The remainder is structured as follows: in the subsequent section we state our results, and in Section III we give a brief outline of some of the derivations.
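As a small illustration of how (8) is used (a sketch under our own choice of R, not one of the output laws used for the actual bounds of Section II): for a peak-only constraint, (8) implies C ≤ max_{0 ≤ x ≤ A} D( W(·|x) ‖ R ), and choosing R Gaussian makes the relative entropy available in closed form.

```python
import numpy as np

# Sketch of the duality bound (8) with a Gaussian output law R
# (our own illustrative choice).  For a peak-only constraint,
# C <= max over x in [0, A] of D( N(x, sigma^2) || R ).

def kl_gauss(m0, v0, m1, v1):
    """D( N(m0, v0) || N(m1, v1) ) in nats."""
    return 0.5 * (np.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)

A, sigma = 4.0, 1.0
s2 = A**2 / 4 + sigma**2                  # R = N(A/2, A^2/4 + sigma^2)
xs = np.linspace(0.0, A, 1001)
dual_ub = kl_gauss(xs, sigma**2, A / 2, s2).max()   # attained at x = 0, A

var_ub = 0.5 * np.log(1 + A**2 / (4 * sigma**2))
print(dual_ub, var_ub)   # the two agree: this Gaussian R recovers the
                         # variance-type upper bound that reappears in Sec. II
```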
II. RESULTS

We start with an auxiliary lemma which is based on the symmetry of the channel law and the concavity of channel capacity in the input distribution.

Lemma 1: Consider a peak-power constraint A and an average-power constraint E such that α = E/A ≥ 1/2. Then the optimal input distribution in (7) has an average power equal to half the peak power,

  E_{Q*}[X] = A/2,   (9)

irrespective of α. I.e., the average-power constraint is inactive for all α ∈ [1/2, 1], and in particular

  C(A, αA) = C(A, A/2),   1/2 ≤ α ≤ 1.

(It was shown in [1] that the optimal input distribution in (7) is unique.)

We are now ready to state our results. We distinguish between three different cases: in the first two cases we impose on the input both an average- and a peak-power constraint, where in the first case the average-to-peak power ratio α is restricted to lie in (0, 1/2), and in the second case it is restricted to lie in [1/2, 1]. Note that by Lemma 1, α ∈ [1/2, 1] represents the situation with an inactive average-power constraint. In the third case we impose on the input an average-power constraint only. In all three cases we present firm upper and lower bounds on the channel capacity. The difference of the upper and lower bounds tends to zero when the available average and peak power tend to infinity with their ratio held constant at α. Thus we can derive the asymptotic capacity at high power exactly. We also present the asymptotics of capacity at low power: for the cases involving a peak-power constraint we are able to state them exactly, while for the case of only an average-power constraint we give the asymptotics up to a constant factor.

A. Bounds on Channel Capacity with both an Average- and a Peak-Power Constraint (0 < α < 1/2)

Theorem 2: If 0 < α < 1/2, then C(A, αA) is lower-bounded by

  C(A, αA) ≥ (1/2) log( 1 + (A²/(2πeσ²)) e^{2αµ*} ((1 − e^{−µ*})/µ*)² ),   (10)

and upper-bounded by each of two bounds: the explicit bound

  C(A, αA) ≤ (1/2) log( 1 + α(1−α) A²/σ² ),   (11)

and a second, duality-based bound (12), obtained by evaluating (8) for an output density with Gaussian tails and a truncated-exponential body (described in Section III); the resulting expression involves the Q-function and the free parameters µ and δ. Here Q(·) denotes the Q-function, defined by

  Q(ξ) ≜ ∫_ξ^∞ (1/√(2π)) e^{−t²/2} dt,   ξ ∈ R;   (14)

µ > 0 and δ > 0 are free parameters; and µ* is the unique solution of

  α = 1/µ* − e^{−µ*}/(1 − e^{−µ*}).   (15)

Note that µ* is well-defined, as the function µ ↦ 1/µ − e^{−µ}/(1 − e^{−µ}) is strictly monotonically decreasing on (0, ∞) and tends to 1/2 for µ ↓ 0 and to 0 for µ → ∞.
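Numerically, µ* is easy to obtain, since the right-hand side of (15) is strictly decreasing in µ*. The following sketch (our own code; the parameter values are illustrative) solves (15) by bisection and evaluates the bounds (10) and (11):

```python
import numpy as np

# Solve (15) for mu* by bisection and evaluate the lower bound (10)
# and the first upper bound (11) of Theorem 2.  alpha, sigma and the
# values of A below are illustrative choices of ours.

def mu_star(alpha, lo=1e-9, hi=1e3, iters=200):
    """Unique root of alpha = 1/mu - exp(-mu) / (1 - exp(-mu))."""
    f = lambda mu: 1.0 / mu - np.exp(-mu) / (1.0 - np.exp(-mu)) - alpha
    for _ in range(iters):            # f is strictly decreasing in mu
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
    return 0.5 * (lo + hi)

alpha, sigma = 0.2, 1.0
mu = mu_star(alpha)

for A in [1.0, 10.0, 100.0]:
    lb = 0.5 * np.log(1 + A**2 / (2 * np.pi * np.e * sigma**2)
                      * np.exp(2 * alpha * mu)
                      * ((1 - np.exp(-mu)) / mu) ** 2)            # (10)
    ub = 0.5 * np.log(1 + alpha * (1 - alpha) * A**2 / sigma**2)  # (11)
    print(A, lb, ub)   # lb <= ub, and the gap stays bounded as A grows
```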

A suboptimal but useful choice for the free parameters in the upper bound (12) is

  δ = σ log(1 + A/σ),   (16)

together with a choice (17) of µ given in terms of the solution µ* of (15). For this choice, and for two representative values of α < 1/2, the bounds of Theorem 2 are depicted in Figures 1 and 2.

Fig. 1. Bounds of Theorem 2 (in nats per channel use, versus A/σ) for a fixed average-to-peak power ratio α < 1/2; the free parameters are chosen as suggested in (16) and (17).

Fig. 2. As Figure 1, for a second value of the average-to-peak power ratio α < 1/2.

Theorem 3: If α lies in (0, 1/2), then

  lim_{A/σ → ∞} { C(A, αA) − log(A/σ) } = αµ* + log( (1 − e^{−µ*})/µ* ) − (1/2) log(2πe)   (18)

and

  lim_{A/σ ↓ 0} C(A, αA) / (A²/σ²) = α(1−α)/2.   (19)

B. Bounds on Channel Capacity with a Strong Peak-Power and an Inactive Average-Power Constraint (1/2 ≤ α ≤ 1)

By Lemma 1 we have that for 1/2 ≤ α ≤ 1 the average-power constraint is inactive and C(A, αA) = C(A, A/2). Thus we can obtain the results in this section by simply deriving bounds for the case α = 1/2.

Theorem 4: If α ∈ [1/2, 1], then C(A, αA) is lower-bounded by

  C(A, αA) ≥ (1/2) log( 1 + A²/(2πeσ²) ),   (20)

and is upper-bounded by each of two bounds: the explicit bound

  C(A, αA) ≤ (1/2) log( 1 + A²/(4σ²) ),   (21)

and a second, duality-based bound (22), obtained by evaluating (8) for an output density with Gaussian tails and a uniform body (described in Section III); it involves the Q-function and a free parameter δ > 0. We suboptimally choose

  δ = σ log(1 + A/σ).   (23)

For this choice the bounds of Theorem 4 are depicted in Figure 3.

Theorem 5: If α lies in [1/2, 1], then

  lim_{A/σ → ∞} { C(A, αA) − log(A/σ) } = −(1/2) log(2πe)   (24)

and

  lim_{A/σ ↓ 0} C(A, αA) / (A²/σ²) = 1/8.   (25)

Note that (24) and (25) exhibit the well-known asymptotic behavior of the capacity of a Gaussian channel under a peak-power constraint only [3].
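The low-power limits (19) and (25) can be checked numerically with the binary input that inspires them. In the following sketch (our code; the integration grid and values are illustrative), I(X;Y) is computed as h(Y) − h(Z), with h(Y) obtained by numerical integration:

```python
import numpy as np

# Check (19): with X in {0, A} and P[X = A] = alpha, the ratio
# I(X;Y) / (A^2 / sigma^2) should approach alpha*(1 - alpha)/2
# as A -> 0.  The values below are illustrative.

def phi(y, sigma):
    return np.exp(-y**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

def mi_binary(A, alpha, sigma=1.0):
    y = np.linspace(-12 * sigma, A + 12 * sigma, 200_001)
    p = (1 - alpha) * phi(y, sigma) + alpha * phi(y - A, sigma)
    f = p * np.log(p + 1e-300)                           # integrand of -h(Y)
    h_y = -np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(y))   # trapezoid rule
    h_z = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    return h_y - h_z

alpha, sigma = 0.2, 1.0
for A in [1.0, 0.3, 0.1]:
    print(A, mi_binary(A, alpha, sigma) / (A**2 / sigma**2))
    # ratios approach alpha*(1-alpha)/2 = 0.08; alpha = 1/2 gives 1/8 as in (25)
```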

Fig. 3. Bounds on the capacity of the free-space optical intensity channel with average- and peak-power constraints for α ∈ [1/2, 1], according to Theorem 4; this includes the case of only a peak-power constraint (α = 1). The free parameter is chosen as suggested in (23).

Fig. 4. Bounds on the capacity of the free-space optical intensity channel with only an average-power constraint, according to Theorem 6: upper bound (28) using (30), upper bound (27) using (29), and lower bound (26).

C. Bounds on Channel Capacity with an Average-Power Constraint Only

Finally, we consider the case with an average-power constraint only.

Theorem 6: In the absence of a peak-power constraint, the channel capacity C(E) is lower-bounded by

  C(E) ≥ (1/2) log( 1 + e E²/(2πσ²) ),   (26)

and is upper-bounded by each of two duality-based bounds, (27) and (28), obtained by evaluating (8) for an output density with a Gaussian part and an exponential tail (described in Section III). Both involve the Q-function and the free parameters β > 0 and δ, and each holds only under an additional condition on the free parameters and the power. Suboptimal but useful closed-form choices (29) and (30) of the free parameters are used for bounds (27) and (28), respectively. For these choices, the bounds of Theorem 6 are depicted in Figure 4.

Theorem 7: In the case of only an average-power constraint,

  lim_{E/σ → ∞} { C(E) − log(E/σ) } = (1/2) log( e/(2π) ),

and at low power we obtain asymptotic upper and lower bounds of the same form whose ratio tends to a constant. Note that the asymptotic upper and lower bounds at low SNR thus do not coincide, in the sense that their ratio tends to a constant instead of to one. However, they exhibit similar behavior.
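As a quick numerical check, the lower bound (26) already matches the high-power behavior asserted by Theorem 7 (a sketch with illustrative values of our own):

```python
import numpy as np

# Evaluate the average-power-only lower bound (26) and compare it with
# the high-power asymptote log(E/sigma) + (1/2) log(e / (2*pi)) of
# Theorem 7.  The values of E are illustrative.

sigma = 1.0
for E in [1.0, 10.0, 100.0, 1000.0]:
    lb = 0.5 * np.log(1 + np.e * E**2 / (2 * np.pi * sigma**2))  # (26)
    asym = np.log(E / sigma) + 0.5 * np.log(np.e / (2 * np.pi))
    print(E, lb, lb - asym)   # the difference tends to zero
```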

III. DERIVATION

In the following we outline the derivations of the firm lower and upper bounds given in the previous section.

One easily finds a lower bound on capacity by dropping the maximization and choosing an arbitrary input distribution Q to compute the mutual information between input and output. However, in order to get a tight bound, this choice of Q should yield a mutual information that is reasonably close to capacity. Such a choice is difficult to find and might make the evaluation of I(X;Y) intractable. The reason for this is that even for relatively simple distributions Q, the corresponding distribution of the channel output Y may be difficult to compute, let alone h(Y). We circumvent these problems by using the entropy power inequality [5, Th. 17.7.3] to lower-bound h(Y) by an expression that depends only on h(X). I.e., we transfer the problem of computing or bounding h(Y) to the input side of the channel, where it is much easier to choose an appropriate distribution that leads to a tight lower bound on channel capacity:

  C(A, E) = sup_Q I(X;Y)   (36)
    ≥ I(X;Y) for a specific Q   (37)
    = h(Y) − h(Y|X) for a specific Q   (38)
    = h(X + Z) − h(Z) for a specific Q   (39)
    ≥ (1/2) log( e^{2h(X)} + e^{2h(Z)} ) − h(Z) for a specific Q   (40)
    = (1/2) log( 1 + e^{2h(X)}/(2πeσ²) ) for a specific Q,   (41)

where the inequality in (40) follows from the entropy power inequality. To make this lower bound as tight as possible, we choose a distribution Q that maximizes differential entropy under the given constraints [5, Ch. 12].

The derivation of the upper bounds in Section II is based on the duality approach (8). Hence, we need to specify a distribution R(·) and evaluate the relative entropy in (8). We have chosen output distributions R(·) with the following densities. For (11) we choose the Gaussian density of mean αA and variance α(1−α)A² + σ². For (12) we choose a density with a truncated-exponential body proportional to e^{−µy/A} for −δ ≤ y ≤ A + δ and Gaussian tails proportional to e^{−(y+δ)²/(2σ²)} for y < −δ and to e^{−(y−A−δ)²/(2σ²)} for y > A + δ, where δ > 0 and µ > 0 are free parameters. For (21) we choose the Gaussian density of mean A/2 and variance A²/4 + σ². For (22) we choose a density with a uniform body on [−δ, A + δ] and the same Gaussian tails, where δ > 0 is a free parameter. For (27) and (28) we choose a density with a Gaussian part for negative arguments and an exponential tail proportional to e^{−y/β} for nonnegative arguments, where δ and β > 0 are free parameters. In the derivation of (28) we then restrict δ to be nonnegative, while the derivation of (27) requires a different restriction on the parameters.

ACKNOWLEDGMENTS

The authors would like to thank Ding-Jie Lin for fruitful discussions. The work of S. M. Moser was supported in part by the ETH under a TH research grant.

REFERENCES

[1] T. H. Chan, S. Hranilovic, and F. R. Kschischang, "Capacity-achieving probability measure for conditionally Gaussian channels with bounded inputs," IEEE Trans. Inf. Theory, vol. 51, no. 6, pp. 2073-2088, Jun. 2005.
[2] J. M. Kahn and J. R. Barry, "Wireless infrared communications," Proc. IEEE, vol. 85, no. 2, pp. 265-298, Feb. 1997.
[3] C. E. Shannon, "A mathematical theory of communication," Bell System Techn. J., vol. 27, pp. 379-423 and 623-656, Jul. and Oct. 1948.
[4] A. Lapidoth and S. M. Moser, "Capacity bounds via duality with applications to multiple-antenna systems on flat fading channels," IEEE Trans. Inf. Theory, vol. 49, no. 10, pp. 2426-2467, Oct. 2003.
[5] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. John Wiley & Sons, 2006.
[6] S. M. Moser, "Duality-based bounds on channel capacity," Ph.D. dissertation, Swiss Federal Institute of Technology (ETH), Zurich, Oct. 2004, Diss. ETH No. 15769. [Online]. Available: http://moser.cm.nctu.edu.tw/
[7] J. G. Smith, "The information capacity of amplitude- and variance-constrained scalar Gaussian channels," Inform. Contr., vol. 18, pp. 203-219, 1971.
[8] V. V. Prelov and E. C. van der Meulen, "An asymptotic expression for the information and capacity of a multidimensional channel with weak input signals," IEEE Trans. Inf. Theory, vol. 39, no. 5, pp. 1728-1735, Sep. 1993.
[9] M. A. Wigger, "Bounds on the capacity of free-space optical intensity channels," Master's thesis, Signal and Inf. Proc. Lab., ETH Zurich, Switzerland, supervised by Prof. Dr. Amos Lapidoth.