Chapter 4: Continuous channel and its capacity
- Madison Wade
1 Reference: Elements of Information Theory by Cover and Thomas
2 Outline: Continuous random variables; Gaussian multivariate random variables; AWGN; band-limited channels; parallel channels; flat fading channels; Shannon capacity of fading channels; CSI known at TX.
3 Continuous random variable: In the case where X is a continuous RV, how is the entropy defined? For a discrete RV we used the probability mass function; here it is replaced by the probability density function (PDF). Definition: The random variable X is said to be continuous if its cumulative distribution function F(x) = Pr(X ≤ x) is continuous.
4 Continuous random variable: Definition: The differential entropy h(X) of a continuous random variable X with PDF P_X(x) is defined as h(X) = ∫_S P_X(x) log(1/P_X(x)) dx = E[log(1/P_X(x))] (1), where S is the support set of the random variable.
5 Example: Uniform distribution: Show that for X ~ U(0, a) the differential entropy is log a. Note: Unlike discrete entropy, the differential entropy can be negative. However, 2^h(X) = 2^(log a) = a is the volume of the support set, which is always non-negative. Note: A horizontal shift does not change the entropy.
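As a quick numerical illustration (not part of the slides; the function name below is our own), the closed form h(X) = log₂ a makes both notes concrete: the entropy is negative whenever a < 1, yet 2^h(X) recovers the (non-negative) support volume a.

```python
import math

def h_uniform(a):
    """Differential entropy in bits of X ~ U(0, a): h(X) = log2(a)."""
    return math.log2(a)

print(h_uniform(0.5))        # -1.0: negative differential entropy, since a < 1
print(2 ** h_uniform(0.5))   # 0.5: 2^h(X) = a, the volume of the support set
```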
6 Example: Normal and exponential distributions: Show that for X ~ N(0, σ²) the differential entropy is h(X) = (1/2) log(2πeσ²) bits. Show that for P_X(x) = λe^(−λx), x ≥ 0, the differential entropy is h(X) = log(e/λ) bits. What is the entropy if P_X(x) = (λ/2) e^(−λ|x|)?
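A small Monte Carlo check of the Gaussian formula (our own sketch, standard library only): sampling X ~ N(0, σ²) and averaging −log₂ f(X) should reproduce (1/2) log₂(2πeσ²); the exponential closed form is included for comparison.

```python
import math, random

def h_gauss_closed(sigma2):
    # h(X) = (1/2) log2(2*pi*e*sigma^2) bits for X ~ N(0, sigma^2)
    return 0.5 * math.log2(2 * math.pi * math.e * sigma2)

def h_gauss_mc(sigma2, n=200_000, seed=0):
    # Monte Carlo estimate of h(X) = E[-log2 f(X)]
    rng = random.Random(seed)
    s = math.sqrt(sigma2)
    norm = math.sqrt(2 * math.pi * sigma2)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, s)
        fx = math.exp(-x * x / (2 * sigma2)) / norm
        total += -math.log2(fx)
    return total / n

def h_exponential(lam):
    # h(X) = log2(e / lambda) bits for f(x) = lambda * exp(-lambda x), x >= 0
    return math.log2(math.e / lam)
```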
7 Exercise: Suppose an additive Gaussian channel defined by Y = X + N with X ~ N(0, P_X) and N ~ N(0, P_N). Because of the independence of X and N, Y ~ N(0, P_X + P_N). Defining I(X; Y) = h(Y) − h(Y|X), show that I(X; Y) = (1/2) log(1 + P_X/P_N). Hint: You can use the fact that h(Y|X) = h(N) (why?). This is in fact the capacity of a noisy continuous channel.
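The identity in this exercise can be checked numerically from the two differential entropies (a sketch; the helper names are ours):

```python
import math

def h_gauss(var):
    # differential entropy (bits) of a zero-mean Gaussian with variance var
    return 0.5 * math.log2(2 * math.pi * math.e * var)

def awgn_mutual_info(p_x, p_n):
    # I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(N), since Y = X + N and h(Y|X) = h(N)
    return h_gauss(p_x + p_n) - h_gauss(p_n)

# matches the closed form (1/2) * log2(1 + P_X / P_N):
print(awgn_mutual_info(3.0, 1.0))  # ~1.0 bit, since (1/2) log2(1 + 3) = 1
```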
8 Gaussian random vector: Suppose that the vector X is defined as X = [X₁, X₂]ᵀ where X₁ and X₂ are i.i.d. N(0, 1). What is the entropy of X? h(X) = h(X₁, X₂) = h(X₁) + h(X₂|X₁) = h(X₁) + h(X₂). Therefore h(X) = (1/2) log(2πe)². And for a vector of dimension n: h(X) = (1/2) log(2πe)ⁿ.
9 Some properties: 1. Chain rule: h(X, Y) = h(X|Y) + h(Y). 2. h(X + c) = h(X) for any constant c. 3. h(cX) = h(X) + log|c| (note that in the discrete case, H(cX) = H(X)). 4. Let X be a random vector and Y = AX where A is a square non-singular matrix. Then h(Y) = h(X) + log|det A|. 5. Suppose X is a random vector with E(X) = 0 and E(XXᵀ) = K; then h(X) ≤ (1/2) log((2πe)ⁿ|K|). The equality is achieved only if X is Gaussian N(0, K).
10 Shannon capacity: In the early 1940s it was thought to be impossible to send information at a positive rate with negligible probability of error. Shannon showed (1948) that for every channel there exists a maximum information transmission rate below which the error probability can be made nearly zero. If the entropy of the source is less than the channel capacity, asymptotically error-free communication can be achieved. To obtain error-free communication, a coding scheme must be used; Shannon did not show the optimal coding. Today, the capacity predicted by Shannon can be approached to within a few tenths of a dB.
11 Additive white Gaussian noise channel: As we have seen before, with an additive Gaussian noise channel the mutual input-output information can be calculated as I(X; Y) = h(Y) − h(Y|X) = h(Y) − h(Z) = h(Y) − (1/2) log(2πeN). To maximize the mutual information, one should maximize h(Y) under the power constraint P_Y = P + N. The distribution maximizing the entropy of a continuous random variable under a power constraint is Gaussian; this is obtained when X is Gaussian. C = max_{p(x): E[X²] ≤ P} I(X; Y) = (1/2) log(1 + P/N).
12 Band-limited channels: Suppose we have a continuous channel with bandwidth B where the power spectral density of the noise is N₀/2, so the analog noise power is N₀B. Suppose the channel is used over the time interval [0, T]; the power of the analog signal times T gives the total energy of the signal in this period. By the Shannon sampling theorem, there are 2B samples per second, so the power of the discrete signal per sample is PT/(2BT) = P/(2B). The same argument applies to the noise, so the power per noise sample is (N₀/2)(2B)T/(2BT) = N₀/2. So the capacity of the Gaussian channel per sample is: C = (1/2) log(1 + P/(N₀B)) bits per sample.
13 Band-limited channel capacity: Since there are at most 2B independent samples per second, the capacity can be written as: C = B log(1 + P/(N₀B)) bits per second. Sometimes this equation is divided by B to obtain: C/B = log(1 + P/(N₀B)) bits per second per Hz. This is the maximum achievable spectral efficiency over the AWGN channel.
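The two formulas are easy to evaluate directly; the sketch below (with our own helper names) also illustrates that as B grows with P and N₀ fixed, the capacity saturates at the wideband limit (P/N₀) log₂ e.

```python
import math

def awgn_capacity_bps(P, N0, B):
    # C = B * log2(1 + P / (N0 * B)) bits per second
    return B * math.log2(1.0 + P / (N0 * B))

def spectral_efficiency(P, N0, B):
    # C / B = log2(1 + P / (N0 * B)) bits per second per Hz
    return awgn_capacity_bps(P, N0, B) / B

# increasing B raises C, but never past the wideband limit (P / N0) * log2(e)
```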
14 Parallel independent Gaussian channels: Here we consider k independent Gaussian channels in parallel with a common power constraint. The objective is to maximize the capacity by optimally distributing the power among the channels: C = max_{p(x₁,...,x_k): Σᵢ E[Xᵢ²] ≤ P} I(X₁,..., X_k; Y₁,..., Y_k).
15 Parallel independent Gaussian channels: Using the independence of Z₁,..., Z_k: C = I(X₁,..., X_k; Y₁,..., Y_k) = h(Y₁,..., Y_k) − h(Y₁,..., Y_k|X₁,..., X_k) = h(Y₁,..., Y_k) − h(Z₁,..., Z_k) ≤ Σᵢ [h(Yᵢ) − h(Zᵢ)] ≤ Σᵢ (1/2) log(1 + Pᵢ/Nᵢ). If there is no common power constraint, the total capacity is clearly the sum of the capacities of the individual channels.
16 Common power constraint: The question is: how should the power be distributed among the transmitters to maximize the capacity? The capacity of the equivalent channel is: C = max_{P₁+P₂ ≤ P_x} [B₁ log(1 + P₁h₁²/(N₀B₁)) + B₂ log(1 + P₂h₂²/(N₀B₂))].
17 Common power constraint: So we should maximize C subject to P₁ + P₂ ≤ P_x. One can define the Lagrangian: L(P₁, P₂, λ) = B₁ log(1 + P₁h₁²/(N₀B₁)) + B₂ log(1 + P₂h₂²/(N₀B₂)) − λ(P₁ + P₂ − P_x). Setting dL/dP₁ = 0 and dL/dP₂ = 0 and using ln instead of log₂: (h₁²/N₀)/(1 + P₁h₁²/(N₀B₁)) = λ ⇒ P₁/(B₁N₀) = 1/(λN₀) − 1/h₁².
18 With the same operations we obtain: P₁/(B₁N₀) = Cst − 1/h₁² and P₂/(B₂N₀) = Cst − 1/h₂², where the constant can be found by setting P₁ + P₂ = P_x. Once the two powers are found, the capacity of the channel is easily calculated. The only constraint to be considered is that P₁ and P₂ cannot be negative: if one of them comes out negative, the corresponding power is set to zero and all the power is assigned to the other channel. This principle is called water-filling.
19 Exercise: Use the same principle (water-filling) and give the power allocation for a channel with three frequency bands defined as follows: h₁ = 1/2, h₂ = 1/3 and h₃ = 1; B₁ = B, B₂ = 2B and B₃ = B; N₀B = 1; P_x = P₁ + P₂ + P₃ = 10. Solution: P₁ = 3.5, P₂ = 0 and P₃ = 6.5.
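A small solver for this kind of water-filling problem (our own sketch; the function and argument names are ours) reproduces the allocation above by dropping any band whose power comes out negative and re-solving:

```python
def water_fill(h, bn0, p_total):
    """Water-filling over parallel bands.

    h: amplitude gains h_i; bn0: the products B_i * N0; p_total: total power.
    Solves P_i = (B_i N0) * (Cst - 1/h_i^2) with P_i >= 0 and sum P_i = p_total.
    """
    active = list(range(len(h)))
    while True:
        # from sum_i (B_i N0)(Cst - 1/h_i^2) = p_total, solve for the constant
        cst = (p_total + sum(bn0[i] / h[i] ** 2 for i in active)) / \
              sum(bn0[i] for i in active)
        if all(cst >= 1.0 / h[i] ** 2 for i in active):
            powers = [0.0] * len(h)
            for i in active:
                powers[i] = bn0[i] * (cst - 1.0 / h[i] ** 2)
            return powers
        # drop the bands that would receive negative power, then re-solve
        active = [i for i in active if cst >= 1.0 / h[i] ** 2]
```

For the exercise above (h = [1/2, 1/3, 1], Bᵢ N₀ = [1, 2, 1], P_x = 10) this returns [3.5, 0.0, 6.5], matching the stated solution.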
20 Flat fading channel (frequency non-selective): A non-LOS urban transmission generally results in many multipaths: the received signal is the sum of many replicas of the transmitted signal. Using the I and Q components of the received signal: r(t) = cos(2πf_c t) Σᵢ₌₀^I aᵢ cos(φᵢ) − sin(2πf_c t) Σᵢ₌₀^I aᵢ sin(φᵢ) + n(t). By the central limit theorem, A = Σᵢ₌₀^I aᵢ cos(φᵢ) and B = Σᵢ₌₀^I aᵢ sin(φᵢ) are i.i.d. Gaussian random variables.
21 Flat fading channel: The envelope of the received signal h = √(A² + B²) is a Rayleigh random variable with: f_h(h) = (h/σ²) exp(−h²/(2σ²)), h ≥ 0, with σ² the variance of A and B. The received power is an exponential RV with the pdf: f(p) = (1/(2σ²)) exp(−p/(2σ²)), p ≥ 0. Therefore, the received signal can be modeled as: Y = hX + N.
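The Rayleigh envelope statistics can be sanity-checked by simulation (our own sketch, standard library only): with A, B ~ N(0, σ²) i.i.d., the envelope mean should be σ√(π/2) and the mean received power E[h²] = 2σ².

```python
import math, random

def envelope_stats(sigma=1.0, n=200_000, seed=2):
    # simulate h = sqrt(A^2 + B^2) with A, B i.i.d. N(0, sigma^2)
    rng = random.Random(seed)
    sum_h, sum_p = 0.0, 0.0
    for _ in range(n):
        a = rng.gauss(0.0, sigma)
        b = rng.gauss(0.0, sigma)
        p = a * a + b * b          # received power: exponential with mean 2*sigma^2
        sum_h += math.sqrt(p)
        sum_p += p
    return sum_h / n, sum_p / n

mean_h, mean_p = envelope_stats()
# mean_h ~ sigma * sqrt(pi/2) ~ 1.25, mean_p ~ 2 * sigma^2 = 2
```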
22 Shannon (ergodic) capacity when the RX knows the CSI: The channel coefficient h is an i.i.d. (over time) random variable, independent of the signal and the noise. We assume that the receiver knows the channel coefficient but the transmitter does not. The capacity is: C = max_{p_X: E[X²] ≤ P} I(X; Y, h). Using the chain rule: I(X; Y, h) = I(X; h) + I(X; Y|h) = I(X; Y|h), since X is independent of h.
23 Conditioned on the fading coefficient h, the channel becomes a simple AWGN channel with equivalent power h²P_X. So we can write: I(X; Y | h = h) = (1/2) log(1 + h²P_X/P_N). The ergodic capacity of the flat fading channel is therefore: C = E_h[(1/2) log(1 + P_X h²/P_N)]. Note: Normally all the signals are complex, being the baseband equivalents of real signals. In that case, the capacity is multiplied by two, since the real and imaginary parts of the signals are decorrelated.
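A Monte Carlo sketch of this expectation for Rayleigh fading (our own function names; |h|² is taken exponential with unit mean so that the average SNR equals P_X/P_N):

```python
import math, random

def ergodic_capacity(snr_avg, n=200_000, seed=3):
    # C = E_h[(1/2) log2(1 + |h|^2 * snr_avg)], with |h|^2 ~ Exp(1), E[|h|^2] = 1
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        g = rng.expovariate(1.0)
        total += 0.5 * math.log2(1.0 + g * snr_avg)
    return total / n

c_fading = ergodic_capacity(10.0)
c_awgn = 0.5 * math.log2(1.0 + 10.0)
# by Jensen's inequality, the fading capacity lies below the AWGN capacity
# at the same average SNR
```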
24 Example (Wireless Communications by Andrea Goldsmith): Consider a wireless channel where the power falloff with distance follows the formula P_r(d) = P_t(d₀/d)³ for d₀ = 10 m. Assume a channel bandwidth of B = 30 kHz and AWGN with noise PSD N₀/2, where N₀ = 10⁻⁹ W/Hz. For a transmit power of 1 W, find the capacity of the channel for a distance of 100 m and of 1 km. Solution: The received signal-to-noise ratio is γ = P_r(d)/P_N = P_t(d₀/d)³/(N₀B). That is γ ≈ 15 dB for d = 100 m, and γ ≈ −15 dB for d = 1 km. The capacity of the complex transmission is C = B log(1 + SNR), which is ≈ 153 kbps for d = 100 m and ≈ 1.4 kbps for d = 1000 m.
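The numbers in this example can be reproduced directly (a sketch; the variable names are ours):

```python
import math

B, N0, PT, D0 = 30e3, 1e-9, 1.0, 10.0   # bandwidth, noise PSD, TX power, ref. distance

def snr(d):
    # gamma = P_t * (d0 / d)^3 / (N0 * B)
    return PT * (D0 / d) ** 3 / (N0 * B)

def capacity_bps(g):
    # complex (bandpass) transmission: C = B * log2(1 + gamma)
    return B * math.log2(1.0 + g)

g100, g1000 = snr(100.0), snr(1000.0)
# 10*log10(g100) ~ +15 dB and 10*log10(g1000) ~ -15 dB;
# capacity_bps(g100) ~ 153 kbps, capacity_bps(g1000) ~ 1.4 kbps
```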
25 Example (Wireless Communications by Andrea Goldsmith): Consider a flat fading channel with i.i.d. channel gain h, which can take on three possible values: 0.05 with probability 0.1, 0.5 with probability 0.5, and 1 with probability 0.4. The transmit power is 10 mW, N₀ = 10⁻⁹ W/Hz, and the channel bandwidth is 30 kHz. Assume the receiver knows the instantaneous value of h but the transmitter does not. Find the Shannon capacity of this channel. Solution: The channel has three possible received SNRs: γ₁ = P_t h₁²/(N₀B) = 0.83, γ₂ = P_t h₂²/(N₀B) = 83.33, and γ₃ = P_t h₃²/(N₀B) = 333.33. So the Shannon capacity is given by: C = Σᵢ B log₂(1 + γᵢ) p(γᵢ) ≈ 199.2 kbps. Note: The average SNR is 175, and the corresponding AWGN capacity would be ≈ 223.8 kbps.
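The same computation as a short script (our own sketch), including the Jensen comparison against the AWGN capacity at the average SNR:

```python
import math

B, N0, PT = 30e3, 1e-9, 10e-3
states = [(0.05, 0.1), (0.5, 0.5), (1.0, 0.4)]      # (gain h, probability)

# received SNR per state: gamma_i = P_t * h_i^2 / (N0 * B)
snrs = [(PT * h * h / (N0 * B), p) for h, p in states]

# ergodic capacity with CSI at RX only: C = sum_i p_i * B * log2(1 + gamma_i)
c_ergodic = sum(p * B * math.log2(1.0 + g) for g, p in snrs)

# for comparison: AWGN capacity at the average SNR (an upper bound, by Jensen)
snr_avg = sum(p * g for g, p in snrs)
c_awgn = B * math.log2(1.0 + snr_avg)
```

This gives γ ≈ (0.83, 83.3, 333.3), C ≈ 199.2 kbps, an average SNR of ≈ 175 and an AWGN capacity of ≈ 223.8 kbps.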
26 Shannon capacity defines the maximum data rate that can be sent over the channel with asymptotically small error probability. Since the TX does not know the channel, the transmitted rate is constant. When the channel is in a deep fade, the BER is not zero, because the TX cannot adapt its rate to the CSI. So the capacity with outage is defined: it is the maximum rate that can be achieved with some outage probability (the probability of deep fading). By accepting some losses during deep fades, a higher data rate can be achieved.
27 Fixing the required rate C, a corresponding minimum SNR can be calculated (assuming complex transmission): C = log₂(1 + γ_min). If the TX sends data at this rate, an outage (non-zero BER) occurs when γ < γ_min. Therefore the probability of outage is p_out = p(γ < γ_min). The average rate of data correctly received at the RX is C_O = (1 − p_out) B log₂(1 + γ_min). The value of γ_min is a design parameter based on the acceptable outage probability. Normally one plots the normalized capacity C/B = log₂(1 + γ_min) as a function of p_out = p(γ < γ_min).
28 Example (Wireless Communications by Andrea Goldsmith): Consider the same channel as in the last example with BW = 30 kHz and p(γ = 0.83) = 0.1, p(γ = 83.33) = 0.5, and p(γ = 333.33) = 0.4. Find the capacity versus outage and the average rate correctly received for outage probabilities p_out < 0.1, p_out = 0.1 and p_out = 0.6. Solution: For p_out < 0.1, we must decode in all the channel states; therefore the rate must be less than that of the worst case: γ_min = γ₁ = 0.83, with corresponding capacity ≈ 26.23 kbps. For 0.1 ≤ p_out < 0.6, we may decode incorrectly only when the channel is in the weakest state (γ = γ₁ = 0.83), so γ_min = γ₂ = 83.33, with corresponding capacity ≈ 191.94 kbps. For 0.6 ≤ p_out < 1, we may decode incorrectly when the received γ is γ₁ or γ₂; thus γ_min = γ₃ = 333.33, with corresponding capacity ≈ 251.55 kbps.
29 Example (cont.): For p_out < 0.1, data rates close to 26.23 kbps are always correctly received. For p_out = 0.1 we transmit at 191.94 kbps but decode correctly only when γ = γ₂ or γ₃, so the rate correctly received is (1 − 0.1) × 191.94 ≈ 172.75 kbps. For p_out = 0.6 the rate correctly received is (1 − 0.6) × 251.55 ≈ 100.62 kbps.
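The outage trade-off of this example can be tabulated with a short script (our own sketch; the SNR values are those of the example):

```python
import math

B = 30e3
gammas = [0.8333, 83.333, 333.333]   # the three channel SNR states
probs = [0.1, 0.5, 0.4]

def outage_tradeoff(gamma_min):
    # transmit at the fixed rate B*log2(1+gamma_min); outage when gamma < gamma_min
    p_out = sum(p for g, p in zip(gammas, probs) if g < gamma_min)
    rate = B * math.log2(1.0 + gamma_min)
    return p_out, (1.0 - p_out) * rate   # (outage prob., avg. correctly received rate)

results = [outage_tradeoff(g) for g in gammas]
# gamma_min=0.83:  p_out=0,   ~26.2 kbps always received
# gamma_min=83.3:  p_out=0.1, ~172.7 kbps correctly received
# gamma_min=333.3: p_out=0.6, ~100.6 kbps correctly received
```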
30 Since the channel is known at the TX, outage can be avoided: the TX can adapt its power to prevent it. The capacity (the same Shannon capacity as before) is: C = ∫₀^∞ B log₂(1 + γ) p(γ) dγ. Now we also add power adaptation with a power constraint: ∫₀^∞ P(γ) p(γ) dγ ≤ P̄. So the problem is how to distribute the available power as a function of the SNR to maximize the rate while the average power does not exceed a predefined value.
31 Water-filling: The capacity is then C = max_{P(γ): ∫P(γ)p(γ)dγ = P̄} ∫₀^∞ B log₂(1 + P(γ)γ/P̄) p(γ) dγ. Note that γ = P̄h²/(N₀B). This means that for each channel realization, a code is employed to adjust the rate. To find the optimal power allocation P(γ), we form the Lagrangian: J(P(γ)) = ∫₀^∞ B log₂(1 + P(γ)γ/P̄) p(γ) dγ − λ ∫₀^∞ P(γ) p(γ) dγ.
32 Water-filling: Setting the derivative with respect to P(γ) equal to zero and solving for P(γ) with the constraint P(γ) ≥ 0: P(γ)/P̄ = 1/γ₀ − 1/γ for γ ≥ γ₀, and P(γ) = 0 for γ < γ₀. This means that if γ is under the threshold γ₀, the channel is not used. The capacity formula is then: C = ∫_{γ₀}^∞ B log₂(γ/γ₀) p(γ) dγ.
33 Water-filling: Therefore, the capacity can be achieved by adapting the rate as a function of the SNR. Another strategy would be to fix the rate and adapt only the power. Note that γ₀ must be found numerically: substituting the optimal power allocation into the power constraint, we obtain the following condition that determines γ₀: ∫_{γ₀}^∞ (1/γ₀ − 1/γ) p(γ) dγ = 1.
34 Water-filling: [Figure: the allocated power P(γ)/P̄ = 1/γ₀ − 1/γ plotted against 1/γ, filling the region below the water level 1/γ₀.] The figure shows why this principle is called water-filling.
35 Example: With the same example as before: p(γ₁ = 0.83) = 0.1, p(γ₂ = 83.33) = 0.5, and p(γ₃ = 333.33) = 0.4. Find the ergodic capacity of the channel with CSI at both TX and RX. Solution: Since water-filling will be used, we must first calculate γ₀ satisfying: Σ_{γᵢ ≥ γ₀} (1/γ₀ − 1/γᵢ) p(γᵢ) = 1.
36 Example (cont.): First we assume that all channel states will be used. In the above equation everything is then known except γ₀, which is calculated to be ≈ 0.887. Since this value exceeds γ₁ = 0.83, the first channel state should not be used. At the next iteration, the equation is solved over only the second and third channel states, giving γ₀ ≈ 0.894. This value is acceptable because the weakest remaining channel is better than this minimum threshold. Using these values, the channel capacity can be calculated and is ≈ 200.7 kbps.
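The iteration described above can be scripted (our own sketch; the numbers are those of the example):

```python
import math

B = 30e3
gammas = [0.8333, 83.333, 333.333]
probs = [0.1, 0.5, 0.4]

def gamma0_for(active):
    # solve sum_i (1/g0 - 1/gamma_i) p_i = 1 over the active states for g0
    psum = sum(probs[i] for i in active)
    inv = sum(probs[i] / gammas[i] for i in active)
    return psum / (1.0 + inv)

# iterate: drop any state whose SNR falls below the current threshold
active = list(range(len(gammas)))
while True:
    g0 = gamma0_for(active)
    keep = [i for i in active if gammas[i] >= g0]
    if keep == active:
        break
    active = keep

capacity = sum(B * math.log2(gammas[i] / g0) * probs[i] for i in active)
# g0 ~ 0.894 after dropping the weakest state; capacity ~ 200.7 kbps
```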
More informationRevision of Lecture 4
Revision of Lecture 4 We have completed studying digital sources from information theory viewpoint We have learnt all fundamental principles for source coding, provided by information theory Practical
More informationINVERSE EIGENVALUE STATISTICS FOR RAYLEIGH AND RICIAN MIMO CHANNELS
INVERSE EIGENVALUE STATISTICS FOR RAYLEIGH AND RICIAN MIMO CHANNELS E. Jorswieck, G. Wunder, V. Jungnickel, T. Haustein Abstract Recently reclaimed importance of the empirical distribution function of
More informationLecture 17: Differential Entropy
Lecture 17: Differential Entropy Differential entropy AEP for differential entropy Quantization Maximum differential entropy Estimation counterpart of Fano s inequality Dr. Yao Xie, ECE587, Information
More informationECE 564/645 - Digital Communications, Spring 2018 Midterm Exam #1 March 22nd, 7:00-9:00pm Marston 220
ECE 564/645 - Digital Communications, Spring 08 Midterm Exam # March nd, 7:00-9:00pm Marston 0 Overview The exam consists of four problems for 0 points (ECE 564) or 5 points (ECE 645). The points for each
More information3F1: Signals and Systems INFORMATION THEORY Examples Paper Solutions
Engineering Tripos Part IIA THIRD YEAR 3F: Signals and Systems INFORMATION THEORY Examples Paper Solutions. Let the joint probability mass function of two binary random variables X and Y be given in the
More informationGaussian channel. Information theory 2013, lecture 6. Jens Sjölund. 8 May Jens Sjölund (IMT, LiU) Gaussian channel 1 / 26
Gaussian channel Information theory 2013, lecture 6 Jens Sjölund 8 May 2013 Jens Sjölund (IMT, LiU) Gaussian channel 1 / 26 Outline 1 Definitions 2 The coding theorem for Gaussian channel 3 Bandlimited
More informationMulti-User Gain Maximum Eigenmode Beamforming, and IDMA. Peng Wang and Li Ping City University of Hong Kong
Multi-User Gain Maximum Eigenmode Beamforming, and IDMA Peng Wang and Li Ping City University of Hong Kong 1 Contents Introduction Multi-user gain (MUG) Maximum eigenmode beamforming (MEB) MEB performance
More informationSummary: SER formulation. Binary antipodal constellation. Generic binary constellation. Constellation gain. 2D constellations
TUTORIAL ON DIGITAL MODULATIONS Part 8a: Error probability A [2011-01-07] 07] Roberto Garello, Politecnico di Torino Free download (for personal use only) at: www.tlc.polito.it/garello 1 Part 8a: Error
More informationWeibull-Gamma composite distribution: An alternative multipath/shadowing fading model
Weibull-Gamma composite distribution: An alternative multipath/shadowing fading model Petros S. Bithas Institute for Space Applications and Remote Sensing, National Observatory of Athens, Metaxa & Vas.
More informationDirty Paper Coding vs. TDMA for MIMO Broadcast Channels
TO APPEAR IEEE INTERNATIONAL CONFERENCE ON COUNICATIONS, JUNE 004 1 Dirty Paper Coding vs. TDA for IO Broadcast Channels Nihar Jindal & Andrea Goldsmith Dept. of Electrical Engineering, Stanford University
More informationChapter 7: Channel coding:convolutional codes
Chapter 7: : Convolutional codes University of Limoges meghdadi@ensil.unilim.fr Reference : Digital communications by John Proakis; Wireless communication by Andreas Goldsmith Encoder representation Communication
More informationECE 564/645 - Digital Communications, Spring 2018 Homework #2 Due: March 19 (In Lecture)
ECE 564/645 - Digital Communications, Spring 018 Homework # Due: March 19 (In Lecture) 1. Consider a binary communication system over a 1-dimensional vector channel where message m 1 is sent by signaling
More informationLecture 5 Channel Coding over Continuous Channels
Lecture 5 Channel Coding over Continuous Channels I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw November 14, 2014 1 / 34 I-Hsiang Wang NIT Lecture 5 From
More informationOptimal Power Allocation for Cognitive Radio under Primary User s Outage Loss Constraint
This full text paper was peer reviewed at the direction of IEEE Communications Society subject matter experts for publication in the IEEE ICC 29 proceedings Optimal Power Allocation for Cognitive Radio
More informationEstimation of Performance Loss Due to Delay in Channel Feedback in MIMO Systems
MITSUBISHI ELECTRIC RESEARCH LABORATORIES http://www.merl.com Estimation of Performance Loss Due to Delay in Channel Feedback in MIMO Systems Jianxuan Du Ye Li Daqing Gu Andreas F. Molisch Jinyun Zhang
More informationELEC546 MIMO Channel Capacity
ELEC546 MIMO Channel Capacity Vincent Lau Simplified Version.0 //2004 MIMO System Model Transmitter with t antennas & receiver with r antennas. X Transmitted Symbol, received symbol Channel Matrix (Flat
More informationSpatial and Temporal Power Allocation for MISO Systems with Delayed Feedback
Spatial and Temporal ower Allocation for MISO Systems with Delayed Feedback Venkata Sreekanta Annapureddy and Srikrishna Bhashyam Department of Electrical Engineering Indian Institute of Technology Madras
More information12.4 Known Channel (Water-Filling Solution)
ECEn 665: Antennas and Propagation for Wireless Communications 54 2.4 Known Channel (Water-Filling Solution) The channel scenarios we have looed at above represent special cases for which the capacity
More informationLecture 22: Final Review
Lecture 22: Final Review Nuts and bolts Fundamental questions and limits Tools Practical algorithms Future topics Dr Yao Xie, ECE587, Information Theory, Duke University Basics Dr Yao Xie, ECE587, Information
More informationThe Effect of Memory Order on the Capacity of Finite-State Markov and Flat-Fading Channels
The Effect of Memory Order on the Capacity of Finite-State Markov and Flat-Fading Channels Parastoo Sadeghi National ICT Australia (NICTA) Sydney NSW 252 Australia Email: parastoo@student.unsw.edu.au Predrag
More informationOptimum Transmission Scheme for a MISO Wireless System with Partial Channel Knowledge and Infinite K factor
Optimum Transmission Scheme for a MISO Wireless System with Partial Channel Knowledge and Infinite K factor Mai Vu, Arogyaswami Paulraj Information Systems Laboratory, Department of Electrical Engineering
More informationThe Effect upon Channel Capacity in Wireless Communications of Perfect and Imperfect Knowledge of the Channel
The Effect upon Channel Capacity in Wireless Communications of Perfect and Imperfect Knowledge of the Channel Muriel Medard, Trans. on IT 000 Reading Group Discussion March 0, 008 Intro./ Overview Time
More informationELEC E7210: Communication Theory. Lecture 10: MIMO systems
ELEC E7210: Communication Theory Lecture 10: MIMO systems Matrix Definitions, Operations, and Properties (1) NxM matrix a rectangular array of elements a A. an 11 1....... a a 1M. NM B D C E ermitian transpose
More informationLecture 11: Continuous-valued signals and differential entropy
Lecture 11: Continuous-valued signals and differential entropy Biology 429 Carl Bergstrom September 20, 2008 Sources: Parts of today s lecture follow Chapter 8 from Cover and Thomas (2007). Some components
More informationInformation Theory Primer:
Information Theory Primer: Entropy, KL Divergence, Mutual Information, Jensen s inequality Seungjin Choi Department of Computer Science and Engineering Pohang University of Science and Technology 77 Cheongam-ro,
More informationMultiuser Capacity in Block Fading Channel
Multiuser Capacity in Block Fading Channel April 2003 1 Introduction and Model We use a block-fading model, with coherence interval T where M independent users simultaneously transmit to a single receiver
More informationMobile Communications (KECE425) Lecture Note Prof. Young-Chai Ko
Mobile Communications (KECE425) Lecture Note 20 5-19-2014 Prof Young-Chai Ko Summary Complexity issues of diversity systems ADC and Nyquist sampling theorem Transmit diversity Channel is known at the transmitter
More informationCapacity estimates of wireless networks in Poisson shot model over uniform or fractal Cantor maps. Philippe Jacquet Bell Labs
Capacity estimates of wireless networks in Poisson shot model over uniform or fractal Cantor maps Philippe Jacquet Bell Labs Simons Conference on Networks and Stochastic Geometry Classic model: Wireless
More informationLecture 14 February 28
EE/Stats 376A: Information Theory Winter 07 Lecture 4 February 8 Lecturer: David Tse Scribe: Sagnik M, Vivek B 4 Outline Gaussian channel and capacity Information measures for continuous random variables
More informationEE6604 Personal & Mobile Communications. Week 13. Multi-antenna Techniques
EE6604 Personal & Mobile Communications Week 13 Multi-antenna Techniques 1 Diversity Methods Diversity combats fading by providing the receiver with multiple uncorrelated replicas of the same information
More informationLower Bounds on the Graphical Complexity of Finite-Length LDPC Codes
Lower Bounds on the Graphical Complexity of Finite-Length LDPC Codes Igal Sason Department of Electrical Engineering Technion - Israel Institute of Technology Haifa 32000, Israel 2009 IEEE International
More informationECE6604 PERSONAL & MOBILE COMMUNICATIONS. Week 3. Flat Fading Channels Envelope Distribution Autocorrelation of a Random Process
1 ECE6604 PERSONAL & MOBILE COMMUNICATIONS Week 3 Flat Fading Channels Envelope Distribution Autocorrelation of a Random Process 2 Multipath-Fading Mechanism local scatterers mobile subscriber base station
More informationEE376A - Information Theory Final, Monday March 14th 2016 Solutions. Please start answering each question on a new page of the answer booklet.
EE376A - Information Theory Final, Monday March 14th 216 Solutions Instructions: You have three hours, 3.3PM - 6.3PM The exam has 4 questions, totaling 12 points. Please start answering each question on
More informationAgenda. Background. System model. Optimal Design of Adaptive Coded Modulation Schemes for Maximum Average Spectral E$ciency
Optimal Design of Adaptive Coded Modulation Schemes for Maximum Average Spectral E$ciency Henrik Holm Mohamed#Slim Alouini David Gesbert Geir E. %ien Frode B&hagen Kjell J. Hole Norwegian University of
More informationApproximately achieving the feedback interference channel capacity with point-to-point codes
Approximately achieving the feedback interference channel capacity with point-to-point codes Joyson Sebastian*, Can Karakus*, Suhas Diggavi* Abstract Superposition codes with rate-splitting have been used
More informationCapacity-achieving Feedback Scheme for Flat Fading Channels with Channel State Information
Capacity-achieving Feedback Scheme for Flat Fading Channels with Channel State Information Jialing Liu liujl@iastate.edu Sekhar Tatikonda sekhar.tatikonda@yale.edu Nicola Elia nelia@iastate.edu Dept. of
More informationCapacity of Memoryless Channels and Block-Fading Channels With Designable Cardinality-Constrained Channel State Feedback
2038 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 50, NO. 9, SEPTEMBER 2004 Capacity of Memoryless Channels and Block-Fading Channels With Designable Cardinality-Constrained Channel State Feedback Vincent
More informationMinimum Feedback Rates for Multi-Carrier Transmission With Correlated Frequency Selective Fading
Minimum Feedback Rates for Multi-Carrier Transmission With Correlated Frequency Selective Fading Yakun Sun and Michael L. Honig Department of ECE orthwestern University Evanston, IL 60208 Abstract We consider
More information