Lecture 2. Capacity of the Gaussian channel
2.1 Lecture 2: Capacity of the Gaussian Channel

- Review of basic concepts in information theory (Cover&Thomas: Elements of Information Theory; Tse&Viswanath: Appendix B)
- AWGN channel capacity (Tse&Viswanath: Chapter 5.1, Appendix B)
- Resources (power and bandwidth) of the AWGN channel
- Linear time-invariant Gaussian channels:
  1. Single-input multiple-output (SIMO) channel
  2. Multiple-input single-output (MISO) channel
  3. Frequency-selective channel
2.2 Entropy

Entropy of a discrete random variable $x$ with alphabet $\mathcal{X}$ and probability mass function $p_x(i) = \Pr(x = i)$, $i \in \mathcal{X}$:

$$H(x) = \sum_{i \in \mathcal{X}} p_x(i) \log\frac{1}{p_x(i)} \qquad (2.1)$$

- $H(x)$: the average amount of uncertainty associated with the random variable $x$ = the information obtained when observing $x$
- $0 \le H(x) \le \log|\mathcal{X}|$
- $H(x) = 0$: no uncertainty, $x$ is deterministic
- $H(x) = \log|\mathcal{X}|$: attained when $x$ is uniformly distributed over $\mathcal{X}$

All logarithms are taken to base 2 unless specified otherwise.
2.3 Example: Binary Entropy Function

$$H_B(p) = -p \log p - (1-p) \log(1-p) \qquad (2.2)$$

[Figure: $H_B(p)$ versus $p$; symmetric about $p = 1/2$, where it attains its maximum of 1 bit.]
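These definitions translate directly into code. Below is a minimal Python sketch (an addition for illustration; the lecture itself contains no code) of the entropy of a pmf and the binary entropy function (2.2):

```python
import numpy as np

def entropy(pmf):
    """H(x) = sum_i p(i) log2(1/p(i)); zero-probability terms contribute 0."""
    p = np.asarray(pmf, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * np.log2(1.0 / p)))

def binary_entropy(p):
    """H_B(p) = -p log2(p) - (1-p) log2(1-p), eq. (2.2)."""
    return entropy([p, 1.0 - p])

print(entropy([0.25] * 4))     # uniform over 4 symbols: log2(4) = 2 bits
print(binary_entropy(0.5))     # maximum of H_B: 1 bit
print(binary_entropy(0.11))    # ~0.5 bits
```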
2.4 Joint and Conditional Entropy

The joint entropy $H(x,y)$ of a pair of discrete random variables $(x,y)$ with a joint distribution $p_{x,y}$ is defined as

$$H(x,y) = \sum_{i \in \mathcal{X}} \sum_{j \in \mathcal{Y}} p_{x,y}(i,j) \log\frac{1}{p_{x,y}(i,j)} \qquad (2.3)$$

The conditional entropy:

$$H(y|x) = \sum_{i \in \mathcal{X}} p_x(i)\, H(y|x = i) \qquad (2.4)$$

$$= \sum_{i \in \mathcal{X}} p_x(i) \sum_{j \in \mathcal{Y}} p_{y|x}(j|i) \log\frac{1}{p_{y|x}(j|i)} \qquad (2.5)$$

$$= \sum_{i \in \mathcal{X},\, j \in \mathcal{Y}} p_{x,y}(i,j) \log\frac{1}{p_{y|x}(j|i)} \qquad (2.6)$$

$H(y|x)$: the average amount of uncertainty left in $y$ after observing $x$.
2.5 Chain Rule

The chain rule for entropies:

$$H(x,y) = H(x) + H(y|x) = H(y) + H(x|y) \qquad (2.7)$$

Note that $H(x|y) = H(x)$ and $H(y|x) = H(y)$ if $x$ and $y$ are independent; thus

$$H(x,y) = H(x) + H(y|x) \le H(x) + H(y) \qquad (2.8)$$

$H(y|x) = 0$ if $y$ can be fully recovered after observing $x$: no uncertainty left in $y$.
2.6 Mutual Information

Relative entropy between two pmfs $p_x$ and $q_x$: $\sum_{i \in \mathcal{X}} p_x(i) \log\frac{p_x(i)}{q_x(i)}$

Mutual information $I(x;y)$: the relative entropy between the joint distribution $p_{x,y}$ and the product distribution $p_x p_y$:

$$I(x;y) = \sum_{i \in \mathcal{X}} \sum_{j \in \mathcal{Y}} p_{x,y}(i,j) \log\frac{p_{x,y}(i,j)}{p_x(i)\, p_y(j)} \qquad (2.9)$$

$$= H(x) + H(y) - H(x,y) \qquad (2.10)$$

$$= H(x) - H(x|y) = H(y) - H(y|x) \qquad (2.11)$$

- A measure of the amount of (mutual) information that $y$ (or $x$) contains about $x$ (or $y$)
- The reduction in uncertainty of $y$ (or $x$) due to the knowledge of $x$ (or $y$)
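A companion sketch for (2.9)–(2.11), computing $I(x;y)$ from a joint pmf matrix; it reuses the `entropy` helper defined in the sketch above:

```python
import numpy as np

def mutual_information(p_xy):
    """I(x;y) = H(x) + H(y) - H(x,y), eq. (2.10); p_xy[i, j] is the joint pmf."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)   # marginal pmf of x
    p_y = p_xy.sum(axis=0)   # marginal pmf of y
    return entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

# Independent x and y: joint = product of marginals, so I(x;y) = 0
print(mutual_information(np.outer([0.3, 0.7], [0.5, 0.5])))
# y = x with x uniform over {0,1}: I(x;y) = H(x) = 1 bit
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
```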
2.7 Entropy and Mutual Information

[Venn diagram: $H(x,y)$ split into $H(x|y)$, $I(x;y)$ and $H(y|x)$, with circles $H(x)$ and $H(y)$.]

- $H(x,y) = H(x) + H(y|x) = H(y) + H(x|y)$
- $H(x,y) \le H(x) + H(y)$
- $0 \le H(x|y) \le H(x)$ and $0 \le H(y|x) \le H(y)$
- $I(x;y) = H(x) - H(x|y) = H(y) - H(y|x) = H(x) + H(y) - H(x,y)$
- $I(x;y) = I(y;x)$, and $I(x;x) = H(x)$
2.8 Channel Capacity

Discrete memoryless channel (DMC): input $x[m] \in \mathcal{X}$ and output $y[m] \in \mathcal{Y}$, transition probability $p(y|x)$.

Convey one of $M = |\mathcal{I}|$ equally likely messages by mapping each to its $N$-length codeword in $\mathcal{I} = \{\mathbf{x}_1, \dots, \mathbf{x}_M\}$. The input sequence is the $N$-dimensional random vector $\mathbf{x} = (x[1], \dots, x[N])$.

[Block diagram: message $i \in \{1, \dots, M\}$ → Encoder → $\mathbf{x}_i = (x_i[1], \dots, x_i[N])$ → Channel $p(y|x)$ → $\mathbf{y} = (y[1], \dots, y[N])$ → Decoder → $\hat{i}$.]

What is the maximum achievable bit rate

$$R = \frac{\log M}{N} \qquad (2.12)$$

such that the average probability of error

$$P_e = \Pr(i \neq \hat{i}) \qquad (2.13)$$

tends to 0 as $N \to \infty$?
2.9 Channel Capacity

Entropy $H(\mathbf{x}) = \log M = NR$, and reliable communications ($P_e \to 0$) requires $H(\mathbf{x}|\mathbf{y}) \to 0$; hence, from $I(\mathbf{x};\mathbf{y}) = H(\mathbf{x}) - H(\mathbf{x}|\mathbf{y})$,

$$R \le \frac{1}{N} I(\mathbf{x};\mathbf{y}) \qquad (2.14)$$

Upper bound: note that

$$I(\mathbf{x};\mathbf{y}) \le \sum_{m=1}^{N} I(x[m]; y[m]) \qquad (2.15)\text{–}(2.16)$$

Equality is attained if the inputs are made independent over time:

$$\frac{1}{N} \max_{p_{x[m]}} \sum_{m=1}^{N} I(x[m]; y[m]) = \max_{p_x} I(x;y) \qquad (2.17)$$

The $N$-dimensional combinatorial problem is reduced to an optimization problem over input distributions on single symbols.
2.10 Channel Capacity

Is there a code that can provide a rate close to (2.17) such that $P_e \to 0$? Shannon: such codes exist if $N$ is chosen large enough; see the detailed proofs in Cover&Thomas, Elements of Information Theory, Chapter 7.

The channel capacity of a discrete memoryless channel is

$$C = \max_{p_x} I(x;y) \qquad (2.18)$$

where the maximum is taken over all input distributions $p_x$. $I(x;y)$ is a concave function of $p_x$ for fixed $p_{y|x}$ → convex optimization problem (theorem in Cover&Thomas).
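The maximization in (2.18) can also be carried out numerically. Below is a sketch of the classical Blahut–Arimoto algorithm, added for illustration (the lecture only states the convexity result); `p_y_x` is an assumed transition-matrix representation of the DMC:

```python
import numpy as np

def blahut_arimoto(p_y_x, n_iter=200):
    """Approximate C = max_{p_x} I(x;y) for a DMC with transition matrix
    p_y_x[i, j] = p(y=j | x=i) (rows sum to 1). Returns the capacity in
    bits and the capacity-achieving input distribution."""
    n_in = p_y_x.shape[0]
    p_x = np.full(n_in, 1.0 / n_in)   # start from the uniform input

    def divergences(p_x):
        # d[i] = D( p(y|x=i) || p_y ) in bits, with p_y the output pmf
        p_y = p_x @ p_y_x
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(p_y_x > 0, np.log2(p_y_x / p_y), 0.0)
        return np.sum(p_y_x * log_ratio, axis=1)

    for _ in range(n_iter):
        p_x = p_x * np.exp2(divergences(p_x))   # multiplicative update ...
        p_x /= p_x.sum()                        # ... renormalized to a pmf
    return float(p_x @ divergences(p_x)), p_x
```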
2.11 Example: Binary Symmetric Channel

$\mathcal{X} = \mathcal{Y} = \{0, 1\}$, with $p(0|1) = p(1|0) = p$ and $p(1|1) = p(0|0) = 1 - p$.

[Channel diagram: $0 \to 0$ and $1 \to 1$ with probability $1-p$; crossovers $0 \to 1$ and $1 \to 0$ with probability $p$.]

$$I(x;y) = H(y) - H(y|x) = H(y) - \sum_{i \in \mathcal{X}} p_x(i)\, H(y|x=i) = H(y) - \sum_{i \in \mathcal{X}} p_x(i)\, H_B(p) = H(y) - H_B(p)$$

The capacity is achieved when the input distribution $p_x$ is uniform:

$$C = \max_{p_x} I(x;y) = 1 - H_B(p) \qquad (2.19)$$
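As a quick numerical check of (2.19), the `blahut_arimoto` and `binary_entropy` sketches above recover $1 - H_B(p)$ and a uniform input for the BSC:

```python
p = 0.11
bsc = np.array([[1 - p, p],
                [p, 1 - p]])
C, p_x = blahut_arimoto(bsc)
print(C, 1 - binary_entropy(p))   # both ~0.5 bits
print(p_x)                        # ~[0.5, 0.5]: uniform input is optimal
```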
2.12 Differential Entropy

Entropy of a continuous random variable: for a continuous RV $x$ with pdf $f_x$,

$$h(x) = \int f_x(u) \log\frac{1}{f_x(u)}\, du \qquad (2.20)$$

Similarly, the mutual information between $x$ and $y$ with joint pdf $f_{x,y}$:

$$I(x;y) = \iint f_{x,y}(u,v) \log\frac{f_{x,y}(u,v)}{f_x(u)\, f_y(v)}\, du\, dv \qquad (2.21)$$

The properties of $I(x;y)$ are the same as in the discrete case.

Example: normal distribution, $f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-x^2/2\sigma^2}$ (Example 8.1.2 in Cover&Thomas):

$$h(x) = \frac{1}{2}\log 2\pi e \sigma^2 \qquad (2.22)$$
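A Monte-Carlo sanity check of (2.22), added as a sketch; it estimates $h(x) = E[-\log_2 f_x(x)]$ by sampling:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(0.0, sigma, size=1_000_000)

# log2 of the Gaussian pdf evaluated at the samples
log2_f = -0.5 * np.log2(2 * np.pi * sigma**2) - (x**2 / (2 * sigma**2)) * np.log2(np.e)

print(-log2_f.mean())                              # Monte-Carlo estimate of h(x)
print(0.5 * np.log2(2 * np.pi * np.e * sigma**2))  # exact: ~3.05 bits
```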
2.13 The Gaussian Channel

Impose an average power constraint on every codeword $\mathbf{x}_n \in \mathcal{I}$:

$$\frac{1}{N}\sum_{m=1}^{N} x_n^2[m] \le P, \quad \forall n \qquad (2.23)$$

The capacity of a continuous-valued channel with power constraint $P$ can be shown to be

$$C = \max_{f_x:\, E[x^2] \le P} I(x;y) \qquad (2.24)$$

The proof consists of three steps:
1. Discretize the continuous-valued input and output of the channel
2. Approximate it by discrete memoryless channels with increasing alphabet sizes
3. Take limits appropriately
2.14 Capacity of the Gaussian Channel

[Block diagram: message $\omega$ → encoder → $x[m]$ → + noise $w[m]$ (with $w$ independent of $x$) → $y[m]$ → decoder → $\hat{\omega}$.]

Now $h(w) = \frac{1}{2}\log 2\pi e\sigma^2$, and $E[y^2] = P + \sigma^2$:

$$I(x;y) = h(y) - h(y|x) = h(y) - h(x + w \mid x) = h(y) - h(w|x) = h(y) - h(w)$$

Also, $h(y)$ is maximized by choosing $x$ from $\mathcal{N}(0, P)$:

$$C = \max_{f_x:\, E[x^2]\le P} I(x;y) = \frac{1}{2}\log 2\pi e(P+\sigma^2) - \frac{1}{2}\log 2\pi e \sigma^2 = \frac{1}{2}\log\Big(1 + \frac{P}{\sigma^2}\Big) \qquad (2.25)$$

Complex baseband AWGN channel: $C = \log(1 + P/\sigma^2)$ bits per complex dimension!
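Equation (2.25) and its complex-baseband counterpart are one-liners; a sketch (the 10 dB operating point is an arbitrary example):

```python
import numpy as np

def awgn_capacity(snr, complex_dim=False):
    """(1/2) log2(1 + SNR) bits per real dimension, eq. (2.25);
    log2(1 + SNR) bits per complex dimension."""
    c = 0.5 * np.log2(1.0 + snr)
    return 2.0 * c if complex_dim else c

snr = 10 ** (10 / 10)                        # 10 dB
print(awgn_capacity(snr))                    # ~1.73 bits per real dimension
print(awgn_capacity(snr, complex_dim=True))  # ~3.46 bits per complex dimension
```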
2.15 Sphere Packing Interpretation

[Figure: sphere of radius $\sqrt{N(P+\sigma^2)}$ packed with noise spheres of radius $\sqrt{N\sigma^2}$ around the codewords.]

- Assume $N \gg 1$. The $N$-dimensional RX vector $\mathbf{y} = \mathbf{x} + \mathbf{w}$ lies within a sphere of radius $r_y = \sqrt{N(P + \sigma^2)}$
- Since $\frac{1}{N}\sum_{m=1}^{N} w[m]^2 \to \sigma^2$, $\mathbf{y}$ lies near the surface of the noise sphere of radius $r_w = \sqrt{N\sigma^2}$ around the transmitted codeword
- The maximum number of codewords is the ratio between the two volumes, $V_y(r_y)$ and $V_w(r_w)$
- The volume of an $N$-dimensional sphere of radius $r$ is proportional to $r^N$; thus the maximum number of bits per symbol is

$$\frac{1}{N}\log\frac{\big(\sqrt{N(P+\sigma^2)}\big)^N}{\big(\sqrt{N\sigma^2}\big)^N} = \frac{1}{2}\log\Big(1 + \frac{P}{\sigma^2}\Big) \qquad (2.26)$$
2.16 Power- and Bandwidth-Constrained Capacity

Consider a continuous-time AWGN channel with bandwidth $W$ [Hz], power constraint $P$ [Watts] and Gaussian noise with psd $N_0/2$ [Watts/Hz].

Discrete-time complex baseband signal:

$$y[m] = x[m] + w[m] \qquad (2.27)$$

where $w[m] \sim \mathcal{CN}(0, N_0)$.

Independent noise in both I and Q branches → 2 uses of a real AWGN channel:

$$C = 2 \cdot \frac{1}{2}\log\Big(1 + \frac{P}{N_0 W}\Big) \ \text{bits per complex dimension} \qquad (2.28)$$

$W$ complex samples per second:

$$C(P, W) = W \log\Big(1 + \frac{P}{N_0 W}\Big) \ \text{bits/s} \qquad (2.29)$$
2.17 Power- and Bandwidth-Constrained Capacity

Maximum achievable spectral efficiency: $C(\gamma) = \log(1 + \gamma)$, where the SNR $\gamma = \frac{P}{N_0 W}$

- Low SNR region: $C(\gamma) \approx \gamma \log_2 e$, linear as a function of $\gamma$
- High SNR region: $C(\gamma) \approx \log_2 \gamma$, logarithmic as a function of $\gamma$

[Figure: $\log(1 + \mathrm{SNR})$ versus SNR.]
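A quick numerical look at the two regimes (a sketch):

```python
import numpy as np

for snr in (0.01, 0.1, 10.0, 100.0):
    exact = np.log2(1 + snr)
    low = snr * np.log2(np.e)   # low-SNR (power-limited) approximation
    high = np.log2(snr)         # high-SNR (bandwidth-limited) approximation
    print(f"SNR={snr:7.2f}: C={exact:7.4f}  low~{low:7.4f}  high~{high:7.4f}")
```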
2.18 Power- and Bandwidth-Constrained Capacity

[Figure: capacity $C(W)$ in Mbps versus bandwidth $W$ in MHz for fixed $P/N_0$: bandwidth-limited region at small $W$, power-limited region at large $W$, approaching the capacity limit $\frac{P}{N_0}\log_2 e$ as $W \to \infty$.]
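The power-limited ceiling is easy to reproduce (a sketch; the value of $P/N_0$ is an arbitrary assumption):

```python
import numpy as np

p_over_n0 = 1e6                         # assumed P/N0, in Hz
for w in (1e5, 1e6, 1e7, 1e8):          # bandwidth in Hz
    c = w * np.log2(1 + p_over_n0 / w)  # eq. (2.29)
    print(f"W = {w:9.0f} Hz: C = {c / 1e6:.3f} Mbit/s")

print("W -> infinity limit:", p_over_n0 * np.log2(np.e) / 1e6, "Mbit/s")
```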
2.19 Linear Time-invariant Gaussian Channels

Examples of channels closely related to the simple AWGN channel:

- Single-input multiple-output (SIMO) channel
- Multiple-input single-output (MISO) channel
- Frequency-selective channel → parallel Gaussian channels

Time-invariant: optimal codes can be constructed directly from AWGN optimal codes, and capacity is easy to compute.

[Figure: (c) channel impulse response, amplitude (linear scale) versus time (ns); (d) power spectrum (dB) versus frequency (GHz), ~40 MHz span.]
2.20 Single-input Multiple-output (SIMO) Channel

SIMO channel with one TX antenna and $L$ RX antennas:

$$y_l[m] = h_l\, x[m] + w_l[m], \quad l = 1, \dots, L \qquad (2.30)$$

where $h_l$ is the fixed complex channel gain between the TX and the $l$th RX antenna, and $w_l[m] \sim \mathcal{CN}(0, N_0)$ is i.i.d. noise across antennas.

Detection of $x[m]$ from $\mathbf{y}[m] = [y_1[m], \dots, y_L[m]]^T$:

$$\hat{x}[m] = \mathbf{f}^H \mathbf{y}[m] = \mathbf{f}^H \mathbf{h}\, x[m] + \mathbf{f}^H \mathbf{w}[m] \qquad (2.31)$$

where $\mathbf{h} = [h_1, \dots, h_L]^T$ and $\mathbf{w}[m] = [w_1[m], \dots, w_L[m]]^T$.

Optimal $\mathbf{f} = \mathbf{h}$: maximum ratio combining (MRC), or matched filtering (MF).

SIMO capacity with $\gamma = \dfrac{E\big[|\mathbf{h}^H\mathbf{h}\, x[m]|^2\big]}{E\big[|\mathbf{h}^H\mathbf{w}[m]|^2\big]} = \dfrac{P\|\mathbf{h}\|^2}{N_0}$:

$$C = \log\Big(1 + \frac{P\|\mathbf{h}\|^2}{N_0}\Big) \ \text{bits/s/Hz} \qquad (2.32)$$
2.21 Multiple-input Single-output (MISO) Channel

MISO channel with one RX antenna and $L$ TX antennas:

$$y[m] = \mathbf{h}^H \mathbf{x}[m] + w[m] \qquad (2.33)$$

where $\mathbf{h} = [h_1, \dots, h_L]^T$, and $h_l$ is the fixed complex channel gain between the $l$th TX antenna and the RX antenna.

Reciprocal to the SIMO channel → the optimal TX strategy is to align $\mathbf{x}$ with $\mathbf{h}$ using a beamformer $\mathbf{f}$ with $\|\mathbf{f}\| = 1$:

$$\mathbf{x}[m] = \mathbf{f}\, x[m] = \frac{\mathbf{h}}{\|\mathbf{h}\|}\, x[m] \qquad (2.34)$$

MISO capacity with $\gamma = E\Big[\big|\mathbf{h}^H \tfrac{\mathbf{h}}{\|\mathbf{h}\|}\, x[m]\big|^2\Big] \big/ E\big[|w[m]|^2\big] = \dfrac{P\|\mathbf{h}\|^2}{N_0}$:

$$C = \log\Big(1 + \frac{P\|\mathbf{h}\|^2}{N_0}\Big) \ \text{bits/s/Hz} \qquad (2.35)$$

- $P$ is the total power constraint across the $L$ antennas
- Requires CSI at the transmitter!
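A small sketch (with an arbitrary random channel realization) verifying that receive MRC in the SIMO channel and transmit beamforming in the MISO channel both deliver the SNR $P\|\mathbf{h}\|^2/N_0$, so (2.32) and (2.35) coincide:

```python
import numpy as np

rng = np.random.default_rng(1)
L, P, N0 = 4, 1.0, 0.1
h = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2)  # fixed channel

# SIMO with MRC (f = h): post-combining SNR
snr_simo = P * np.abs(h.conj() @ h) ** 2 / (np.linalg.norm(h) ** 2 * N0)

# MISO with TX beamformer f = h / ||h||: received SNR
f = h / np.linalg.norm(h)
snr_miso = P * np.abs(h.conj() @ f) ** 2 / N0

snr = P * np.linalg.norm(h) ** 2 / N0
print(snr_simo, snr_miso, snr)               # all three agree
print("C =", np.log2(1 + snr), "bits/s/Hz")  # (2.32) = (2.35)
```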
2.22 Frequency-selective Channel

$L$-tap frequency-selective AWGN channel:

$$y[m] = \sum_{l=0}^{L-1} h_l\, x[m-l] + w[m] \qquad (2.36)$$

OFDM converts (2.36) into $N_C$ parallel (sub-)channels, each of which is an AWGN channel with gain $\tilde{h}_n$:

$$\tilde{y}_n = \tilde{h}_n d_n + \tilde{w}_n, \quad n = 1, \dots, N_C \qquad (2.37)$$

Given the power allocation $p_n$ for each $n$, the maximum achievable rate per OFDM symbol is

$$\sum_{n=1}^{N_C} \log\Big(1 + \frac{p_n |\tilde{h}_n|^2}{N_0}\Big) \ \text{bits/OFDM symbol} \qquad (2.38)$$
2.23 Frequency-selective Channel: Optimal Power Allocation

Power allocation to maximize (2.38) subject to the power constraint $\sum_n E\big[|d_n|^2\big] \le N_C P$.

The optimal power allocation is the solution to

$$\max_{p_1, \dots, p_{N_C}} \ \frac{1}{N_C}\sum_{n=1}^{N_C} \log\Big(1 + \frac{p_n |\tilde{h}_n|^2}{N_0}\Big) \qquad (2.39)$$

$$\text{s.t.} \quad \frac{1}{N_C}\sum_{n=1}^{N_C} p_n = P, \quad p_n \ge 0, \ \forall n \qquad (2.40)$$

where the variables are $p_1, \dots, p_{N_C}$.

Concave objective & linear constraints → convex optimization problem. The optimal power allocation can be explicitly found.
2.24 Waterfilling

Lagrangian:

$$L(\nu, \lambda_1, \dots, \lambda_{N_C}, p_1, \dots, p_{N_C}) = \sum_{n=1}^{N_C} \log\Big(1 + \frac{p_n |\tilde{h}_n|^2}{N_0}\Big) - \nu\Big(\sum_{n=1}^{N_C} p_n - N_C P\Big) + \sum_{n=1}^{N_C} \lambda_n p_n \qquad (2.41)$$

where $\nu$ and $\lambda_1, \dots, \lambda_{N_C}$ are Lagrange multipliers.

Karush-Kuhn-Tucker (KKT) conditions:

$$p_n \ge 0 \ \ \forall n, \qquad \frac{1}{N_C}\sum_{n=1}^{N_C} p_n = P \qquad (2.42)$$

$$\lambda_n \ge 0, \quad \lambda_n p_n = 0 \ \ \forall n \qquad (2.43)$$

$$\frac{1}{p_n + N_0/|\tilde{h}_n|^2} - \nu + \lambda_n = 0 \ \ \forall n \qquad (2.44)$$

Eliminating $\lambda_n$:

$$\Big(\frac{1}{p_n + N_0/|\tilde{h}_n|^2} - \nu\Big)\, p_n = 0 \quad \text{and} \quad \frac{1}{p_n + N_0/|\tilde{h}_n|^2} \le \nu \ \ \forall n \qquad (2.45)$$
2.25 Waterfilling

From (2.42)–(2.45),

$$p_n^* = \Big(\frac{1}{\nu} - \frac{N_0}{|\tilde{h}_n|^2}\Big)^+ \qquad \text{(waterfilling)}$$

where $(x)^+ = \max(x, 0)$. The optimal $\nu$ can be found by bisection, for example, from

$$\frac{1}{N_C}\sum_{n=1}^{N_C} \Big(\frac{1}{\nu} - \frac{N_0}{|\tilde{h}_n|^2}\Big)^+ = P \qquad (2.46)$$

[Figure: waterfilling over subcarriers; the water level $1/\nu$ sits above the noise-to-gain floor $N_0/|\tilde{h}_n|^2$, the allocated powers $p_n^*$ fill the gaps, and $p_n^* = 0$ where the floor exceeds the water level.]
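A minimal sketch of waterfilling with bisection on the water level $1/\nu$, as suggested above (the gain values are an arbitrary example):

```python
import numpy as np

def waterfill(gains, P, N0=1.0, tol=1e-12):
    """Solve (2.46): average power P spread over subcarriers with gains
    |h_n|^2, via bisection on the water level mu = 1/nu."""
    floor = N0 / np.asarray(gains, dtype=float)  # N0 / |h_n|^2 per subcarrier
    lo, hi = 0.0, floor.min() + P * len(floor)   # brackets the water level
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        # average allocated power is nondecreasing in the water level
        if np.maximum(mu - floor, 0.0).mean() > P:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.5 * (lo + hi) - floor, 0.0)

gains = np.array([2.0, 1.0, 0.5, 0.05])   # example |h_n|^2 values
p = waterfill(gains, P=1.0)
print(p, p.mean())                        # weakest subcarrier gets zero power
print("rate:", np.sum(np.log2(1 + p * gains)), "bits/OFDM symbol")
```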