Entropy, Inference, and Channel Coding


1 Entropy, Inference, and Channel Coding. Sean Meyn, Department of Electrical and Computer Engineering, University of Illinois, and the Coordinated Science Laboratory. NSF support: ECS, ITR, and CCF grants.

2 Overview. Hypothesis testing and channel coding; structure of optimal codes; error exponents; algorithms. [Figure: error exponent $E_r(R)$ versus rate $R$ for the optimal code and for QAM.]

3 References: large deviations. Dembo and Zeitouni, Large Deviations Techniques and Applications, 1998; Kontoyiannis, Lastras-Montano and Meyn, Relative Entropy and Exponential Deviation Bounds for General Markov Chains, ISIT, 2005; Pandit and Meyn, Extremal Distributions and Worst-Case Large-Deviation Bounds, 2004. Hypothesis testing: D&Z 1998; Zeitouni and Gutman, On Universal Hypothesis Testing via Large Deviations, IEEE Trans. Information Theory IT-37, 1991; Pandit, Meyn and Veeravalli, Asymptotic Robust Neyman-Pearson Testing Based on Moment Classes, ISIT, 2004.

4 References: channel coding. Csiszár and Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, New York, 1997; MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003; Blahut, Hypothesis Testing and Information Theory, IEEE Trans. Information Theory IT-20, 1974.

5 Outline (today): Introduction; Relative entropy & large deviations; Hypothesis testing; Channel capacity; Conclusions.

6 Memoryless Channel Model. Memoryless channel with input sequence $X$, output sequence $Y$. Channel kernel $P(dy \mid x) = \mathsf P\{Y_t \in dy \mid X_t = x\}$. If $X$ is i.i.d. with marginal distribution $\mu$, then $Y$ is i.i.d. with marginal distribution $\pi$, where $\pi(\,\cdot\,) = \int P(\,\cdot \mid x)\,\mu(dx)$.
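A minimal numerical sketch of this setup, assuming a binary alphabet and an illustrative kernel $P$ chosen here for demonstration (neither is from the talk): push an i.i.d. input with marginal $\mu$ through a discrete memoryless channel and check that the empirical output marginal matches $\pi(\cdot) = \int P(\cdot \mid x)\,\mu(dx)$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete memoryless channel on X = Y = {0, 1}: P[x, y] = P{Y_t = y | X_t = x}
mu = np.array([0.5, 0.5])            # input marginal distribution (assumed)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])           # channel kernel; rows sum to 1 (assumed)

pi = mu @ P                          # output marginal: pi(y) = sum_x mu(x) P(y|x)

# Simulate an i.i.d. input sequence through the memoryless channel
n = 100_000
X = rng.choice(2, size=n, p=mu)
Y = (rng.random(n) < P[X, 1]).astype(int)   # Y_t = 1 with probability P(1|X_t)

print("pi =", pi, " empirical:", np.bincount(Y, minlength=2) / n)
```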

7 Random Codebook. Channel kernel $P(dy \mid x) = \mathsf P\{Y_t \in dy \mid X_t = x\}$. $N$-dimensional code words $X^i$, $i = 1, 2, \dots, e^{NR}$. $N$-dimensional output $Y$ received: i.i.d., with marginal distribution $\pi$.

8 IEEE Std 802.11a, Supplement to IEEE Standard for Information Technology. [Figure: I/Q constellation diagrams with bit labels $b_0 b_1 \dots$: BPSK, QPSK, 16-QAM, and 64-QAM.]
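As a hedged illustration of these constellations (not code from the talk), the sketch below builds unit-average-energy square $M$-QAM constellations; the level spacing and normalization follow the standard square-QAM construction.

```python
import numpy as np

def square_qam(M):
    """Unit-average-energy square M-QAM constellation (M a perfect square)."""
    m = int(round(np.sqrt(M)))
    assert m * m == M, "square QAM needs M to be a perfect square"
    levels = 2 * np.arange(m) - (m - 1)              # e.g. [-3, -1, 1, 3] for 16-QAM
    pts = (levels[:, None] + 1j * levels[None, :]).ravel()
    return pts / np.sqrt(np.mean(np.abs(pts) ** 2))  # normalize average energy to 1

for M in (4, 16, 64):
    c = square_qam(M)
    print(M, len(c), round(np.mean(np.abs(c) ** 2), 6))   # average energy is 1
```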

9 Questions & Objectives. 1. What is the structure of the optimal $\mu$? 2. Construct algorithms based on this structure. 3. Worst-case modeling to simplify code construction. 4. Decoding algorithms and evaluation.

10 Questions & Objectives. 1. What is the structure of the optimal $\mu$? 2. Construct algorithms based on this structure. 3. Worst-case modeling to simplify code construction. 4. Decoding algorithms and evaluation. Methodology & viewpoint: hypothesis testing; large deviations; convex & linear optimization theory.

11 Example: Rayleigh Channel. $Y = AX + N$. $A$ and $N$ are i.i.d. and mutually independent: $\sigma_A^2 = 1$, $\sigma_N^2 = 1$, and $\sigma_P^2 = 26.4$ (SNR = 14.2 dB).

12 Example: Rayleigh Channel. $Y = AX + N$, with $\sigma_A^2 = 1$, $\sigma_N^2 = 1$, and $\sigma_P^2 = 26.4$ (SNR = 14.2 dB). Standard input: 16-point QAM, with rate $I = 0.2$ nats/symbol. [Figure: 16-point QAM constellation.]

13 Example: Rayleigh Channel. $Y = AX + N$, with $\sigma_A^2 = 1$, $\sigma_N^2 = 1$, and $\sigma_P^2 = 26.4$ (SNR = 14.2 dB). [Figure: 16-point QAM constellation alongside a three-point constellation.]

14 Example: Rayleigh Channel. $Y = AX + N$, with $\sigma_A^2 = 1$, $\sigma_N^2 = 1$, and $\sigma_P^2 = 26.4$ (SNR = 14.2 dB). [Figure: error exponent $E_r(R)$ versus $R$: the three-point distribution gives a three-fold improvement over 16-point QAM.]
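A quick simulation sketch of this fading model, using the slide's parameters ($\sigma_A^2 = \sigma_N^2 = 1$, $\sigma_P^2 = 26.4$) with a 16-QAM input; modeling $A$ and $N$ as circularly-symmetric complex Gaussians (so $|A|$ is Rayleigh) is an assumption made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, sigma2_P = 200_000, 26.4      # input power; sigma_A^2 = sigma_N^2 = 1

# Circularly-symmetric complex fading A and noise N, each with unit variance
A = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
N = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

# 16-point QAM input, scaled to average power sigma2_P
levels = np.array([-3.0, -1.0, 1.0, 3.0])
pts = (levels[:, None] + 1j * levels[None, :]).ravel()
const = pts / np.sqrt(np.mean(np.abs(pts) ** 2))
X = np.sqrt(sigma2_P) * rng.choice(const, size=n)

Y = A * X + N
snr_db = 10 * np.log10(np.mean(np.abs(A * X) ** 2) / np.mean(np.abs(N) ** 2))
print(snr_db)                    # approx 10*log10(26.4) = 14.2 dB
```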

15 Outline Introduction Relative entropy & Large deviations Hypothesis testing Channel capacity Conclusions

16 Large Deviations. $X = \{X_1, X_2, \dots\}$ a nice Markov chain on $\mathsf X$, marginal distribution $\mu$. Simulate a function $g : \mathsf X \to \mathbb R$: $\hat c_n = n^{-1} \sum_{t=1}^n g(X_t)$.

17 Large Deviations. $X = \{X_1, X_2, \dots\}$ a nice Markov chain on $\mathsf X$, marginal distribution $\mu$. Simulate a function $g : \mathsf X \to \mathbb R$: $\hat c_n = n^{-1} \sum_{t=1}^n g(X_t) \to c_0 = \mu(g)$. Probability of over-estimate, $c > c_0$: $n^{-1} \log \mathsf P\big\{ n^{-1} \sum_{t=1}^n g(X_t) \ge c \big\} \to -\Lambda^*(c)$.

18 Large Deviations. $X = \{X_1, X_2, \dots\}$ a nice Markov chain on $\mathsf X$, marginal distribution $\mu$. Simulate a function $g : \mathsf X \to \mathbb R$: $\hat c_n = n^{-1} \sum_{t=1}^n g(X_t) \to c_0 = \mu(g)$. Probability of over-estimate, $c > c_0 = \mu(g)$: $n^{-1} \log \mathsf P\big\{ n^{-1} \sum_{t=1}^n g(X_t) \ge c \big\} \to -\Lambda^*(c)$. Rate function & log-moment generating function: $\Lambda^*(c) = \sup_{\theta > 0}\,[\theta c - \Lambda(\theta)]$, where $\Lambda(\theta) = \lim_{n \to \infty} n^{-1} \log \mathsf E\big[\exp\big(\theta \sum_{t=1}^n g(X_t)\big)\big]$.
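A sketch of these definitions in the simplest i.i.d. case, with an illustrative Bernoulli($p$) chain (my choice, not the talk's): compute $\Lambda^*(c)$ by maximizing $\theta c - \Lambda(\theta)$, note it equals the binary divergence $D(c \| p)$, and compare with a Monte Carlo estimate of the decay rate.

```python
import numpy as np
from scipy.optimize import minimize_scalar

p, c = 0.3, 0.4                     # true mean c0 = 0.3, over-estimate threshold c

Lambda = lambda th: np.log(1 - p + p * np.exp(th))   # log-MGF of Bernoulli(p)

# Lambda*(c) = sup_{theta > 0} [theta c - Lambda(theta)]
res = minimize_scalar(lambda th: Lambda(th) - th * c,
                      bounds=(1e-8, 30), method="bounded")
rate = -res.fun
kl = c * np.log(c / p) + (1 - c) * np.log((1 - c) / (1 - p))
print(rate, kl)                     # for i.i.d. Bernoulli these agree: D(c || p)

# Monte Carlo: -n^{-1} log P{c_hat_n >= c} approaches Lambda*(c), slowly,
# because of the sub-exponential prefactor
rng = np.random.default_rng(2)
for n in (100, 200, 400):
    frac = (rng.binomial(n, p, size=200_000) >= c * n).mean()
    print(n, -np.log(frac) / n)
```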

19 Hoeffding's Bound. $X = \{X_1, X_2, \dots\}$ is i.i.d. on $\mathsf X = [0, 1]$, $g(x) = x$. Marginal distribution $\mu$ unknown. $\hat c_n = n^{-1} \sum_{t=1}^n X_t \to c_0 = \mu(g)$. Worst-case rate function & log-moment generating function: $\inf\{\Lambda^*_\mu(c) : \mu(g) = c_0\}$ and $\sup\{\Lambda_\mu(\theta) : \mu(g) = c_0\}$.

20 Hoeffding's Bound. $X = \{X_1, X_2, \dots\}$ is i.i.d. on $\mathsf X = [0, 1]$, $g(x) = x$. Marginal distribution $\mu$ unknown. $\hat c_n = n^{-1} \sum_{t=1}^n X_t \to c_0 = \mu(g)$. Worst-case rate function & log-moment generating function: $\inf\{\Lambda^*_\mu(c) : \mu(g) = c_0\}$ and $\sup\{\Lambda_\mu(\theta) : \mu(g) = c_0\}$. Solution: the worst-case $\mu$ is binary on $\{0, 1\}$.
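A small numerical check of why the binary law is extremal, assuming an illustrative mean $c_0 = 0.3$ and tilt $\theta = 2$ (my values): since $e^{\theta x}$ is convex, $e^{\theta x} \le (1 - x) + x e^{\theta}$ on $[0, 1]$, so among all laws on $[0,1]$ with mean $c_0$ the binary law on $\{0, 1\}$ maximizes the MGF, while the point mass at $c_0$ minimizes it (Jensen).

```python
import numpy as np
from scipy import integrate, stats

theta, c0 = 2.0, 0.3                # tilt parameter and the common mean (assumed)

# Extremal law: binary on {0, 1} with mean c0 maximizes E[e^{theta X}]
mgf_binary = (1 - c0) + c0 * np.exp(theta)

# Another law on [0, 1] with the same mean: Beta(3, 7), which has mean 0.3
mgf_beta, _ = integrate.quad(lambda x: np.exp(theta * x) * stats.beta.pdf(x, 3, 7),
                             0, 1)

mgf_point = np.exp(theta * c0)      # point mass at c0: the Jensen lower bound
print(mgf_point, mgf_beta, mgf_binary)   # printed in increasing order
```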

21 Bennett's Lemma. $X = \{X_1, X_2, \dots\}$ is i.i.d. on $\mathsf X = [0, 1]$; mean and variance given; marginal distribution $\mu$ unknown; $g(x) = x$, $\hat c_n = n^{-1} \sum_{t=1}^n X_t$. Worst-case rate function & log-moment generating function: $\inf\{\Lambda^*_\mu(c) : \mu(g_i) = c_i,\ i = 1, 2\}$ and $\sup\{\Lambda_\mu(\theta) : \mu(g_i) = c_i,\ i = 1, 2\}$.

22 Bennett's Lemma. $X = \{X_1, X_2, \dots\}$ is i.i.d. on $\mathsf X = [0, 1]$; mean and variance given; marginal distribution $\mu$ unknown; $g(x) = x$, $\hat c_n = n^{-1} \sum_{t=1}^n X_t$. Worst-case rate function & log-moment generating function: $\inf\{\Lambda^*_\mu(c) : \mu(g_i) = c_i,\ i = 1, 2\}$ and $\sup\{\Lambda_\mu(\theta) : \mu(g_i) = c_i,\ i = 1, 2\}$. Solution: the worst-case $\mu$ is binary on $\{x_0, 1\}$.

23 Generalized Bennett's Lemma. $X = \{X_1, X_2, \dots\}$ is i.i.d. on $\mathsf X = [0, 1]$; $n$ moments given, via functions $g_i$; marginal distribution $\mu$ unknown; $\hat c_n = n^{-1} \sum_{t=1}^n g(X_t)$. Worst-case moment generating function: $\lambda(\theta) = \mathsf E[e^{\theta g(X_t)}] = \langle \mu, e^{\theta g} \rangle$.

24 Generalized Bennett's Lemma. $X = \{X_1, X_2, \dots\}$ is i.i.d. on $\mathsf X = [0, 1]$; $n$ moments given, via functions $g_i$; marginal distribution $\mu$ unknown; $\hat c_n = n^{-1} \sum_{t=1}^n g(X_t)$. Worst-case moment generating function: $\lambda(\theta) = \mathsf E[e^{\theta g(X_t)}] = \langle \mu, e^{\theta g} \rangle$. Linear program over $\mathcal M$: $\max \langle \mu, e^{\theta g} \rangle$ subject to $\langle \mu, g_i \rangle = c_i$, $i = 1, \dots, n$. The optimizer $\mu$ is discrete.
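A sketch of this linear program on a discretized state space, with illustrative moment values (total mass 1, mean 0.3, second moment 0.15, all my choices): the LP solver returns a basic solution supported on only a few grid points, matching the slide's claim that the optimizer is discrete.

```python
import numpy as np
from scipy.optimize import linprog

theta = 2.0
x = np.linspace(0.0, 1.0, 201)       # discretized state space X = [0, 1]

# Moment constraints (illustrative): total mass 1, mean 0.3, second moment 0.15
A_eq = np.vstack([np.ones_like(x), x, x ** 2])
b_eq = np.array([1.0, 0.3, 0.15])

# Linear program over M: maximize <mu, e^{theta g}> with g(x) = x
# (linprog minimizes, so negate the objective)
res = linprog(-np.exp(theta * x), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
support = x[res.x > 1e-9]
print(support, res.x[res.x > 1e-9])  # optimizer is discrete: a handful of atoms
```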

25 Sanov's Theorem. State space $\mathsf X$; probability measures $\mathcal M$. Notation: for a measure $\mu$ and a function $g$ on $\mathsf X$, $\langle \mu, g \rangle = \mu(g) := \int g(y)\, \mu(dy)$. Empirical measures: $L_n := \frac{1}{n} \sum_{t=0}^{n-1} \delta_{X_t}$, so $L_n \in \mathcal M$ for $n \ge 1$, and $\langle L_n, g \rangle = \frac{1}{n} \sum_{t=0}^{n-1} g(X_t)$.

26 Sanov's Theorem. State space $\mathsf X$; probability measures $\mathcal M$; $\langle \mu, g \rangle = \mu(g) := \int g(y)\, \mu(dy)$. Empirical measures: $L_n := \frac{1}{n} \sum_{t=0}^{n-1} \delta_{X_t} \in \mathcal M$ for $n \ge 1$. Relative entropy: $D(\nu \,\|\, \mu) = \big\langle \nu, \log\frac{d\nu}{d\mu} \big\rangle = \int \log\big(\frac{d\nu}{d\mu}\big)\, \nu(dx)$.

27 Sanov's Theorem. Law of large numbers: $L_n := \frac{1}{n} \sum_{t=0}^{n-1} \delta_{X_t} \to \mu$.

28 Sanov's Theorem. Convex set of probability measures $K \subset \mathcal M$ with $\mu \notin K$: what is $n^{-1} \log \mathsf P\{L_n \in K\}$? [Figure: $L_n \to \mu$, with $K$ a convex set away from $\mu$.]

29 Sanov's Theorem. Convex set of probability measures $K \subset \mathcal M$ with $\mu \notin K$: $n^{-1} \log \mathsf P\{L_n \in K\} \to -\eta = -\inf_{\nu \in K} J(\nu)$. [Figure: divergence ball $Q_\eta = \{\nu : J(\nu) < \eta\}$ around $\mu$, touching $K$.]

30 Sanov's Theorem. i.i.d. source: $J(\nu) = D(\nu \,\|\, \mu)$. Markov: $J(\nu) = \inf\big\{ D(\nu \odot \check P \,\|\, \nu \odot P) : \check P$ a transition kernel with $\nu$ invariant$\big\}$. [Figure: divergence ball $Q_\eta = \{\nu : J(\nu) < \eta\}$ around $\mu$, touching $K$.]

31 Sanov's Theorem. Example: $K = \{\nu : \langle \nu, g \rangle \ge c\}$. Then $n^{-1} \log \mathsf P\{L_n \in K\} \to -\eta = -\inf_{\langle \nu, g \rangle \ge c} J(\nu) = -\Lambda^*(c)$.

32 Sanov's Theorem. Example: $K = \{\nu : \langle \nu, g \rangle \ge c\}$. Then $n^{-1} \log \mathsf P\{L_n \in K\} \to -\eta = -\inf_{\langle \nu, g \rangle \ge c} J(\nu) = -\Lambda^*(c)$. [Figure: the hyperplane $\langle \nu, g \rangle = c$ tangent to the divergence ball $Q_\eta = \{\nu : J(\nu) < \eta\}$ around $\mu$.]
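A sketch of this example for an i.i.d. source on a three-letter alphabet (alphabet, $\mu$, $g$, and $c$ are illustrative values, not from the talk): the minimizer of $D(\nu \| \mu)$ over $K = \{\nu : \langle \nu, g \rangle \ge c\}$ is an exponential tilt of $\mu$, and the minimal divergence equals $\Lambda^*(c) = \theta c - \Lambda(\theta)$ at the matching tilt $\theta$.

```python
import numpy as np
from scipy.optimize import brentq

mu = np.array([0.5, 0.3, 0.2])      # source law on a three-letter alphabet
g = np.array([0.0, 1.0, 2.0])
c = 1.0                              # threshold above the mean <mu, g> = 0.7

# Minimizer of D(nu || mu) over K = {nu : <nu, g> >= c} is the exponential tilt
# nu_theta(x) ∝ mu(x) e^{theta g(x)}, with theta chosen so that <nu_theta, g> = c
def tilted_mean_gap(theta):
    w = mu * np.exp(theta * g)
    return (w @ g) / w.sum() - c

theta = brentq(tilted_mean_gap, 0.0, 50.0)
nu = mu * np.exp(theta * g)
nu /= nu.sum()

D = (nu * np.log(nu / mu)).sum()     # Sanov exponent eta = inf_K D(. || mu)
print(D, theta * c - np.log((mu * np.exp(theta * g)).sum()))   # both = Lambda*(c)
```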

33 Outline Introduction Relative entropy & Large deviations Hypothesis testing Channel capacity Conclusions

34 Neyman-Pearson Hypothesis Testing. Observations $X = \{X_t : t = 1, 2, \dots, N\}$, i.i.d. with marginal $\pi^j$ under $\mathrm H_j$, $j = 0, 1$. Hypothesis test: $\phi(X) = 1$ if $\mathrm H_1$ is declared true, based on $N$ observations. Error probabilities: $P_{e,0} = \mathsf P_0\{\phi(X) = 1\}$, $P_{e,1} = \mathsf P_1\{\phi(X) = 0\}$. N-P criterion: $\inf_\phi P_{e,1}$ subject to $P_{e,0} \le e^{-N\eta}$.

35 Neyman-Pearson Hypothesis Testing. Observations $X = \{X_t : t = 1, 2, \dots, N\}$, i.i.d. with marginal $\pi^j$ under $\mathrm H_j$, $j = 0, 1$. Error probabilities: $P_{e,0} = \mathsf P_0\{\phi(X) = 1\}$, $P_{e,1} = \mathsf P_1\{\phi(X) = 0\}$. N-P criterion: $\inf_\phi P_{e,1}$ subject to $P_{e,0} \le e^{-N\eta}$. Solution: $\phi(X) = 0$ if $L_n \in Q_\eta(\pi^0)$. [Figure: divergence ball $Q_\eta(\pi^0)$ around $\pi^0$, with $\pi^1$ outside.]

36 Neyman-Pearson Hypothesis Testing. Solution: $\phi(X) = 0$ if $L_n \in Q_\eta(\pi^0)$. Then $\lim_N N^{-1} \log \mathsf P_0\{\phi_N = 1\} = -\eta$ and $\lim_N N^{-1} \log \mathsf P_1\{\phi_N = 0\} = -\beta^*$. [Figure: divergence ball $Q_\eta(\pi^0)$ around $\pi^0$, with $\pi^1$ outside.]

37 Neyman-Pearson Hypothesis Testing. Solution: $\phi(X) = 0$ if $L_n \in Q_\eta(\pi^0)$, with $\lim_N N^{-1} \log \mathsf P_0\{\phi_N = 1\} = -\eta$ and $\lim_N N^{-1} \log \mathsf P_1\{\phi_N = 0\} = -\beta^*$, where $\beta^* = \inf\{J_1(\nu) : J_0(\nu) \le \eta\} = \inf\{\beta > 0 : Q_\beta(\pi^1) \cap Q_\eta(\pi^0) \ne \emptyset\}$. [Figure: divergence balls $Q_{\beta^*}(\pi^1)$ and $Q_\eta(\pi^0)$ meeting along the hyperplane $\langle \nu, \ell \rangle = c$.]
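A sketch computing this exponent tradeoff for two illustrative finite-alphabet marginals (the distributions and $\eta$ are my choices): by Hoeffding's classical result, the optimizing empirical measure lies on the tilted family $\nu_s \propto (\pi^0)^{1-s} (\pi^1)^s$, so it suffices to find the $s$ at which the false-alarm exponent equals $\eta$ and read off the miss exponent $\beta^*$.

```python
import numpy as np
from scipy.optimize import brentq

pi0 = np.array([0.5, 0.3, 0.2])     # H0 marginal (assumed for illustration)
pi1 = np.array([0.2, 0.3, 0.5])     # H1 marginal (assumed for illustration)
eta = 0.05                           # false-alarm exponent constraint

D = lambda a, b: (a * np.log(a / b)).sum()   # relative entropy D(a || b)

# Optimizers lie on the tilted family nu_s ∝ pi0^{1-s} pi1^s, s in [0, 1]
def nu(s):
    w = pi0 ** (1 - s) * pi1 ** s
    return w / w.sum()

# Choose s so the false-alarm exponent D(nu_s || pi0) equals eta ...
s = brentq(lambda s: D(nu(s), pi0) - eta, 1e-9, 1 - 1e-9)
beta = D(nu(s), pi1)                 # ... and the miss exponent beta* follows
print(s, beta)
```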

38 Robust Neyman-Pearson Hypothesis Testing. Uncertainty classes defined by moment constraints: $\pi^0 \in \mathcal P_0$, $\pi^1 \in \mathcal P_1$. [Figure: the classes $\mathcal P_0$ and $\mathcal P_1$.]

39 Robust Neyman-Pearson Hypothesis Testing. Uncertainty classes defined by moment constraints: $\pi^0 \in \mathcal P_0$, $\pi^1 \in \mathcal P_1$. [Figure: the classes $\mathcal P_0$ and $\mathcal P_1$, with the divergence ball $Q_\eta(\mathcal P_0)$ around $\mathcal P_0$.]

40 Robust Neyman-Pearson Hypothesis Testing. Uncertainty classes defined by moment constraints. There exist $\pi^{0*} \in \mathcal P_0$, $\pi^{1*} \in \mathcal P_1$, and $\mu^*$ solving $\beta^* = \inf_{\pi^1 \in \mathcal P_1}\ \inf_{\mu \in Q_\eta(\mathcal P_0)} D(\mu \,\|\, \pi^1)$. [Figure: the classes $\mathcal P_0$ and $\mathcal P_1$, the ball $Q_\eta(\mathcal P_0)$, and the optimizers $\pi^{1*}$, $\pi^{0*}$, $\mu^*$.]

41 Robust Neyman-Pearson Hypothesis Testing. Uncertainty classes defined by moment constraints. There exist $\pi^{0*} \in \mathcal P_0$, $\pi^{1*} \in \mathcal P_1$, and $\mu^*$ solving $\beta^* = \inf_{\pi^1 \in \mathcal P_1}\ \inf_{\mu \in Q_\eta(\mathcal P_0)} D(\mu \,\|\, \pi^1)$. [Figure: $Q_{\beta^*}(\mathcal P_1)$ and $Q_\eta(\mathcal P_0)$ meeting at $\mu^*$, on the hyperplane $\langle \mu, \log \ell \rangle = \langle \mu^*, \log \ell^* \rangle$.] The optimizers are again discrete.

42 Outline Introduction Relative entropy & Large deviations Hypothesis testing Channel capacity Conclusions

43 Channel Coding and Sanov's Theorem. Channel kernel $P(dy \mid x) = \mathsf P\{Y_t \in dy \mid X_t = x\}$. $N$-dimensional code words $X^i$, $i = 1, 2, \dots, e^{NR}$; $N$-dimensional output $Y$ received. $X$ is i.i.d. with marginal distribution $\mu$; $Y$ is i.i.d. with marginal distribution $\pi$, $\pi(\,\cdot\,) = \int P(\,\cdot \mid x)\, \mu(dx)$.

44 Channel Coding and Sanov's Theorem. Channel kernel $P(dy \mid x) = \mathsf P\{Y_t \in dy \mid X_t = x\}$; code words $X^i$, $i = 1, 2, \dots, e^{NR}$; output $Y$ received. If $i$ is the true codeword, then $(X^i, Y)$ has marginal distribution $\mu \odot P(dx, dy) = \mu(dx) P(dy \mid x)$. Otherwise, independence: $\mu \otimes \pi(dx, dy) = \mu(dx)\, \pi(dy)$.

45 Channel Coding and Sanov's Theorem. Two hypotheses based on the observations: $\mathrm H_0$: $\mu \otimes \pi(dx, dy) = \mu(dx)\, \pi(dy)$; $\mathrm H_1$: $\mu \odot P(dx, dy) = \mu(dx) P(dy \mid x)$. [Figure: $\mu \odot P$ and $\mu \otimes \pi$ in $\mathcal M$.]

46 Channel Coding and Sanov's Theorem. Solution: reject codeword $i$ ($\phi = 0$) if $L_n \in Q_\eta(\pi_0)$, where $\pi_0 = \mu \otimes \pi$ and $L_n$ is the empirical distribution of the joint observations $(X^i, Y)$. [Figure: divergence ball $Q_\eta(\pi_0)$ around $\mu \otimes \pi$, with $\mu \odot P$ outside.]

47 Channel Coding and Sanov's Theorem. Solution: $\phi = 0$ if $L_n \in Q_\eta(\pi_0)$, so $\lim_N N^{-1} \log \mathsf P_0\{\phi_N = 1\} = -\eta$. The error probability $e^{-N\eta}$ must be multiplied by $e^{NR}$, the number of codewords. For vanishing error, $e^{NR} e^{-N\eta} < 1$; that is, $R < \eta$. [Figure: divergence ball $Q_\eta(\pi_0)$ around $\mu \otimes \pi$, with $\mu \odot P$ outside.]

48 Channel Coding and Sanov's Theorem. Solution: $\phi = 0$ if $L_n \in Q_\eta(\pi_0)$, with $\lim_N N^{-1} \log \mathsf P_0\{\phi_N = 1\} = -\eta$. The error probability $e^{-N\eta}$ must be multiplied by $e^{NR}$, so $R < \eta_{\max} = D(\mu \odot P \,\|\, \mu \otimes \pi)$ = mutual information. [Figure: the largest ball $Q_{\eta_{\max}}(\pi_0)$, touching $\mu \odot P$.]
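A minimal sketch of this identity for a discrete channel (input law and kernel are the illustrative values used earlier in these notes): form the joint law $\mu \odot P$ and the product law $\mu \otimes \pi$, and evaluate $D(\mu \odot P \,\|\, \mu \otimes \pi)$, which is the mutual information.

```python
import numpy as np

mu = np.array([0.5, 0.5])            # input distribution (assumed)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])           # channel kernel P(y|x) (assumed)

joint = mu[:, None] * P              # mu ⊙ P:  mu(dx) P(dy|x)
pi = mu @ P                          # output marginal
prod = np.outer(mu, pi)              # mu ⊗ pi: mu(dx) pi(dy)

I = (joint * np.log(joint / prod)).sum()   # D(mu⊙P || mu⊗pi) = mutual information
print(I, "nats/symbol")
```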

49 Error Exponent. $E(R, \mu) = -\lim_N N^{-1} \log \mathsf P\{\text{error}\}$. The formula is expressed as the solution to a robust hypothesis testing problem. For a given input distribution $\mu$, denote the product measures on $\mathsf X \times \mathsf Y$ with first marginal $\mu$: $\mathcal P_0 = \{\mu \otimes \nu : \nu$ a probability measure on $\mathsf Y\}$.

50 Error Exponent. $E(R, \mu) = -\lim_N N^{-1} \log \mathsf P\{\text{error}\}$, expressed as the solution to a robust hypothesis testing problem with $\mathcal P_0 = \{\mu \otimes \nu : \nu$ a probability measure on $\mathsf Y\}$. Hypothesis $\mathrm H_0$: code word $i$ not sent; $(X^i_j)$ and $(Y_j)$ independent. Test: is the empirical distribution within an entropy ball around $\mathcal P_0$?

51 Error Exponent. $\mathrm H_0$: $\{(X^i_j, Y_j) : j = 1, \dots, N\}$ has marginal distribution $\pi_0 \in \mathcal P_0$. $\mathrm H_1$: $\{(X^i_j, Y_j) : j = 1, \dots, N\}$ has marginal distribution $\pi_1 := \mu \odot P$. Entropy neighborhood of $\mathcal P_0$: $Q_R^+(\mathcal P_0) = \{\gamma : \min_\nu D(\gamma \,\|\, \mu \otimes \nu) \le R\}$. Entropy neighborhood of $\pi_1$: $Q_\beta^+(\pi_1) = \{\gamma : D(\gamma \,\|\, \mu \odot P) \le \beta\}$.

52 Error Exponent. $E(R, \mu) = -\lim_N N^{-1} \log \mathsf P\{\text{error}\}$: $\beta^*$ is the infimum over $\beta$ such that these entropy neighborhoods meet. [Figure: the neighborhoods $Q_\beta^+(\mu \odot P)$ and $Q_R^+(\mathcal P_0)$ as they first meet.]

53 Error Exponent. $E(R, \mu) = -\lim_N N^{-1} \log \mathsf P\{\text{error}\} = \inf\{\beta : Q_\beta^+(\mu \odot P) \cap Q_R^+(\mathcal P_0) \ne \emptyset\}$. $E(R)$ = random coding exponent = supremum over $\mu$. [Figure: the neighborhoods $Q_\beta^+(\mu \odot P)$ and $Q_R^+(\mathcal P_0)$ as they first meet.]
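For a discrete memoryless channel, this divergence-ball characterization agrees with the classical random coding exponent, which is easiest to evaluate in Gallager's equivalent form $E_r(R, \mu) = \max_{0 \le \rho \le 1}\,[E_0(\rho, \mu) - \rho R]$; note this standard form is a swap-in for computation, not the talk's notation. A sketch for a fixed $\mu$ on an illustrative two-input channel:

```python
import numpy as np

mu = np.array([0.5, 0.5])            # fixed input distribution (assumed)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])           # channel kernel P(y|x) (assumed)

def E0(rho):
    """Gallager function E_0(rho, mu) for a discrete memoryless channel."""
    inner = (mu[:, None] * P ** (1.0 / (1.0 + rho))).sum(axis=0)
    return -np.log((inner ** (1.0 + rho)).sum())

rhos = np.linspace(0.0, 1.0, 1001)
for R in (0.05, 0.10, 0.15):         # rates in nats/symbol, below capacity
    Er = max(E0(rho) - rho * R for rho in rhos)
    print(R, Er)                     # E_r(R, mu) > 0 for R below mutual information
```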

54 Outline Introduction Relative entropy & Large deviations Hypothesis testing Channel capacity Conclusions

55 Summary Large Deviations is the grand unifying principle of Information Theory

56 Summary. Standard coding is based on AWGN models, which may be unrealistic in wireless models with fading. Discrete distributions arise in coding, and in other applications involving optimization over $\mathcal M$. Extremal distributions arise in worst-case models.

57 What's Next? Part II: channel models; convex optimization and channel coding; cutting plane algorithm. Part III: worst-case models; extremal distributions.
