The Maximum Entropy Principle and Applications to MIMO Channel Modeling


1 The Maximum Entropy Principle and Applications to MIMO Channel Modeling
Eurecom CM Talk, 16/02/2006
Maxime Guillaud (joint work with Mérouane Debbah)

2 Overview
Background theory:
- Plausible reasoning
- The Maximum Entropy method and the link to modelling
Applications to wireless channel modelling:
- General method: focus on one parameter
- Application to channel energy
- Application to spatial correlation
Conclusion

3 Deduction, Plausible Reasoning and Probabilities
Deduction from the fact A ⇒ B:
- A is true, therefore B is true.
- B is false, therefore A is false.
Plausible reasoning from the fact A ⇒ B:
- B is true. What does this say about A?
- A is false. What does this say about B?
We continuously do plausible reasoning:
- B = {At night, a masked gentleman comes out of the broken window of a store, carrying a bag of jewelry}
- A = {This gentleman is a robber}
- B is true, therefore A becomes more likely.
Plausible reasoning can be quantified using probability theory.
This goes beyond the statistical interpretation of probability: it takes into account our degree of knowledge.

4 Plausible Reasoning and Modelling
We already have all of Bayesian probability theory for this. Yes, but...
Prior distributions in the Bayesian framework are "speculative" parameters: if correct, they increase the accuracy of the conclusions, but if they are wrong, it's the opposite.
Jaynes [1] proposes to solve this by:
- taking into account all constraints or information which are known for sure
- assuming the maximum uncertainty (or entropy) for everything else
This makes it a nice modelling tool in general!
Jaynes's Maximum Entropy (MaxEnt) and Bayes methods are not contradictory.

5 The Maximum Entropy Method
We need a practical way to mathematically enforce the maximization of uncertainty. Jaynes defined uncertainty as a continuous function of the distribution, satisfying:
- degrees of certainty are represented by real numbers
- qualitative correspondence with common sense: uncertainty decreases when some extra knowledge is acquired
- consistency property: if a conclusion can be reasoned out in more than one way, every possible way must yield the same result
The unique solution is the Shannon entropy: H(P) = -∫_D P(v) log(P(v)) dv.
The MaxEnt method: P_MaxEnt = argmax_{P, constraints} H(P).

6 Maximum Entropy: Application Examples (1)
Discrete random variable v, taking a finite number of values {v_1, ..., v_N} with probabilities P(v = v_i) = p_i.
Entropy: H(p_1, ..., p_N) = -Σ_{i=1}^N p_i log(p_i)
Maximize H(p_1, ..., p_N) under the constraint Σ_{i=1}^N p_i = 1.
Lagrange method: maximize
L(p_1, ..., p_N) = H(p_1, ..., p_N) + β (Σ_{i=1}^N p_i - 1)
Setting ∂L/∂p_i = 0 yields p_i = e^{β-1} for i = 1 ... N; the normalization Σ_{i=1}^N p_i = 1 then imposes the uniform law p_i = 1/N.
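As a quick numerical sanity check (my addition, not part of the talk), maximizing the discrete entropy under the normalization constraint should recover the uniform law. A minimal Python sketch, assuming numpy and scipy are available:

    import numpy as np
    from scipy.optimize import minimize

    N = 5

    def neg_entropy(p):
        # H(p) = -sum_i p_i log p_i; we minimize its negative
        p = np.clip(p, 1e-12, 1.0)
        return np.sum(p * np.log(p))

    cons = {"type": "eq", "fun": lambda p: np.sum(p) - 1.0}
    res = minimize(neg_entropy, np.random.dirichlet(np.ones(N)),
                   bounds=[(0.0, 1.0)] * N, constraints=cons)
    print(res.x)  # ~[0.2, 0.2, 0.2, 0.2, 0.2], i.e. p_i = 1/N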

7 Maximum Entropy: Application Examples (2)
Generalization to continuous variates is straightforward: for v ∈ D with PDF P_v(v),
Entropy: H(P_v) = -∫_D P_v(v) log(P_v(v)) dv
Maximize H(P_v) under the constraint ∫_D P_v(v) dv = 1.
Lagrange method: maximize
L(P_v) = -∫_D P_v(v) log(P_v(v)) dv + β (∫_D P_v(v) dv - 1)
Due to the integral structure of the functional L(P_v), the Euler-Lagrange equation applies [2], and maximizing L(P_v) is easy:
δL(P_v)/δP_v = -log(P_v(v)) - 1 + β = 0
For D = [a, b], normalizing yields again the uniform law P_v(v) = 1/(b - a).
If D is infinite (e.g. [0, +∞)) and there are no other constraints, there is no MaxEnt distribution.

8 Maximum Entropy: Application Examples (3)
Any constraint based on an expectation (mean, covariance, ...) is easily handled: in general, add R constraints ∫_D g_r(v) P_v(v) dv = a_r, r = 1 ... R, by maximizing
L(P_v) = -∫_D P_v(v) log(P_v(v)) dv + β [∫_D P_v(v) dv - 1] + Σ_{r=1}^R γ_r [∫_D g_r(v) P_v(v) dv - a_r]
- Mean constraint: g_r(v) = v, a_r = m
- Variance constraint: g_r(v) = v², a_r = m² + σ²
For D = R, with mean m and variance σ² prescribed, MaxEnt yields the normal distribution
P_v(v) = 1/(σ√(2π)) exp(-(1/2)((v - m)/σ)²)
For D = [0, +∞) with mean m prescribed, MaxEnt yields the exponential distribution P_v(v) = (1/m) e^{-v/m}.
See [3] for more applications.
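As an illustration (mine, not the talk's): among real laws with a fixed variance σ², the Gaussian should have the largest differential entropy. Comparing the standard closed-form entropies (in nats) of a few candidates with equal variance:

    import numpy as np

    sigma2 = 2.0  # common variance for all three laws
    h_gauss   = 0.5 * np.log(2 * np.pi * np.e * sigma2)  # N(m, sigma^2)
    h_uniform = 0.5 * np.log(12 * sigma2)                # uniform of width sqrt(12*sigma2)
    h_laplace = 1.0 + 0.5 * np.log(2 * sigma2)           # Laplace with variance sigma2
    print(h_gauss, h_uniform, h_laplace)  # the Gaussian entropy is the largest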

9 Overview
Background theory:
- Plausible reasoning
- The Maximum Entropy method and the link to modelling
Applications to wireless channel modelling:
- General method: focus on one parameter
- Application to channel energy
- Application to spatial correlation
Conclusion

10 MaxEnt Wireless Channel Modelling
We restrict the problem to the classical framework of MIMO frequency-flat fading channels
y = Hx + n
with H an n_r × n_t matrix with complex scalar coefficients. We seek an analytical formula for the PDF P(H) (applications to channel code optimization).
What do we know (or not know...) for sure?
- the SNR is not always the same
- the SNR is bounded in many situations
- the coefficients of H cannot be assumed independent
- incorporate spatial correlation only (realizations are assumed i.i.d. in time)
What is the MaxEnt distribution P(H) corresponding to this?

11 Previous results: Energy constraint only
Debbah et al. [4] derived the MaxEnt distribution under the average-energy constraint NE_0 only (N = n_t n_r): maximizing
L(P) = -∫_{C^N} log(P(H)) P(H) dH   (entropy)
     + β [1 - ∫_{C^N} P(H) dH]   (PDF normalization)
     + γ [NE_0 - ∫_{C^N} ||H||_F² P(H) dH]   (average energy constraint)
with Lagrange multipliers β, γ yields a Gaussian i.i.d. model
P_{H|E_0}(H, E_0) = 1/(πE_0)^N exp(-Σ_{i=1}^N |h_i|²/E_0)
Gaussianity and independence are results of the ignorance of further constraints, not assumptions.
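A one-function sketch (my illustration) of drawing channels from this energy-only MaxEnt model, i.e. i.i.d. CN(0, E_0) entries:

    import numpy as np

    def sample_H_energy_only(nr, nt, E0, rng=np.random.default_rng()):
        # i.i.d. circularly symmetric complex Gaussian entries with E|h_i|^2 = E0
        s = np.sqrt(E0 / 2)
        return rng.normal(scale=s, size=(nr, nt)) + 1j * rng.normal(scale=s, size=(nr, nt))

    H = sample_H_energy_only(2, 2, E0=1.0)
    print(np.mean(np.abs(H) ** 2))  # ~E0 on average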

12 Focusing on one parameter: the IMM method
Goal: generate a model that maximally explores the domain of a variable V. We propose the following method:
- derive P_V(V) using a MaxEnt argument and the known constraints on V
- derive P_{H|V}(H, V) using MaxEnt and the known constraints on H
- marginalize P_{H,V} = P_{H|V} P_V over V to obtain P(H):
P(H) = ∫ P_{H|V}(H, V) P_V(V) dV
Individual MaxEnt and Marginalize (IMM) method.
In general, this yields distributions with less entropy, but that maximally explore the domain of V.
Applications: channel energy (SNR) and spatial correlation.
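In sampling terms, the IMM marginalization is just hierarchical sampling: draw V from P_V, then H from P_{H|V}. A generic sketch (the helper names sample_V and sample_H_given_V are placeholders for the model-specific laws, not names from the talk):

    import numpy as np

    def imm_sample(sample_V, sample_H_given_V, n_draws, rng=np.random.default_rng()):
        # Empirical draws from the marginal P(H) = int P_{H|V}(H, V) P_V(V) dV
        return [sample_H_given_V(sample_V(rng), rng) for _ in range(n_draws)]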

13 Application to Channel Energy
Derive the PDF of the channel energy E ∈ R according to MaxEnt, under the constraints:
- 0 ≤ E ≤ E_max
- E_0 = ∫_0^{E_max} E P_E(E) dE is known
MaxEnt yields the truncated exponential law
P_E(E) = β/(exp(βE_max) - 1) · exp(βE) for 0 ≤ E ≤ E_max, and 0 elsewhere,
with β = RootOf{ E_max exp(βE_max) - (1/β + E_0)(exp(βE_max) - 1) = 0 } < 0.
Marginalize over E: P(H) = ∫_{R^+} P_{H,E}(H, E) dE = ∫_{R^+} P_{H|E}(H) P_E(E) dE.
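A numerical sketch (mine) of this step: solve the root condition for β with a bracketing solver, then draw E by inverting the truncated-exponential CDF. The values E_0 = 1, E_max = 4 are illustrative, not from the talk:

    import numpy as np
    from scipy.optimize import brentq

    E0, Emax = 1.0, 4.0  # illustrative prescribed mean and upper bound

    def root_fn(beta):
        # E_max e^{beta E_max} - (1/beta + E_0)(e^{beta E_max} - 1) = 0
        return Emax * np.exp(beta * Emax) - (1.0 / beta + E0) * np.expm1(beta * Emax)

    beta = brentq(root_fn, -50.0, -1e-9)  # the root is negative (here E0 < Emax/2)

    def sample_E(n, rng=np.random.default_rng()):
        # Inverse CDF of P_E: F(E) = (e^{beta E} - 1) / (e^{beta E_max} - 1)
        u = rng.uniform(size=n)
        return np.log1p(u * np.expm1(beta * Emax)) / beta

    print(beta, sample_E(100_000).mean())  # empirical mean ~E0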

14 Application to Channel Energy (cont'd)
Example for a SISO channel (n_t = n_r = 1): distribution of r = |h|.
- Known energy: P_r(r) = (2r/E_0) exp(-r²/E_0)
- Unknown energy: P_r(r) = ∫_0^{E_max} β/(exp(βE_max) - 1) · (2r/E) exp(βE - r²/E) dE
[Figures: amplitude PDF P_r(r) and mutual-information CDF P(I < I_0) (I_0 in nats), comparing the known-energy case (E_0 = 1) with the unknown-energy case for E_0 = 1 and E_max ∈ {+∞, 4, 1.5}]

15 Application to Spatial Correlation
Q = ∫_{C^N} h h^H P_H(H) dH (spatial covariance) is known to be an important channel characteristic.
Case of a deterministic Q: apply the MaxEnt method by introducing Lagrange multipliers α_{a,b} ((a,b) ∈ [1,...,N]²) and β, and maximizing
L(P_{H|Q}) = -∫_{C^N} log(P_{H|Q}(H, Q)) P_{H|Q}(H, Q) dH + β [1 - ∫_{C^N} P_{H|Q}(H, Q) dH] + Σ_{(a,b)∈[1,...,N]²} α_{a,b} [∫_{C^N} h_a h_b* P_{H|Q}(H) dH - q_{a,b}]
This yields the correlated Gaussian distribution
P_{h|Q}(h, Q) = 1/det(πQ) · exp(-h^H Q^{-1} h)
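Given a known Q, drawing from this correlated Gaussian is a standard coloring step. A sketch of mine, with an arbitrary example Q:

    import numpy as np

    def sample_h_given_Q(Q, rng=np.random.default_rng()):
        # h = L w with Q = L L^H and w ~ CN(0, I) gives h ~ CN(0, Q)
        N = Q.shape[0]
        L = np.linalg.cholesky(Q)
        w = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
        return L @ w

    rng = np.random.default_rng(0)
    A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
    Q = A @ A.conj().T + 1e-9 * np.eye(4)  # an arbitrary positive definite example
    h = sample_h_given_Q(Q, rng)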

16 Application to Spatial Correlation (cont'd)
For unknown Q, derive P_Q(Q) as the MaxEnt distribution over S = {N × N positive semidefinite matrices on C}.
S is mapped [5] into the product space U(N)/T × R_+^N:
- U(N)/T: unitary N × N matrices with real, non-negative first row
- R_+^N is the space of real non-negative non-decreasing N-tuples
- Q = UΛU^H, U ∈ U(N)/T, Λ = diag(λ_1, ..., λ_N)
- K(Λ) = (2π)^{N(N-1)/2} / (Π_{j=1}^{N-1} j!) · Π_{i<j} (λ_i - λ_j)² is the Jacobian of the change of variables
Under the average total-energy constraint ∫_S tr(Q) P_Q(Q) dQ = NE_0, MaxEnt yields
P_{U,Λ}(U, Λ) = P_U P_Λ(Λ) K(Λ),
where P_U(U) is the uniform distribution on U(N)/T and P_Λ(Λ) = C Π_{i=1,...,N} e^{γλ_i}.
This factorization is a consequence of the MaxEnt optimization!
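Not from the talk, but useful for simulation: a uniform (Haar) U can be generated by QR decomposition of an i.i.d. complex Gaussian matrix with a phase correction, and the eigenvalue law C e^{γΣλ_i} Π_{i<j}(λ_i - λ_j)² coincides, up to the scale fixed by γ, with the eigenvalue law of a square complex Wishart matrix. The latter identification is my own and should be verified before relying on it:

    import numpy as np

    def haar_unitary(N, rng=np.random.default_rng()):
        # QR of a complex Gaussian matrix; rescaling the columns by the phases
        # of diag(R) makes the law of Q exactly Haar
        Z = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(2)
        Q, R = np.linalg.qr(Z)
        d = np.diagonal(R)
        return Q * (d / np.abs(d))

    def sample_Lambda(N, E0, rng=np.random.default_rng()):
        # Eigenvalues of a square complex Wishart matrix G G^H have density
        # ~ exp(-sum(lam)) * Vandermonde(lam)^2; rescaling by E0/N enforces
        # E[sum(lam)] = N*E0 (i.e. gamma = -N/E0) -- my identification, to verify
        G = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(2)
        return (E0 / N) * np.linalg.eigvalsh(G @ G.conj().T)  # ascending order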

17 Application to Spatial Correlation: P(H)
Marginalize over Q to obtain P(H):
P_H(H) = ∫_S P_{H|Q}(H, Q) P_Q(Q) dQ = ∫_{U(N)/T × R_+^N} P_{H|U,Λ}(H) P_U P_Λ(Λ) K(Λ) dU dΛ
P_{U,Λ}(U, Λ) = P_U P_Λ(Λ) K(Λ) provides a way to generate realizations of Q:
- P_U is uniform, so P_H(H) is unitarily invariant; U is generated by orthogonalization of i.i.d. Gaussian matrices
- the joint distribution of the eigenvalues Λ is C Π_{i=1,...,N} e^{γλ_i} · (2π)^{N(N-1)/2} / (Π_{j=1}^{N-1} j!) · Π_{i<j} (λ_i - λ_j)²
- P_{H|Q}(H, Q) is a correlated Gaussian r.v.
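Putting the pieces together, an end-to-end IMM draw of H (again my sketch, reusing the earlier coloring step and the same hedged Wishart identification for Q):

    import numpy as np

    def sample_H_imm(nr, nt, E0, rng=np.random.default_rng()):
        N = nr * nt
        # Q = U Lambda U^H with Haar U and MaxEnt eigenvalues; a scaled square
        # Wishart draw realizes both at once (assumption flagged above)
        G = (rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))) / np.sqrt(2)
        Q = (E0 / N) * (G @ G.conj().T)
        # h | Q ~ CN(0, Q), then reshape vec(h) into the nr x nt channel matrix
        L = np.linalg.cholesky(Q)
        w = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
        return (L @ w).reshape(nr, nt)

    H = sample_H_imm(2, 2, E0=1.0)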

18 Conclusions
- General MaxEnt-based method to generate analytical channel models with a parameter of interest [6]
- Application to channel energy
- Application to the spatial covariance matrix: importance of the distribution of the eigenvalues
- Every expectation constraint (correlation, ...) is easily incorporated
- Extension to other types of correlation (time, frequency, ...) is possible

19 References
[1] E. T. Jaynes, Probability Theory: The Logic of Science, Cambridge University Press, 2003.
[2] B. Svetitsky, "Notes on functionals," bqs/functionals.pdf, Mar.
[3] J. N. Kapur and H. K. Kesavan, Entropy Optimization Principles with Applications, Academic Press, 1992.
[4] M. Debbah and R. R. Müller, "MIMO channel modelling and the principle of maximum entropy," IEEE Transactions on Information Theory, vol. 51, no. 5, May 2005.
[5] F. Hiai and D. Petz, The Semicircle Law, Free Random Variables and Entropy, vol. 77 of Mathematical Surveys and Monographs, American Mathematical Society, 2000.
[6] M. Guillaud and M. Debbah, "Maximum entropy MIMO wireless channel models with limited information," in Proc. MATHMOD Conference on Mathematical Modeling, Vienna, Austria, Feb. 2006.
