Network Modulation: Transmission Technique for MIMO Networks
Anatoly Khina (TAU)
Joint work with: Uri Erez, Ayal Hitron, Idan Livni (TAU); Yuval Kochman (HUJI); Gregory W. Wornell (MIT)
ACC Workshop, Feder Family Award Ceremony, February 27th, 2012
Talk Outline
- Novel MIMO multicast scheme
  - Two-user: via a new joint decomposition of two matrices
  - Multi-user: via an algebraic space-time coding structure
- Various applications
- New information-theoretic results
MIMO Multicast (Closed Loop): State of the Art
[Table: SISO / MIMO rows vs. Unicast / Multicast columns, filled in over the next slides]
Single-Input Single-Output (SISO) Unicast
Channel: y = hx + z
- x: input of power 1
- y: output
- h: channel gain
- z: white Gaussian noise, CN(0,1)
Optimal communication rate (capacity): C = log(1 + |h|²)
Good practical codes that approach capacity are known!
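As a quick numerical sanity check of the capacity formula (a minimal sketch; log is taken to base 2 here so the rate is in bits per channel use):

```python
import numpy as np

def siso_capacity(h: complex) -> float:
    """AWGN capacity of y = h*x + z with unit input power and unit-variance noise."""
    return float(np.log2(1 + abs(h) ** 2))

# |h|^2 = 3 gives C = log2(1 + 3), close to 2.0 bits per channel use.
print(siso_capacity(np.sqrt(3)))
```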
Multiple-Input Multiple-Output (MIMO) Unicast
Channel: y = Hx + z
- x: input vector of power 1 (N_t transmit antennas)
- y: output vector
- H: channel matrix; H_kl is the gain from transmit antenna l to receive antenna k
- z: white Gaussian noise, CN(0, I)
Optimal rate (capacity): C = max_{C_x} log det(I + H C_x H†)
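The white-input special case C_x = I can be evaluated directly (a sketch; log base 2, and the 2×2 channel matrix is an arbitrary illustration value):

```python
import numpy as np

def white_input_rate(H: np.ndarray) -> float:
    """log2 det(I + H H^dagger): rate achieved by a white (C_x = I) input."""
    Nr = H.shape[0]
    return float(np.log2(np.linalg.det(np.eye(Nr) + H @ H.conj().T).real))

H = np.array([[1.0, 0.5], [0.0, 2.0]])
print(white_input_rate(H))  # log2(10.25), roughly 3.36 bits
```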
SISO Multicast
The transmitter broadcasts to K receivers with gains h_1, …, h_K:
y_i = h_i x + z_i,  i = 1, …, K
- x: input of power 1
- y_i: output of user i
- h_i: channel gain to user i
- z_i: white Gaussian noise, CN(0,1)
Optimal rate (capacity): C = min_i log(1 + |h_i|²)
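The min over users can be made concrete (a sketch; log base 2, gains chosen arbitrarily for illustration):

```python
import numpy as np

def siso_multicast_capacity(gains) -> float:
    """C = min_i log2(1 + |h_i|^2): the weakest user sets the common rate."""
    return float(min(np.log2(1 + abs(h) ** 2) for h in gains))

# The user with |h| = 1 is the bottleneck: C = log2(2) = 1.0
print(siso_multicast_capacity([1.0, np.sqrt(3), 3.0]))
```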
Gaussian MIMO Multicast
The transmitter broadcasts to K receivers over channel matrices H_1, …, H_K:
y_i = H_i x + z_i,  i = 1, …, K
- x: N_t × 1 input vector of power 1
- y_i: output vector of user i
- H_i: channel matrix to user i
- z_i: white Gaussian noise vector, CN(0, I)
Closed loop: full channel knowledge everywhere
Optimal Achievable Rate (Capacity)
Multicasting capacity:
C = max_{C_X} min_{i=1,…,K} log det(I + H_i C_X H_i†)
- Optimization over covariance matrices C_X satisfying the power constraint
White input / high SNR:
C_WI = min_{i=1,…,K} log det(I + H_i H_i†)
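The white-input rate is again a min over users; a small sketch (log base 2, with arbitrary illustration channels of different dimensions):

```python
import numpy as np

def multicast_white_input_rate(channels) -> float:
    """C_WI = min_i log2 det(I + H_i H_i^dagger) over all users."""
    rates = []
    for H in channels:
        Nr = H.shape[0]
        rates.append(np.log2(np.linalg.det(np.eye(Nr) + H @ H.conj().T).real))
    return float(min(rates))

H1 = np.array([[2.0, 0.0], [0.0, 1.0]])   # a stronger two-antenna user
H2 = np.array([[1.0, 1.0]])               # a weaker single-antenna user
print(multicast_white_input_rate([H1, H2]))  # min(log2 10, log2 3) = log2 3
```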
Summary: Multicast is (Almost) Everywhere...
Network modulation connects multicast to: rateless codes, permuted channels, full-duplex relay, parallel relay, half-duplex relay, two-way relay, Gaussian source multicast (JSCC)
Unicast
Channel: y = Hx + z
White-input capacity: C_WI = log det(I + HH†)
But how is this rate achieved?
Practical (Capacity-Achieving) Unicast Approaches
(Info. theory) White-input capacity: C_WI = log det(I + HH†)
(Comm.) Achieving this rate with a practical scheme:
- Singular value decomposition (SVD)
- QR decomposition (GDFE / V-BLAST)
- Geometric mean decomposition (GMD)
- Dirty-paper coding (DPC)
Singular Value Decomposition (SVD)
H = QΛV†, with Q, V unitary and Λ diagonal
- Parallel AWGN SISO sub-channels → off-the-shelf codes
- Diagonal of Λ = SISO channel gains: R_i = log(1 + λ_i²)
Generalization to multicast?
- The precoding matrix V depends on the channel matrix H. Which V to take?
- Bottleneck problem (Λ_1 ≠ Λ_2)
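The parallel-subchannel view can be checked numerically: the sub-channel rates computed from the singular values sum to the white-input rate (a sketch with an arbitrary illustration matrix; log base 2):

```python
import numpy as np

# SVD turns y = Hx + z into parallel scalar channels: precode x = V x~ and
# rotate y~ = Q^dagger y; sub-channel i has gain lambda_i and rate
# log2(1 + lambda_i^2).
H = np.array([[2.0, 1.0], [0.0, 1.0]])
Q, lam, Vh = np.linalg.svd(H)
rates = np.log2(1 + lam ** 2)
total = float(rates.sum())
white_input = float(np.log2(np.linalg.det(np.eye(2) + H @ H.T)))
print(rates, total, white_input)  # total matches the white-input rate
```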
QR Decomposition (GDFE / V-BLAST)
H = QT, with Q unitary and T upper-triangular
- Successive interference cancellation → parallel AWGN SISO channels
- Diagonal of T = SISO channel gains
Remark: both zero-forcing and MMSE (capacity-achieving) solutions exist.
QR-Based Scheme
- Channel: y = Hx + z = QTx + z
- Transmitter: x from SISO codebooks
- Receiver: ỹ = Q†y = Tx + z̃, where z̃ = Q†z ~ CN(0, I_N)
Example for 2×2:
ỹ_1 = [T]_11 x_1 + [T]_12 x_2 + z̃_1   (the [T]_12 x_2 term is interference)
ỹ_2 = [T]_22 x_2 + z̃_2
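The 2×2 cancellation step can be sketched end to end (zero-forcing variant, noiseless to isolate the interference cancellation; the channel matrix and symbols are arbitrary illustration values):

```python
import numpy as np

# Decode x_2 from y~_2 first, then subtract [T]_{12} x_2 from y~_1
# before decoding x_1 (successive interference cancellation).
H = np.array([[2.0, 1.0], [1.0, 3.0]])
Q, T = np.linalg.qr(H)                 # H = QT, T upper-triangular
x = np.array([1.0, -1.0])              # transmitted symbols
y_t = Q.conj().T @ (H @ x)             # y~ = Q^dagger y = T x
x2_hat = y_t[1] / T[1, 1]
x1_hat = (y_t[0] - T[0, 1] * x2_hat) / T[0, 0]
print(x1_hat, x2_hat)  # recovers 1.0 and -1.0 up to float error
```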
Generalization of the QR-Based Solution to Multicast?
- T depends on H. For two channel matrices H_1 and H_2: diag(T_1) ≠ diag(T_2), i.e., different sub-channel gains!
- Bottleneck problem:
  - Info. theory: Σ_j log(|[T_1]_jj|²) = Σ_j log(|[T_2]_jj|²)
  - Comm.: R_j = log(min{[T_1]_jj, [T_2]_jj}²), which falls short of capacity in general
- Can we have equal diagonals?
Idea
- SVD uses both Q and V
- QR uses only Q
- Can V help in the QR case to achieve equal diagonals? YES!
Illustrative Example
H_1 = [3 0; 0 3],  H_2 = [√99 0]
C_1^WI = 2 log(1 + 3²) = log(1 + (√99)²) = C_2^WI
- Sending the same signal over both antennas loses half of the rate at high SNR!
- What if we add a precoding matrix V? How to choose V?
Joint Equi-Diagonal Triangularization (JET)
Theorem [Kh., Kochman, Erez; Allerton 2010, SP 2012]: Let H_1 and H_2 be N×N non-singular matrices with det(H_1) = det(H_2). Then H_1 and H_2 can be jointly decomposed as
H_1 = Q_1 T_1 V†,  H_2 = Q_2 T_2 V†
where Q_1, Q_2, V are unitary, and T_1, T_2 are upper-triangular with equal diagonals.
For det(H_1) ≠ det(H_2), normalize by the determinants:
H_1 = det(H_1)^(1/N) Q_1 T_1 V†,  H_2 = det(H_2)^(1/N) Q_2 T_2 V†
Illustrative Example (cont.)
The matrix V is applied to the augmented matrix [H_i; I_{N_t}] (MMSE variant):
[H_1; I_2] = Q_1 T_1 V†,  [H_2; I_2] = Q_2 T_2 V†
with a common unitary V (first column ≈ (0.302, 0.954)) and
diag(T_1) = diag(T_2) = [√10, √10]^T
Parallel SISO channels with equal gains for both users!
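The equal-diagonal claim can be verified numerically. This sketch reconstructs a precoder V whose first column is (1/√11, √(10/11)), matching the ≈0.302 and ≈0.954 entries above up to signs, and checks that the QR decomposition of [H_i; I_2]·V has diagonal magnitudes [√10, √10] for both users (NumPy's QR fixes signs arbitrarily, so magnitudes are compared):

```python
import numpy as np

H1 = np.diag([3.0, 3.0])
H2 = np.array([[np.sqrt(99.0), 0.0]])

# Precoder reconstructed for this example (first column ~ [0.302, 0.954]).
V = np.array([[1 / np.sqrt(11), -np.sqrt(10 / 11)],
              [np.sqrt(10 / 11), 1 / np.sqrt(11)]])

for H in (H1, H2):
    A = np.vstack([H, np.eye(2)])      # MMSE-augmented matrix [H; I_2]
    _, T = np.linalg.qr(A @ V)
    print(np.abs(np.diag(T)))          # close to [sqrt(10), sqrt(10)] both times
```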
Network Modulation: MIMO Multicast Scheme
Transmitter: info bits → splitter → b_1, …, b_{N_t} → codebooks 1, …, N_t → x̃_1, …, x̃_{N_t} → precoder V → x
Effective channel: ỹ_i = Q_i† y_i = T_i x̃ + z̃_i, where z̃_i = Q_i† z_i ~ CN(0, I)
Receiver i: y_i → Q_i† → ỹ_i → GDFE decoder → x̂ (successive decoding of the sub-streams)
Extensions
Extension: Optimal MMSE (Capacity-Achieving) Scheme
- Holds for channel matrices of any dimension and rank (not necessarily equal between H_1 and H_2)
- Based upon an extension of the decomposition to non-square matrices, similar to the extension of V-BLAST from zero-forcing to MMSE
- For a non-white input covariance matrix C_x, decompose [H_i C_x^{1/2}; I_{N_t}]
- Any number of codebooks (not necessarily equal to the number of Tx antennas)
Multiple Users
Problem: We have used V to jointly triangularize two matrices. What to do for more?
- Is 2 just a bit more than 1? Or is 2 a simplified special case?
- How does one buy more degrees of freedom, and at what price?
Space-time coding to the rescue!
Space-Time Coding Structure
Bunch two channel uses together. With H_i = Q_i T_i V†:
diag(H_i, H_i) = diag(Q_i, Q_i) · diag(T_i, T_i) · diag(V, V)†
The extended matrices diag(H_i, H_i) are block-diagonal, but they need not be decomposed with block-diagonal factors. Using general (non-block-diagonal) Q̄_i and V̄:
diag(H_i, H_i) = Q̄_i T̄_i V̄†
Exploiting the off-diagonal zeros enables JET of more users!
Space-Time Coding Structure [Kh., Hitron, Erez ISIT 2011] [Livni, Kh., Hitron, Erez ISIT 2012]
- Any number of users K
- Any number of antennas at each node
- Joint constant-diagonal triangularization of K matrices
- Process N_t^(K-1) symbols jointly
- Prefix/suffix loss of N_t^(K-1) symbols total
- Numerical evidence: can be improved!
Applications
Gaussian Permuted Parallel Channels
General channels: [Willems, Gorokhov] [Hof, Sason, Shamai]
n parallel channels: y_i = α_i x_i + z_i
- The gains {α_i} are known
- The order of the gains is not known at the Tx, but is known at the Rx
Equivalent problem: be optimal for all permutation orders simultaneously.
A special case of the MIMO multicasting problem, with n! effective channel matrices:
H_i = diag(α_{π_i(1)}, α_{π_i(2)}, …, α_{π_i(n)}),  π_i ∈ S_n,  i = 1, …, n!
Optimal precoding matrices [Hitron, Kh., Erez ISIT 2012]:
- 2 gains: Hadamard/DFT; single channel use
- 3 gains: DFT; single channel use
- 4-6 gains: quaternion-based matrices; two channel uses
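The two-gain case can be illustrated numerically (a zero-forcing sketch, not the full MMSE scheme of the reference; the gains a and b are arbitrary illustration values): with a 2×2 Hadamard/DFT precoder, the QR diagonals of H_π V come out the same for both gain orders, so one pair of fixed-rate codebooks works regardless of the permutation.

```python
import numpy as np

a, b = 3.0, 1.0
V = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # 2x2 Hadamard/DFT
diags = []
for Hp in (np.diag([a, b]), np.diag([b, a])):          # both permutations
    _, T = np.linalg.qr(Hp @ V)
    diags.append(np.abs(np.diag(T)))
print(diags[0], diags[1])  # identical sub-channel gains for both orders
```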
Gaussian Rateless (Incremental Redundancy) Coding
y = αx + z, where α is unknown at the Tx but known at the Rx
- Rx sends NACKs/ACKs until it is able to recover the message
- Assume α can take only a finite number of values: α_1, α_2, …
- Can be represented as a MIMO multicasting problem [Kh., Kochman, Erez, Wornell ITW 2011]
Example: α ∈ {α_1, α_2}, α_1 > α_2, C_1 = 2C_2. Effective matrices:
H_1 = [α_1 0],  H_2 = [α_2 0; 0 α_2]
- Coincides with the solution of [Erez, Trott, Wornell]
- Works for MIMO channels H_1, H_2 (replacing α_1, α_2)
Half-Duplex Relay
Setup: the transmitter (x_t) reaches the relay over h_t,rel and the receiver over h_t,r; the relay (x_rel) reaches the receiver over h_rel,r.
- Half-duplex: the relay can receive or transmit, but not both
- Decode-and-forward implementation: rateless relay
Effective matrices [Kh., Kochman, Erez, Wornell ITW 2011] (diagonal, one entry per channel use):
H_1 = diag(√P_1 h_t,rel, 0, …, 0)
H_2 = diag(√P_1 h_t,r, √P_2 h_t,r + √P_rel h_rel,r, …, √P_2 h_t,r + √P_rel h_rel,r)
MIMO Two-Way Relay (New Achievable) [Kh., Kochman, Erez ISIT 2011]
Two nodes want to exchange messages via a relay:
(a) MAC phase: nodes → relay over H_1, H_2; (b) broadcast phase: relay → nodes over G_1, G_2
MAC phase:
- Apply JET to H_1 and H_2 (roles of V and Q switched)
- Use dirty-paper coding to pre-cancel off-diagonal elements (replaces the successive interference cancellation of the broadcast)
Broadcast (multicast!) phase:
- Use the previously discussed multicasting scheme
MIMO Multicasting of a Gaussian Source [Kochman, Kh., Erez ICASSP 2011] [Kh., Kochman, Erez SP 2012]
- Separation does not hold!
- A different triangularization is needed: (N_t − 1) sub-channels with equal diagonal values (the last gain may differ)
- Combine with a hybrid digital/analog scheme
- The decomposition is possible under a generalized Weyl condition
- When the decomposition is possible: a new achievable distortion!
- For 2 transmit antennas: optimum performance!
Summary: Multicast is (Almost) Everywhere...
Network modulation connects multicast to: rateless codes, permuted channels, full-duplex relay, parallel relay, half-duplex relay, two-way relay, Gaussian source multicast (JSCC), and more...?
Even now, me talking to you...