Optimality of Gaussian Fronthaul Compression for Uplink MIMO Cloud Radio Access Networks

Yuhan Zhou, Yinfei Xu, Jun Chen, and Wei Yu
Department of Electrical and Computer Engineering, University of Toronto, Canada
School of Information Science and Engineering, Southeast University, China
Department of Electrical and Computer Engineering, McMaster University, Canada
Emails: {yzhou, weiyu}@ece.utoronto.ca, yinfeixu@seu.edu.cn, junchen@ece.mcmaster.ca

Abstract: This paper investigates the compress-and-forward scheme for an uplink cloud radio access network (C-RAN) model, where multi-antenna base-stations (BSs) are connected to a cloud-computing based central processor (CP) via capacity-limited fronthaul links. The BSs perform Wyner-Ziv coding to compress and send the received signals to the CP; the CP performs either joint decoding of both the quantization codewords and the user messages at the same time, or the more practical successive decoding of the quantization codewords first, then the user messages. Under this setup, this paper makes progress toward the optimization of the fronthaul compression scheme by proving two results. First, it is shown that if the input distributions are assumed to be Gaussian, then under joint decoding, the optimal Wyner-Ziv quantization scheme for maximizing the achievable rate region is Gaussian. Second, for fixed Gaussian input, under a sum fronthaul capacity constraint and assuming Gaussian quantization, this paper shows that successive decoding and joint decoding achieve the same maximum sum rate. In this case, the optimization of Gaussian quantization noise covariance matrices for maximizing the sum rate can be formulated as a convex optimization problem, and can therefore be solved efficiently.

I. INTRODUCTION

This paper considers the uplink of a cloud radio access network (C-RAN) under finite-capacity fronthaul constraints. The uplink C-RAN model, as shown in Fig.
1, consists of multiple remote users sending independent messages to a cloud-computing based central processor (CP) through multiple multi-antenna base-stations (BSs). The BSs and the CP are connected with digital fronthaul links with per-link capacity C_l. All the user messages are eventually decoded at the CP. This channel model can be thought of as a two-hop relay network, with an interference channel between the users and the BSs, followed by a noiseless multiple-access channel between the BSs and the CP. The C-RAN architecture enables multicell processing, which significantly improves the performance of wireless cellular networks by effectively mitigating intercell interference.

A key question in the design of the C-RAN architecture is the optimal choice of input distribution at the remote users, the optimal coding strategy at the BSs, and the optimal decoding strategy at the CP. Toward this end, this paper restricts attention to fixed Gaussian input distribution, Wyner-Ziv compress-and-forward relaying strategy at the BSs, and either successive decoding of the quantization codewords first, then the user messages, or joint decoding of the quantization codewords and user messages together at the CP.

Fig. 1. Uplink of a cloud radio-access network with capacity-limited fronthaul links.

Our main results are that, under this assumption, the optimal Wyner-Ziv quantization scheme is Gaussian under joint decoding. Further, if we assume a sum fronthaul constraint, then under Gaussian quantization successive decoding achieves the same sum rate as joint decoding. Moreover, the optimization of Gaussian quantization covariance matrices for maximizing the sum rate under the sum fronthaul constraint can be formulated as a convex optimization problem, and can therefore be solved efficiently.
The achievable rate of compress-and-forward with joint decoding in the uplink C-RAN is first studied in [1] for a single-transmitter model and in [2] for the multi-transmitter case. The uplink C-RAN model is in fact a particular instance of a general relay network with a single destination, for which a generalization of compress-and-forward with joint decoding (known as noisy network coding) [3], [4] can be shown to achieve the information-theoretic capacity to within a constant gap. The achievable rate region of compress-and-forward with successive decoding (which has lower decoding complexity) has also been studied for the C-RAN model [5]. The recent work [6] further shows that successive decoding can achieve the sum capacity of C-RAN to within a constant gap, if the fronthaul links are subject to a sum capacity constraint. The constant-gap results in the relay literature are typically established assuming Gaussian quantization noise covariance at the relays. This paper goes one step further in proving that
the Gaussian quantizer is in fact optimal under joint decoding, if the input distributions are assumed to be Gaussian. The key insight here is a connection between the C-RAN model and the CEO problem in source coding [7], where a source is described to a central unit by remote agents with noisy observations. The solution to the CEO problem is known for the scalar Gaussian case [8], [9], while significant recent progress has been made in the vector case, e.g., [10], [11]. Finding the optimal quantization for the C-RAN model is also related to the mutual information constraint problem [12], [13], which can be solved by the entropy power inequality or the perturbation approach. In this paper, we use techniques for establishing the outer bound for the Gaussian vector CEO problem [11] to prove the optimality of Gaussian quantization.

This paper makes further progress in observing that the optimization of Gaussian quantization noise covariance matrices can be reformulated as a convex optimization problem under joint decoding, and that successive decoding and joint decoding are equivalent for maximizing the sum rate under a special case of the uplink C-RAN model with a sum-capacity fronthaul constraint. The quantization noise covariance optimization problem has been considered in the literature, but only locally convergent algorithms were known previously [14], [15]. The convex formulation proposed in this paper allows the globally optimal Gaussian quantization noise covariance to be found efficiently.

The assumption of fixed Gaussian input distribution is crucial for the current setup. It is not difficult to come up with examples where non-Gaussian input can outperform Gaussian input [5]. This paper optimizes the quantization noise covariance matrices under the fixed Gaussian input. We remark that the joint optimization of input and quantization covariance matrices remains a difficult problem [16].

Notation: Superscripts (.)^H and (.)^{-1} denote the Hermitian transpose and matrix inverse operators; cov(.) denotes the covariance operation.
For a vector X, X_S denotes a vector/matrix with elements whose indices are elements of S. Given matrices {X_1, ..., X_L}, diag({X_l}_{l=1}^L) denotes the block diagonal matrix formed with X_l on the diagonal. Denote by J(X) the Fisher information matrix of random vector X. Let K = {1, ..., K} and L = {1, ..., L}.

II. ACHIEVABLE RATE REGIONS FOR UPLINK C-RAN

This paper considers the uplink C-RAN, where K mobile users communicate with a CP through L BSs, as shown in Fig. 1. The noiseless fronthaul links connecting the BSs with the CP have per-link capacity C_l. Each user terminal is equipped with M antennas; each BS is equipped with N antennas. Let X_k ~ CN(0, K_k) be the signal transmitted by the kth user. The signal received at the lth BS can be expressed as

Y_l = sum_{k=1}^{K} H_{l,k} X_k + Z_l,  l = 1, 2, ..., L,

where Z_l ~ CN(0, Σ_l) are independent noises, and H_{l,k} denotes the complex channel matrix from X_k to Y_l. Fix the Gaussian input distribution as above.

We use R_JD to denote the achievable rate region of the compress-and-forward relay scheme with joint decoding [1, Proposition IV.1] [4, Theorem 1], i.e., the set of (R_1, ..., R_K) for which

sum_{k in T} R_k <= sum_{l in S} [ C_l - I(Y_l; Ŷ_l | X_K) ] + I(X_T; Ŷ_{S^c} | X_{T^c})   (1)

for all T ⊆ K and S ⊆ L, for some prod_{l=1}^{L} p(Ŷ_l | Y_l). Likewise, we use R_SD to denote the achievable rate region under fixed Gaussian input distribution of the compress-and-forward scheme with successive decoding [5, Theorem 1], i.e., the set of (R_1, ..., R_K) for which

sum_{k in T} R_k <= I(X_T; Ŷ_L | X_{T^c}),  for all T ⊆ K,   (2)

for some product distribution prod_{l=1}^{L} p(Ŷ_l | Y_l) that satisfies

I(Y_S; Ŷ_S | Ŷ_{S^c}) <= sum_{l in S} C_l,  for all S ⊆ L.   (3)

Note that (2) is the multiple-access rate region, (3) represents the Wyner-Ziv decoding constraint, while (1) incorporates the joint decoding of the transmit and quantization codewords.

We now evaluate the joint decoding and successive decoding regions under Gaussian quantization, denoted as R^G_JD and R^G_SD, respectively. Set p(Ŷ_l | Y_l) ~ CN(Y_l, Q_l), where Q_l is the Gaussian quantization noise covariance matrix at the lth BS. Instead of parameterizing over Q_l, we parameterize over B_l defined as

B_l = (Σ_l + Q_l)^{-1}.   (4)

Proposition 1: Fix Gaussian input X_k ~ CN(0, K_k).
The joint decoding region R^G_JD for the C-RAN model under Gaussian quantization is the set of (R_1, ..., R_K) such that

sum_{k in T} R_k <= sum_{l in S} [ C_l + log |I - B_l Σ_l| ] + log | sum_{l in S^c} H_{l,T}^H B_l H_{l,T} K_T + I |   (5)

for all T ⊆ K and S ⊆ L, for some 0 ⪯ B_l ⪯ Σ_l^{-1}, where K_T = E[X_T X_T^H] is the covariance matrix of X_T, and H_{l,T} denotes the channel matrix from X_T to Y_l.

Proposition 2: Fix Gaussian input X_k ~ CN(0, K_k). The successive decoding region R^G_SD for the C-RAN model under Gaussian quantization is given by the set of (R_1, ..., R_K) such that

sum_{k in T} R_k <= log | sum_{l=1}^{L} H_{l,T}^H B_l H_{l,T} K_T + I |,  for all T ⊆ K,   (6)
for some 0 ⪯ B_l ⪯ Σ_l^{-1} satisfying

log ( | sum_{l=1}^{L} H_{l,K}^H B_l H_{l,K} K_K + I | / | sum_{l in S^c} H_{l,K}^H B_l H_{l,K} K_K + I | ) - sum_{l in S} log |I - B_l Σ_l| <= sum_{l in S} C_l,  for all S ⊆ L.   (7)

To derive the above expressions, we first use B_l = (Σ_l + Q_l)^{-1} to evaluate

I(Y_l; Ŷ_l | X_K) = log ( |Σ_l + Q_l| / |Q_l| ) = - log |I - B_l Σ_l|.   (8)

Further, in deriving (5), I(X_T; Ŷ_{S^c} | X_{T^c}) is evaluated as

log ( | H_{S^c,T} K_T H_{S^c,T}^H + diag({Σ_l + Q_l}_{l in S^c}) | / | diag({Σ_l + Q_l}_{l in S^c}) | ).   (9)

Substituting (8) and (9) into (1) gives (5). Likewise a similar expression holds for I(X_T; Ŷ_L | X_{T^c}), together giving (6) and (7). In deriving (7), we start with the chain rule of mutual information

I(X_K; Ŷ_S | Ŷ_{S^c}) + I(Y_S; Ŷ_S | X_K, Ŷ_{S^c}) = I(Y_S; Ŷ_S | Ŷ_{S^c}) + I(X_K; Ŷ_S | Y_S, Ŷ_{S^c}),   (10)

and make use of the Markov chain Ŷ_i - Y_i - X_K - Y_j - Ŷ_j, i != j, to note that I(Y_S; Ŷ_S | X_K, Ŷ_{S^c}) = I(Y_S; Ŷ_S | X_K), and I(X_K; Ŷ_S | Y_S, Ŷ_{S^c}) = 0. Hence,

I(Y_S; Ŷ_S | Ŷ_{S^c}) = I(X_K; Ŷ_S | Ŷ_{S^c}) + sum_{l in S} I(Y_l; Ŷ_l | X_K)
                     = I(X_K; Ŷ_L) - I(X_K; Ŷ_{S^c}) + sum_{l in S} I(Y_l; Ŷ_l | X_K).

Then, (7) can be derived by evaluating the above mutual information expressions assuming Gaussian input and Gaussian quantization test channels.

Clearly, in general we have R^G_SD ⊆ R_SD, R^G_JD ⊆ R_JD, and R^G_SD ⊆ R^G_JD. However, Gaussian quantization is desirable, because it leads to achievable rate regions that can be easily evaluated. Further, successive decoding is more desirable than joint decoding, since it has much lower complexity. Therefore, this paper focuses on R^G_JD and R^G_SD. We go toward establishing the optimality of Gaussian quantization and the efficient optimization of the rate region by showing that under fixed Gaussian input: (1) Gaussian quantization is optimal for joint decoding, i.e., R^G_JD = R_JD; (2) if we assume a sum fronthaul capacity constraint, the maximum sum rates achieved by successive decoding and joint decoding under Gaussian quantization are identical, and the optimization of the Gaussian quantization noise covariance matrices for maximizing the sum rate can be formulated as a convex optimization problem, which can be solved efficiently.

III. OPTIMALITY OF GAUSSIAN QUANTIZATION UNDER JOINT DECODING

Theorem 1: For the uplink C-RAN under fixed Gaussian input distribution and assuming joint decoding, Gaussian quantization is optimal, i.e.,
R^G_JD = R_JD.

Proof: Recall that the achievable rate region of the compress-and-forward scheme under joint decoding is given by the set of (R_1, ..., R_K) given by (1) under the joint distribution

p(X_K, Y_L, Ŷ_L) = prod_{k=1}^{K} p(X_k) prod_{l=1}^{L} p(Y_l | X_K) prod_{l=1}^{L} p(Ŷ_l | Y_l).   (11)

Fix p(Ŷ_l | Y_l), and choose B_l with 0 ⪯ B_l ⪯ Σ_l^{-1} such that

cov(Y_l | X_K, Ŷ_l) = Σ_l - Σ_l B_l Σ_l,  l = 1, ..., L.

We proceed to show that the achievable rate region as given by (5) with a Gaussian p(Ŷ_l | Y_l) ~ CN(Y_l, Q_l), where Q_l = B_l^{-1} - Σ_l, is as large as that of (1). First, note that

I(Y_l; Ŷ_l | X_K) = log |(πe) Σ_l| - h(Y_l | X_K, Ŷ_l)
                 >= log |(πe) Σ_l| - log |(πe) cov(Y_l | X_K, Ŷ_l)|
                 = - log |I - B_l Σ_l|,  l = 1, ..., L,   (12)

where we use the fact that the Gaussian distribution maximizes differential entropy. Moreover, we have

I(X_T; Ŷ_{S^c} | X_{T^c}) = h(X_T) - h(X_T | X_{T^c}, Ŷ_{S^c})
                         >= log |(πe) K_T| - log |(πe) J^{-1}(X_T | X_{T^c}, Ŷ_{S^c})|,

where the inequality is due to [11, Lemma 2] [17]. Since

Y_{S^c} = H_{S^c,T} X_T + H_{S^c,T^c} X_{T^c} + Z_{S^c},

it follows that

X_T = E[X_T | X_{T^c}, Y_{S^c}] + N_{T,S^c} = sum_{l in S^c} G_{l,T} (Y_l - H_{l,T^c} X_{T^c}) + N_{T,S^c},

where

G_{l,T} = ( K_T^{-1} + sum_{j in S^c} H_{j,T}^H Σ_j^{-1} H_{j,T} )^{-1} H_{l,T}^H Σ_l^{-1},   (13)

and

N_{T,S^c} ~ CN(0, Λ_{T,S^c})   (14)

with Λ_{T,S^c} = ( K_T^{-1} + sum_{l in S^c} H_{l,T}^H Σ_l^{-1} H_{l,T} )^{-1}. By the matrix complementary identity between the Fisher information matrix
and the MMSE [11, Lemma 3] [18], we have

J(X_T | X_{T^c}, Ŷ_{S^c}) = Λ_{T,S^c}^{-1} - Λ_{T,S^c}^{-1} cov( sum_{l in S^c} G_{l,T} (Y_l - H_{l,T^c} X_{T^c}) | X_K, Ŷ_{S^c} ) Λ_{T,S^c}^{-1}.

Therefore,

J(X_T | X_{T^c}, Ŷ_{S^c}) = Λ_{T,S^c}^{-1} - Λ_{T,S^c}^{-1} [ sum_{l in S^c} G_{l,T} cov(Y_l | X_K, Ŷ_l) G_{l,T}^H ] Λ_{T,S^c}^{-1}
                         = K_T^{-1} + sum_{l in S^c} H_{l,T}^H B_l H_{l,T},

where the first equality holds because, given X_K, the pairs (Y_l, Ŷ_l) are independent across l, and the second equality follows from G_{l,T} = Λ_{T,S^c} H_{l,T}^H Σ_l^{-1} and cov(Y_l | X_K, Ŷ_l) = Σ_l - Σ_l B_l Σ_l. Hence,

I(X_T; Ŷ_{S^c} | X_{T^c}) >= log | K_T^{-1} + sum_{l in S^c} H_{l,T}^H B_l H_{l,T} | + log |K_T| = log | sum_{l in S^c} H_{l,T}^H B_l H_{l,T} K_T + I |   (15)

for all T ⊆ K and S ⊆ L. Combining (12) and (15), we conclude that R^G_JD as given by (5) is as large as R_JD as given by (1). Therefore, R^G_JD = R_JD.

We observe further that the optimization of the Gaussian quantization noise covariance matrices under joint decoding is the following convex optimization problem over (R_k, B_l):

maximize  sum_{k=1}^{K} μ_k R_k   (16)
subject to  sum_{k in T} R_k <= sum_{l in S} [ C_l + log |I - B_l Σ_l| ] + log | sum_{l in S^c} H_{l,T}^H B_l H_{l,T} K_T + I |,
            0 ⪯ B_l ⪯ Σ_l^{-1},

where the set of constraints is over all T ⊆ K and S ⊆ L. Note that the number of constraints grows exponentially with the size of the network. Because of this, the above optimization problem can only be solved for small networks in practice.

IV. OPTIMIZATION OF SUCCESSIVE DECODING REGION UNDER SUM FRONTHAUL CONSTRAINT

Successive decoding is more practical than joint decoding because of its lower complexity, but it can also give a lower rate [5]. In this section, we show that in the special case where the fronthaul links are subject to a sum capacity constraint, successive decoding actually achieves the same maximum sum rate as joint decoding, assuming Gaussian input and Gaussian quantization. Combined with the result on the optimality of Gaussian quantization for joint decoding, this implies that for maximizing the sum rate under the sum fronthaul constraint, Gaussian quantization with successive decoding is optimal.

More specifically, the sum fronthaul constraint is modeled as sum_{l=1}^{L} C_l <= C, which is justifiable in certain situations where the fronthaul links are implemented in a shared medium, as has been considered in [6], [14]. Assuming Gaussian quantization, the joint decoding sum rate R_JD,s under the sum fronthaul constraint satisfies

R_JD,s <= min { C + sum_{l=1}^{L} log |I - B_l Σ_l|,  log | sum_{l=1}^{L} H_{l,K}^H B_l H_{l,K} K_K + I | }   (17)

for some 0 ⪯ B_l ⪯ Σ_l^{-1}, which can be derived from (5) by noting that only T = K and only the constraints corresponding to S = ∅ and S = L are relevant under the sum fronthaul constraint.
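As a numerical illustration of the two terms inside the min in (17), the following sketch (not part of the paper; the toy dimensions, the random channel draw, and the one-parameter quantizer family B_l = γ Σ_l^{-1} are all assumptions for this example) evaluates the joint-decoding sum rate for a small C-RAN instance. It makes the tradeoff visible: a coarser quantizer (small γ) loosens the fronthaul term C + sum_l log|I - B_l Σ_l| but shrinks the access term.

```python
# Sketch: evaluate the joint-decoding sum rate (17) for quantizers B_l = gamma * Sigma_l^{-1}.
# Toy sizes, identity input covariance, and the seed are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
L, K_users, M, N = 3, 2, 2, 2          # BSs, users, antennas per user / per BS
C_sum = 20.0                           # sum fronthaul capacity (bits per channel use)

# Random complex channels H[l] of size N x (K_users*M); white BS noise Sigma_l = I.
H = [(rng.standard_normal((N, K_users * M))
      + 1j * rng.standard_normal((N, K_users * M))) / np.sqrt(2) for _ in range(L)]
Sigma = [np.eye(N) for _ in range(L)]
Kx = np.eye(K_users * M)               # fixed Gaussian input covariance (assumed identity)

def logdet(A):
    # log2 |A| via slogdet for numerical stability
    return np.linalg.slogdet(A)[1] / np.log(2)

def jd_sum_rate(gamma):
    B = [gamma * np.linalg.inv(S) for S in Sigma]
    # Fronthaul term: C + sum_l log|I - B_l Sigma_l|  (S = L in (5))
    fronthaul = C_sum + sum(logdet(np.eye(N) - B[l] @ Sigma[l]) for l in range(L))
    # Access term: log|sum_l H_l^H B_l H_l Kx + I|    (S = empty set in (5))
    access = logdet(np.eye(K_users * M)
                    + sum(H[l].conj().T @ B[l] @ H[l] for l in range(L)) @ Kx)
    return min(fronthaul, access)

rates = {g: jd_sum_rate(g) for g in (0.3, 0.6, 0.9)}
```

Since log|I - B_l Σ_l| <= 0, any achieved rate is capped by C_sum, and γ = 0 (no information forwarded) gives rate zero, consistent with (17).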
Likewise, based on (6) and (7), the sum rate for successive decoding R_SD,s is given by

R_SD,s <= log | sum_{l=1}^{L} H_{l,K}^H B_l H_{l,K} K_K + I |   (18)

for some 0 ⪯ B_l ⪯ Σ_l^{-1} that satisfies

log | sum_{l=1}^{L} H_{l,K}^H B_l H_{l,K} K_K + I | - sum_{l=1}^{L} log |I - B_l Σ_l| <= C.   (19)

Let R*_JD,s and R*_SD,s be the maximum sum rates under (17) and (18)-(19), respectively. Clearly, R*_JD,s >= R*_SD,s, since any B_l that satisfies the constraint (19) also gives R_JD,s = R_SD,s. To show that R*_JD,s <= R*_SD,s, observe that if this is not the case, then the optimal B_l that attains R*_JD,s must satisfy

C + sum_{l=1}^{L} log |I - B_l Σ_l| < log | sum_{l=1}^{L} H_{l,K}^H B_l H_{l,K} K_K + I |.   (20)

But then, we can scale B̃_l = γ B_l with 0 < γ < 1 to increase the left-hand side and decrease the right-hand side in the above, leading to a higher R_JD,s. This contradicts the optimality of B_l. Thus, we have proved:

Theorem 2: Under the sum fronthaul constraint and with fixed Gaussian input, the maximum sum rates achieved by successive decoding and joint decoding over Gaussian quantization are the same, i.e., R*_SD,s = R*_JD,s.

A consequence of this result is that under the sum fronthaul constraint, the optimization of the Gaussian quantization noise covariances for maximizing the sum rate under successive decoding can be formulated as a convex optimization problem. More precisely, the sum rate maximization problem can be formulated as:

maximize  R  over (R, B_l) with 0 ⪯ B_l ⪯ Σ_l^{-1}   (21)
subject to  R <= log | sum_{l=1}^{L} H_{l,K}^H B_l H_{l,K} K_K + I |,
            R - sum_{l=1}^{L} log |I - B_l Σ_l| <= C.
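To make problem (21) concrete, the following sketch (an illustration, not the paper's algorithm; the scalar single-BS, single-user setting and all numbers are assumptions) solves the special case L = K = M = N = 1. There, with t = b σ² in [0, 1) and snr = |h|² K / σ², the scaling argument above implies that constraint (19) is tight at the optimum (otherwise the quantizer could be scaled up to increase the rate), so bisection on the tight constraint suffices, and the answer matches the classical single-relay compress-and-forward rate log2(1 + snr (2^C - 1) / (snr + 2^C)).

```python
# Sketch: scalar single-BS special case of problem (21), solved by bisection.
# maximize R = log2(1 + snr*t)  subject to  log2(1 + snr*t) - log2(1 - t) <= C,
# where t = b*sigma^2 in [0, 1) parameterizes the quantizer B = b.
import math

def sd_sum_rate(snr, C, tol=1e-12):
    """Max sum rate of (21) in the scalar case; constraint (19) is tight at the optimum."""
    f = lambda t: math.log2(1 + snr * t) - math.log2(1 - t) - C
    # f is increasing in t, f(0) = -C < 0, and f(t) -> +inf as t -> 1,
    # so the root (the tight-constraint point) is found by bisection.
    lo, hi = 0.0, 1.0 - 1e-15
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return math.log2(1 + snr * lo)

# Closed form for this special case: R* = log2(1 + snr*(2**C - 1)/(snr + 2**C)).
```

For example, snr = 10 and C = 3 give R* = log2(1 + 70/18), about 2.29 bits, and the bisection agrees with the closed form to numerical precision.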
It can be verified that the above problem is convex in (R, B_l), so it can be solved efficiently. Convexity is a key advantage of this reformulation as compared to previous approaches in the literature (e.g., [6], [14], [15]) that parameterize the optimization problem over the quantization noise covariances Q_l, which leads to a nonconvex formulation.

V. NUMERICAL EXAMPLE

This section presents a numerical example of a wireless cellular network with three cells forming a cooperating cluster. A total of K = 6 users are randomly located within the cluster. Both the users and the BSs are equipped with M = N = 2 antennas each. The noise power spectral density is set to be -124.6 dBm/Hz; the users' transmit power is set to be 23 dBm over 10 MHz; a distance-dependent path-loss model is used with L = 128.1 + 37.6 log10(d) (where d is in km), together with 8 dB log-normal shadowing and a Rayleigh fading component. The distance between neighboring BSs is set to be 0.5 km.

The achievable sum rates of the compress-and-forward schemes are plotted in Fig. 2. In the simulation, the sum rates of joint decoding and successive decoding with optimal Gaussian quantization are obtained under the individual fronthaul capacity constraints and the sum fronthaul capacity constraint using the convex formulations (16) and (21), respectively. As a performance comparison, the sum rate achieved by joint decoding with the quantizers suggested by noisy network coding [4], i.e., Q_l = Σ_l, which achieves capacity to within a constant gap, is also provided. This is referred to as the constant-gap quantizer. The cut-set-like sum-capacity upper bound [1]

min { log ( | sum_{k=1}^{K} H_{L,k} K_k H_{L,k}^H + Ω | / |Ω| ),  sum_{l=1}^{L} C_l },   (22)

where Ω = diag({Σ_l}_{l=1}^{L}) and H_{L,k} denotes the channel matrix from user k to all L BSs, is also plotted. It is observed that the optimal quantizer significantly outperforms the constant-gap quantizer in achievable sum rate for the C-RAN model.

VI. CONCLUSION

This paper studies the compress-and-forward scheme for an uplink C-RAN model where the BSs are connected to a CP through noiseless fronthaul links of limited capacities.
We show the optimality of Gaussian quantization under certain conditions, and show the equivalence of joint decoding and successive decoding for maximizing the sum rate under the sum fronthaul constraint. Further, we show that the optimization of the Gaussian quantizer for maximizing the sum rate under successive decoding can be cast as a convex optimization problem, which facilitates its efficient numerical solution.

REFERENCES

[1] A. Sanderovich, O. Somekh, H. V. Poor, and S. Shamai, "Uplink macro diversity of limited backhaul cellular network," IEEE Trans. Inf. Theory, vol. 55, no. 8, pp. 3457-3478, Aug. 2009.
[2] A. Sanderovich, S. Shamai, and Y. Steinberg, "Distributed MIMO receiver: Achievable rates and upper bounds," IEEE Trans. Inf. Theory, vol. 55, no. 10, pp. 4419-4438, Oct. 2009.

Fig. 2. Achievable sum rate of a C-RAN with different quantizers (per-cell sum rate in Mbps versus per-cell backhaul capacity in Mbps; curves: cut-set upper bound, optimal quantizer under the sum fronthaul constraint, optimal quantizer under individual fronthaul constraints, constant-gap quantizer).

[3] A. Avestimehr, S. Diggavi, and D. Tse, "Wireless network information flow: A deterministic approach," IEEE Trans. Inf. Theory, vol. 57, no. 4, pp. 1872-1905, Apr. 2011.
[4] S. H. Lim, Y.-H. Kim, A. El Gamal, and S.-Y. Chung, "Noisy network coding," IEEE Trans. Inf. Theory, vol. 57, no. 5, pp. 3132-3152, May 2011.
[5] A. Sanderovich, S. Shamai, Y. Steinberg, and G. Kramer, "Communication via decentralized processing," IEEE Trans. Inf. Theory, vol. 54, no. 7, pp. 3008-3023, Jul. 2008.
[6] Y. Zhou and W. Yu, "Optimized backhaul compression for uplink cloud radio access network," IEEE J. Sel. Areas Commun., vol. 32, no. 6, pp. 1295-1307, Jun. 2014.
[7] T. Berger, Z. Zhang, and H. Viswanathan, "The CEO problem [multiterminal source coding]," IEEE Trans. Inf. Theory, vol. 42, no. 3, pp. 887-902, May 1996.
[8] Y. Oohama, "Rate-distortion theory for Gaussian multiterminal source coding systems with several side informations at the decoder," IEEE Trans. Inf. Theory, vol.
51, no. 7, pp. 2577-2593, Jul. 2005.
[9] V. Prabhakaran, D. Tse, and K. Ramachandran, "Rate region of the quadratic Gaussian CEO problem," in Proc. IEEE Int. Symp. Inf. Theory (ISIT), Jun. 2004, p. 119.
[10] J. Wang and J. Chen, "Vector Gaussian multiterminal source coding," IEEE Trans. Inf. Theory, vol. 60, no. 9, pp. 5533-5552, Sep. 2014.
[11] E. Ekrem and S. Ulukus, "An outer bound for the vector Gaussian CEO problem," IEEE Trans. Inf. Theory, vol. 60, no. 11, pp. 6870-6887, Nov. 2014.
[12] C. Tian and J. Chen, "Remote vector Gaussian source coding with decoder side information under mutual information and distortion constraints," IEEE Trans. Inf. Theory, vol. 55, no. 10, pp. 4676-4680, Oct. 2009.
[13] T. Liu and P. Viswanath, "An extremal inequality motivated by multiterminal information-theoretic problems," IEEE Trans. Inf. Theory, vol. 53, no. 5, pp. 1839-1851, May 2007.
[14] A. del Coso and S. Simoens, "Distributed compression for MIMO coordinated networks with a backhaul constraint," IEEE Trans. Wireless Commun., vol. 8, no. 9, pp. 4698-4709, Sep. 2009.
[15] S.-H. Park, O. Simeone, O. Sahin, and S. Shamai, "Robust and efficient distributed compression for cloud radio access networks," IEEE Trans. Veh. Technol., vol. 62, no. 2, pp. 692-703, Feb. 2013.
[16] Y. Zhou and W. Yu, "Optimized beamforming and backhaul compression for uplink MIMO cloud radio access networks," in Proc. IEEE GLOBECOM Workshops, Dec. 2014, pp. 1487-1492.
[17] A. Dembo, T. Cover, and J. Thomas, "Information theoretic inequalities," IEEE Trans. Inf. Theory, vol. 37, no. 6, pp. 1501-1518, Nov. 1991.
[18] D. Palomar and S. Verdu, "Gradient of mutual information in linear vector Gaussian channels," IEEE Trans. Inf. Theory, vol. 52, no. 1, pp. 141-154, Jan. 2006.