Weiss-Weinstein Bounds for Various Priors
Florian Xaver, Christoph F. Mecklenbräuker, Peter Gerstoft, and Gerald Matz

Abstract: We address analytic solutions of the Weiss-Weinstein bound (WWB), which lower bounds the mean squared error of Bayesian inferrers. The bound supports discrete, absolutely continuous, and singular continuous probability distributions, the latter corresponding to joint estimation and detection. We present new analytical solutions for truncated Gaussian, Laplace, categorical, uniform, and Bernoulli distributions. We focus on sparse signals modeled by a Laplace prior as used in Bayesian LASSO methods, priors of truncated Gaussian densities, and uninformative priors. In general, finding the tightest WWB of a model is a non-convex optimization problem. Hence, we show numerical examples of known and new WWBs to gain additional insight.

Index Terms: Weiss-Weinstein lower bound, Bayesian inference, mean squared error, LASSO, categorical, uniform, Laplace, truncated Gaussian

I. INTRODUCTION

In this paper we address discrete-time stochastic models in a Bayesian context. Random parameters are inferred from random observations. Estimation and detection (classification) are types of inference for continuous and discrete random parameters, respectively. Inferrers are designed by minimizing an expected loss function. If the loss function is the mean squared error, then the optimization result is the minimum mean squared error (MMSE) inferrer. Hence, the mean squared error of an inferrer is a performance indicator. The study of lower bounds on the mean squared error of a Bayesian estimator [1]–[4] entails various bounds [5]–[8]. The most prominent bound is the Bayesian Cramér-Rao bound (CRB) for linear models, where the prior and the likelihood function are Gaussian. In this case, the mean squared error of the MMSE estimator coincides with the CRB.
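For the Gaussian linear model, the coincidence of the MMSE estimator's mean squared error with the Bayesian CRB can be checked by simulation. The following is our own illustrative sketch (scalar model y = x + v; the variances and sample size are of our choosing, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
sx2, sv2 = 2.0, 2.0                       # prior and noise variances (illustrative)
x = rng.normal(0.0, np.sqrt(sx2), 200_000)
y = x + rng.normal(0.0, np.sqrt(sv2), 200_000)

xhat = sx2 / (sx2 + sv2) * y              # MMSE estimator for this Gaussian model
mse = np.mean((xhat - x) ** 2)            # empirical mean squared error
crb = 1.0 / (1.0 / sx2 + 1.0 / sv2)       # Bayesian CRB; equals the MMSE here
```

With these variances the CRB equals 1, and the empirical MSE of the MMSE estimator matches it up to Monte Carlo noise.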
We are interested in Bayesian lower bounds for the inference of parameters which are jointly applicable to discrete and continuous random state variables. Additionally, the bound shall be valid for probability densities with finite support. It turns out that the regularity conditions for the applicability of the Bayesian CRB are too restrictive for discrete parameters [9], [10]. We want bounds with relaxed regularity conditions which are applicable to discrete parameters, especially to discrete parameters that stem from quantized continuous states. This requirement guides us to the Weiss-Weinstein bounds (WWB) [5], [7], [11], [12] and their sequential versions [10], [13]–[15].

Florian Xaver (florian@xaver.me), Gerald Matz (gmatz@nt.tuwien.ac.at), and Christoph Mecklenbräuker (cfm@nt.tuwien.ac.at) are with the Institute of Telecommunications, Vienna University of Technology, Gusshausstrasse 25/E389, 1040 Vienna, Austria. Peter Gerstoft (gerstoft@ucsd.edu) is with the Scripps Institution of Oceanography, University of California, San Diego, 9500 Gilman Dr, La Jolla, CA, USA.

The WWB is parametrized by a test point. The optimal test point provides the tightest WWB. For linear models and sample space R^N, the WWB depends on the Bhattacharyya coefficient (BC) [10]. If the sample space is a subset of R^N, then this leads to a generalized Bhattacharyya coefficient (GBC). Using analytic solutions of the (G)BC for popular distributions, numerical computations give insight into the (G)BC and WWB. We open with a summary of WWBs in Section II, adding singular-continuous distributions that induce joint estimation and detection (e.g. in [5]). In Section III we present the BC stemming from Gaussian distributions and derive the GBC for categorical, Bernoulli, Laplace, uniform, and truncated Gaussian distributions. The newly derived (G)BCs are used with these priors in Sections V to VII. A Laplacian prior induces sparsity and leads to the Bayesian least absolute shrinkage and selection operator (LASSO).
The truncated Gaussian prior introduces a finite support as in practical problems, whereas a uniform prior is uninformative and links to frequentist models.

II. WEISS-WEINSTEIN BOUNDS

Consider the probability space (R^N, B, P_x) with the N-dimensional sample space R^N, the induced Borel algebra B, and the probability measure P_x(B) = P(x ∈ B) [16], [17]. Due to the Lebesgue decomposition, a probability distribution may consist of an absolutely continuous, a singular (discrete), and a singular continuous part, i.e.

  P_x = c_1 P_x^c + c_2 P_x^d + c_3 P_x^s,   c_1 + c_2 + c_3 = 1.

Thus, the expectation of a measurable vector-valued function g(x) is

  E_x(g(x)) := ∫_{R^N} g(x) dP_x(x) = c_1 ∫_{R^N} g(x) f_x(x) dx + c_2 Σ_{x ∈ C} g(x) p_x(x) + c_3 ∫_{R^N} g(x) dP_x^s(x),

with a countable set C ⊂ R^N, the probability density function (PDF) f_x(x) = dP_x^c(x)/dλ^c(x), and the probability mass function (PMF) p_x(x) = dP_x^d(x)/dλ_C^d(x). If P_x^s = P_x^{α_1} ⊗ ... ⊗ P_x^{α_N}, α_i ∈ {c, d}, is a product measure, then a hybrid density ν_x(x) exists as a product of PDFs and PMFs. We call all three types simply probability density (PD) [18]. In what follows, we use the notation λ_x for probability measures whenever we assume the existence of a PD for the random variable x. To simplify notation, we write E(·) := E_{x,y}(·), f(x) := f_x(x), p(x) := p_x(x), and ν(x) := ν_x(x).

For the Bayesian estimate x̂(y), the parameters x ∈ R^N are inferred from measurements y ∈ R^M. In what follows, we consider the affine measurement equation

  y = Cx + v,   x ∼ ν_x(x),   (1)
with measurement matrix C : R^N → R^M and random noise v ∈ R^M. The resulting estimation error of estimate x̂ is defined by e := x̂(y) − x. The mean squared error of the inferrer is lower bounded by the Weiss-Weinstein matrix W parametrized by a test-point matrix H = [h_1, ..., h_N]. The test point h_a, a = 1, ..., N, pertains to element [x]_a. For the bound, we insert s_i = s_j = 1/2 and (38) into (39)–(41) of [5] and reformulate. Hence, the mean-squared-error matrix is bounded as (left-hand side minus right-hand side is positive semi-definite)

  E(ee^T) ⪰ W(H) := H J^{-1} H^T,

where J = J(H) depends on H with typical element

  [J]_{a,b} := [ϱ(h_a, −h_b) − ϱ(h_a, h_b)] / [ρ_v(Ch_a, 0) ρ_x(h_a, 0) ρ_v(Ch_b, 0) ρ_x(h_b, 0)],   (2)

with the symmetrized version of the generalized non-metric Bayesian Bhattacharyya coefficient

  ϱ(h_a, h_b) := ρ_v(Ch_a, Ch_b) ρ_x(h_a, h_b) + ρ_v(−Ch_a, −Ch_b) ρ_x(−h_a, −h_b),   (3)

and the non-symmetric generalized Bhattacharyya coefficient (GBC)

  ρ_ν(h_a, h_b) = ∫_V ν^{1/2}(v + h_a) ν^{1/2}(v − h_b) dλ(v),   (4)

where V = {v : ν(v) > 0}. The GBC ρ_x is defined analogously. Due to the product measure λ, the integral in (4) is an abbreviation of sums and Lebesgue integrals. The multiplications of the GBCs in (2) and (3) are due to linearity and statistical independence. Eq. (3) simplifies for symmetric PDs. For infinite support of ν(·), (4) becomes the BC [19]–[21]

  ρ_ν(h_a, h_b) = ∫_{R^N} ν^{1/2}(v + h_a) ν^{1/2}(v − h_b) dλ(v).   (5)

Observe that even after substituting v := v − h_b in (4), the GBC still depends on both h_a and h_b. We will see that the BCs of the Gaussian and Laplace distributions depend only on the sum h := h_a + h_b, and that PDs with finite support induce box constraints on the test-point vector h [19].

III. ANALYTIC SOLUTIONS

In what follows, we derive analytic solutions of the BC for several distributions. Observe from (4) that if ν(v) = ν_1(v_1) ⋯ ν_N(v_N), then ρ_ν(h_a, h_b) = ρ_{ν_1}(h_{a,1}, h_{b,1}) ⋯ ρ_{ν_N}(h_{a,N}, h_{b,N}). Thus, we consider univariate PDs whenever the elements of the parameter vector are assumed to be statistically independent.

A. Categorical Distribution

Categorical distributions are defined by a probability mass function. If the categorical distribution stems from the equidistant quantization (discretization) of a continuous distribution with values v_l ∈ R^N, l ∈ L ⊂ Z^N, we can use (1) to (4). In this case, the mean squared error makes sense. Inserting its density p(v) = Σ_{l ∈ L} p_l 1_{v = v_l}, with probability masses p_l and indicator function 1, into (4) gives

  ρ_C(h_a, h_b) = Σ_{l ∈ L} [p(v_l + h_a) p(v_l − h_b)]^{1/2}.

Observe that the GBC depends on both h_a and h_b. The Bernoulli distribution p_0 1_{v=0} + p_1 1_{v=1} and the discrete uniform distribution Unif{r, s} with equal masses p_l = 1/(s − r + 1) and |h_a ± h_b| ≤ s − r are examples of categorical distributions. Next, we address absolutely continuous distributions.

B. Uniform Distribution

Let us insert the continuous uniform distribution Unif{r, s}, i.e. f(v) = 1/(s − r) for r ≤ v ≤ s, into (4):

  ρ(h_a, h_b) = 1/(s − r) ∫ 1_{r ≤ v + h_a ≤ s} 1_{r ≤ v − h_b ≤ s} 1_{r ≤ v ≤ s} dv.

We distinguish four cases of integration limits according to the signs of h_a and h_b. Merging the first two and splitting the last two, the GBC becomes

  ρ_cu(h_a, h_b) =
    1 − |h|/(s − r),   h_a, h_b ≥ 0 or h_a, h_b ≤ 0,
    1 − h_a/(s − r),   h_a > 0, h_b < 0, h_a > −h_b,
    1 + h_b/(s − r),   h_a > 0, h_b < 0, h_a ≤ −h_b,
    1 + h_a/(s − r),   h_a < 0, h_b > 0, −h_a > h_b,
    1 − h_b/(s − r),   h_a < 0, h_b > 0, −h_a ≤ h_b,
    0,                  else,

with |h_a ± h_b| < s − r.

C. Laplace Distribution

The Laplace distribution La{b, μ_x} is a sparse prior. Inserting the Laplace density f(v) = (1/(2b)) e^{−|v−μ|/b} into (5) and substituting w = v − μ − h_b and h = h_a + h_b gives

  (1/(2b)) ∫ e^{−(|w + h| + |w|)/(2b)} dw.

If h > 0, then the integral splits into three terms:

  w > 0, w + h > 0:  (1/(2b)) ∫_0^∞ e^{−w/b − h/(2b)} dw = (1/2) e^{−h/(2b)},
  w < 0, w + h < 0:  (1/(2b)) ∫_{−∞}^{−h} e^{w/b + h/(2b)} dw = (1/2) e^{−h/(2b)},
  w < 0, w + h > 0:  (1/(2b)) ∫_{−h}^0 e^{−h/(2b)} dw = (h/(2b)) e^{−h/(2b)}.

This is a revised version of the BC in [10], which contains only the first case.
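As a numerical sanity check on the Laplace case split above, the BC integral can be compared with the closed form (1 + |h|/(2b)) e^{−|h|/(2b)}. This is our own sketch; the grid quadrature and parameter values are illustrative:

```python
import numpy as np

def bc_laplace(h, b):
    # closed-form Bhattacharyya coefficient of a Laplace density, shift sum h
    a = abs(h) / (2 * b)
    return (1 + a) * np.exp(-a)

# numerical check: integrate (1/(2b)) exp(-(|w+h| + |w|)/(2b)) over w
b, h = 1.0, 0.7
w, dw = np.linspace(-40.0, 40.0, 400001, retstep=True)
integrand = np.exp(-(np.abs(w + h) + np.abs(w)) / (2 * b)) / (2 * b)
numeric = np.sum(integrand[:-1] + integrand[1:]) * dw / 2  # trapezoid rule
```

The quadrature result agrees with the closed form to well within the grid resolution.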
Due to symmetry, for h ≤ 0 there are three similar cases. The composition of the terms provides the BC

  ρ_L(h_a, h_b) = ρ_L(h) = (1 + h/(2b)) e^{−h/(2b)},  h > 0,
                           (1 − h/(2b)) e^{h/(2b)},   h ≤ 0.

D. (Truncated) Gaussian Distribution

The density of the Gaussian distribution N{μ, C_v} is

  ν(v) := (2π)^{−N/2} (det C_v)^{−1/2} e^{−(1/2) ||v − μ||²_{C_v}},

with mean μ, covariance matrix C_v, and weighted norm ||h||_{C_v} := (h^T C_v^{-1} h)^{1/2}. Then the Bayesian BC [10] is given by

  ρ_G(h_a, h_b) = ρ_G(h) := e^{−(1/8) ||h||²_{C_v}}.

The truncated Gaussian N{μ, C_v, r, s} is a Gaussian distribution limited to the support [r, s], i.e., the density is given by

  f(v) = c e^{−(1/2) ||v − μ||²_{C_v}} 1_{r ≤ v ≤ s},

with appropriate normalization factor c. Inserting f(v) into (4) and substituting u = v − μ, we get

  c ∫_{r−μ}^{s−μ} e^{−(1/4)(2||u||²_{C_v} + 2u^T C_v^{-1}(h_a − h_b) + ||h_a||²_{C_v} + ||h_b||²_{C_v})} 1_{r−μ−h_a ≤ u ≤ s−μ−h_a} 1_{r−μ+h_b ≤ u ≤ s−μ+h_b} du.

Next, we substitute w = u + h_a/2 − h_b/2 and obtain

  ρ_tG(h_a, h_b) = c ρ_G(h_a, h_b) ∫_A e^{−(1/2) ||w||²_{C_v}} dw,

where the integration region A is the intersection of the three shifted boxes [r − μ + h_a/2 − h_b/2, s − μ + h_a/2 − h_b/2], [r − μ − h_a/2 − h_b/2, s − μ − h_a/2 − h_b/2], and [r − μ + h_a/2 + h_b/2, s − μ + h_a/2 + h_b/2]. For h_a, h_b ≥ 0 and for h_a, h_b ≤ 0 this yields the limits r − μ + |h_a|/2 + |h_b|/2 and s − μ − |h_a|/2 − |h_b|/2; for mixed signs of h_a and h_b the limits follow from the same intersection. Due to the finite support, |h_a ± h_b| ≤ s − r. This integral and the factor c can in general only be evaluated numerically.

Figure 1. Weiss-Weinstein bound vs. test point for equal prior and noise distributions (h_a = h_b for univariate distributions); curves for Laplace, uniform, Gaussian, truncated Gaussian, categorical, discrete uniform, and Bernoulli distributions.

IV. NUMERICAL EXAMPLES

In the sequel, we use the scalar measurement model with C = 1. The test matrix H shrinks to a scalar, i.e. h_a = h_b in (2). Fig. 1 plots WWBs for Laplace, Gaussian N{0, 2}, uniform, and truncated Gaussian N{0, 2, −3, 3} prior and measurement distributions. In general, the WWB is non-concave in the test point and no general solution for the optimal test point h exists. Whether an analytic solution of the tightest bound exists depends on the BC of the particular distributions. However, due to symmetry, for unimodal densities only h ≥ 0 is needed and the BC is quasi-concave. The optimal test point is small. In the case of Gaussian distributions, the optimal test point tends to h_a → 0, and the WWB approaches the CRB, cf. [10] for a proof. The tightest WWB in the Gaussian case coincides with the CRB and is reached by an optimal estimator. Fig. 1 also shows the WWB of the categorical distribution {0.5, 0.2, 0.1, 0.2}, the Bernoulli distribution {0.5, 0.5}, and the discrete uniform distribution Unif{−3, 3} with unit quantization step. The shape of the WWB of the discrete distributions follows that of the continuous distributions. In the case of the Bernoulli distribution, there are only two possible test points h_a = h_b = ±1. According to [10], the WWBs of continuous distributions and their quantized discrete versions are equal for the same test points. Comparing the WWB of the discrete uniform distribution with the bound of the continuous uniform distribution, we observe how a smaller support lowers the Weiss-Weinstein bound.
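Scalar curves like those in Fig. 1 can be reproduced qualitatively from the closed-form (G)BCs. The sketch below is our own (function names and parameter values are illustrative); it uses the symmetric-PD simplification of (2) and (3) with C = 1 and h_a = h_b = h, which gives W(h) = h² (ρ_x(h) ρ_v(h))² / (2 (1 − ρ_x(2h) ρ_v(2h))), where each ρ is the (G)BC as a function of the test-point sum:

```python
import numpy as np

def rho_gauss(h, var):
    # Gaussian BC as a function of the test-point sum h
    return np.exp(-h**2 / (8 * var))

def rho_uniform(h, r, s):
    # uniform GBC for equal-sign test points, as a function of the sum h
    return np.maximum(0.0, 1.0 - np.abs(h) / (s - r))

def wwb_scalar(h, rho_x, rho_v):
    # W(h) for the scalar model with C = 1 and h_a = h_b = h
    return h**2 * (rho_x(h) * rho_v(h))**2 / (2 * (1 - rho_x(2*h) * rho_v(2*h)))

h = np.linspace(0.01, 3.0, 300)   # sweep of test points (h = 0 is excluded)
w_gg = wwb_scalar(h, lambda t: rho_gauss(t, 2.0), lambda t: rho_gauss(t, 2.0))
w_uu = wwb_scalar(h, lambda t: rho_uniform(t, -2.45, 2.45),
                     lambda t: rho_uniform(t, -2.45, 2.45))
```

For the pure Gaussian case with variance 2, the supremum over the sweep approaches the CRB (here 1) as h → 0, and the pure uniform curve stays strictly below it, consistent with the discussion above.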
Figure 2. WWB for a Laplace prior and Gaussian likelihood compared to pure Laplacian and pure Gaussian distributions.

Figure 3. WWB for truncated Gaussian densities of support [r, s] vs. test point, for supports [−2σ², 2σ²], [−σ², σ²], [−σ²/2, σ²/2], and [−3σ², 3σ²]; the Gaussian curve is for comparison (dotted).

Figure 4. WWB for a uniform prior and Gaussian likelihood function compared to the pure uniform and pure Gaussian cases.

V. SPARSE SIGNALS: LAPLACE PRIOR

The least absolute shrinkage and selection operator (LASSO) is a prominent estimation method for sparse signals x [22]. LASSO minimizes the squared error min_x ||y − Cx||²_2, where the vectors x, y are deterministic. Assuming a sparse signal x, it makes sense to constrain the l1-norm ||x||_1 = Σ_n |x_n| < b ∈ R_+. The associated Lagrangian formulation becomes

  min_x ||y − Cx||²_2 + b ||x||_1,   b ∈ R_+.

In the Bayesian context x and y are random. Then the minimization is interpreted as the maximization of the posterior distribution, if the likelihood function is Gaussian and the prior is Laplacian. In Fig. 2, we show the WWB for the scalar linear measurement model for pure Laplace distributions of x and v, pure Gaussian distributions x, v ∼ N{0, 2}, and a Laplace prior with Gaussian noise v ∼ N{0, 2}. Observe that the WWB of the mixed case lies in between the pure examples. In the limit h → 0, the bound is greater than zero.

VI. THE TRUNCATED GAUSSIAN DENSITY

Real-world applications that are modeled by a Gaussian prior are usually box constrained. This leads to the truncated Gaussian distribution. Let x, v ∼ N{0, 2, r, s}. Fig. 3 shows the WWB for different [r, s]. The Gaussian case (r → −∞ and s → ∞) is our reference. Compare the curves with those of the continuous uniform distributions in Fig. 1. They look similar.
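Section III notes that the truncated-Gaussian GBC and its normalization factor can in general only be evaluated numerically; curves like those in Fig. 3 rest on such evaluations. A minimal quadrature sketch for the scalar case (our own illustration, not the authors' code):

```python
import numpy as np

def gbc_trunc_gauss(ha, hb, mu, var, r, s, n=200001):
    # GBC (4) of a scalar truncated Gaussian N{mu, var, r, s}, trapezoid rule
    v, dv = np.linspace(r, s, n, retstep=True)
    def g(u):
        # unnormalized truncated-Gaussian density
        return np.where((u >= r) & (u <= s), np.exp(-(u - mu)**2 / (2 * var)), 0.0)
    trap = lambda y: (np.sum(y) - (y[0] + y[-1]) / 2) * dv
    z = trap(g(v))  # normalization constant (1/c in the notation above)
    return trap(np.sqrt(g(v + ha) * g(v - hb))) / z

# for a wide support the GBC approaches the Gaussian BC exp(-(ha+hb)^2 / (8 var))
wide = gbc_trunc_gauss(0.5, 0.3, 0.0, 2.0, -30.0, 30.0)
narrow = gbc_trunc_gauss(0.5, 0.3, 0.0, 2.0, -1.0, 1.0)
```

For a wide support the result matches the closed-form Gaussian BC; shrinking the support lowers the GBC, in line with the discussion of Fig. 3.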
The bound decreases with decreasing support size s − r. Compare it with the pure uniform curve in Fig. 3. The support of the truncated Gaussian density determines whether the WWB tends to the uniform or to the Gaussian WWB.

VII. UNINFORMATIVE PRIOR: UNIFORM PRIOR

A uniform prior is uninformative in the sense that it only defines a minimum and maximum value. Measurement y updates the belief about signal x. Due to the uninformative prior, the noise distribution has a strong impact on the belief update. Let us now compare a uniform prior with Gaussian noise, x ∼ Unif{−2.45, 2.45}, v ∼ N{0, 2}, with pure uniform distributions x, v ∼ Unif{−2.45, 2.45} and pure Gaussian distributions x, v ∼ N{0, 2}. For the uniform distribution we used −r = s = √3 σ to achieve the identical variance σ² = 2 for the Gaussian and uniform distributions. We plot the WWB for the scalar linear measurement model in Fig. 4. Replacing the uniform by a Gaussian likelihood function increases the lower bound. Observe that the WWB of the mixed model lies above the pure uniform case, but has a similar shape. Two characteristics influence the maximum WWB: the support and the flatness of the densities. The greater the support, the greater the possible error. The flatter the density, the greater the error.

VIII. CONCLUSION

The Weiss-Weinstein bound (WWB) is parametrized by test points leading to different levels. The WWB depends on the density's variability and on its support. WWBs apply to both discrete and absolutely continuous probability distributions. This is useful for quantized variables, where the values in the sample space have a physical interpretation. Furthermore, the WWB supports singular-continuous multivariate probability distributions with a hybrid joint density. A hybrid joint density is the product of probability densities and probability mass functions. The WWB is applicable to probability distributions that are non-differentiable or have finite support, e.g.
uniform or truncated Gaussian distributions, and distributions defined only on real positive support, e.g. exponential distributions. For linear models, the WWB depends on the generalized Bhattacharyya coefficient. If the distribution is unimodal, then the WWB is bimodal in the test point. To exemplify its use, we discussed a linear measurement model. This illustrates the application of the WWB for sparse signal models with a Laplace-distributed prior and additive Gaussian noise. The WWB for a truncated Gaussian prior tends to the normally or uniformly distributed case depending on its support. The WWB of a model with a continuous uniform prior and a Gaussian likelihood function is greater than that of the uniform-only model and has a similar shape.
REFERENCES

[1] S. M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory. NJ: Pearson Education, 1993, vol. 1.
[2] B. Ristic, S. Arulampalam, and N. Gordon, Beyond the Kalman Filter: Particle Filters for Tracking Applications. Boston: Artech House, 2004.
[3] O. Hlinka, O. Slučiak, F. Hlawatsch, P. M. Djurić, and M. Rupp, "Likelihood consensus and its application to distributed particle filtering," IEEE Trans. Signal Process., vol. 60, no. 8, Aug. 2012.
[4] O. Hlinka, F. Hlawatsch, and P. M. Djurić, "Distributed particle filtering in agent networks: A survey, classification, and comparison," IEEE Signal Process. Mag., vol. 30, no. 1, pp. 61–81, 2013.
[5] A. J. Weiss and E. Weinstein, "A general class of lower bounds in parameter estimation," IEEE Trans. Inf. Theory, vol. 34, no. 2, 1988.
[6] H. L. Van Trees and K. L. Bell, Bayesian Bounds for Parameter Estimation and Nonlinear Filtering/Tracking. IEEE Press, 2007.
[7] A. Renaux, P. Forster, P. Larzabal, C. D. Richmond, and A. Nehorai, "A fresh look at the Bayesian bounds of the Weiss-Weinstein family," IEEE Trans. Signal Process., vol. 56, no. 11, Nov. 2008.
[8] D. A. Koshaev and O. A. Stepanov, "Application of the Rao-Cramer inequality in problems of nonlinear estimation," Journal of Computer and Systems Sciences International, vol. 36, Apr. 1997.
[9] Z. Duan, V. P. Jilkov, and X. R. Li, "State estimation with quantized measurements: Approximate MMSE approach," in 11th Int. Conf. on Information Fusion, IEEE, 2008.
[10] F. Xaver, P. Gerstoft, G. Matz, and C. Mecklenbräuker, "Analytic sequential Weiss-Weinstein bounds," IEEE Trans. Signal Process., vol. 61, no. 20, 2013.
[11] N. D. Tran, A. Renaux, R. Boyer, S. Marcos, and P. Larzabal, "Weiss-Weinstein bound for MIMO radar with colocated linear arrays for SNR threshold prediction," Signal Processing, vol. 92, no. 5, 2012.
[12] A. Renaux, "Weiss-Weinstein bound for data-aided carrier estimation," IEEE Signal Process. Lett., vol. 14, no. 4, 2007.
[13] I. Rapoport and Y.
Oshman, "Recursive Weiss-Weinstein lower bounds for discrete-time nonlinear filtering," in 43rd IEEE Conf. on Decision and Control, vol. 3, Dec. 2004.
[14] S. Reece and D. Nicholson, "Tighter alternatives to the Cramer-Rao lower bound for discrete-time filtering," in 8th Int. Conf. on Information Fusion, vol. 1, Jul. 2005.
[15] F. Xaver, G. Matz, P. Gerstoft, and C. F. Mecklenbräuker, "Predictive state vector encoding for decentralized field estimation in sensor networks," in Proc. IEEE Int. Conf. Acoust., Speech, Signal Process. (ICASSP), Kyoto, JP, Mar. 2012.
[16] P. Billingsley, Probability and Measure, 3rd ed. New York: John Wiley & Sons, 2012.
[17] F. E. Burk, A Garden of Integrals, 1st ed., ser. The Dolciani Mathematical Expositions 31. The Mathematical Association of America, 2007.
[18] F. Xaver, "Mixed discrete-continuous Bayesian inference: Censored measurements of sparse signals," IEEE Trans. Signal Process., vol. 63, Nov. 2015.
[19] Z. Ben-Haim and Y. C. Eldar, "A comment on the Weiss-Weinstein bound for constrained parameter sets," IEEE Trans. Inf. Theory, vol. 54, no. 10, 2008.
[20] A. Bhattacharyya, "On a measure of divergence between two statistical populations defined by their probability distributions," Bull. Calcutta Math. Soc., vol. 35, pp. 99–109, 1943.
[21] T. Kailath, "The divergence and Bhattacharyya distance measures in signal selection," IEEE Trans. Commun. Technol., vol. 15, no. 1, pp. 52–60, Feb. 1967.
[22] T. Park and G. Casella, "The Bayesian Lasso," Journal of the American Statistical Association, vol. 103, no. 482, pp. 681–686, 2008.
NON-PARAMETRIC METHODS IN ANALYSIS OF EXPERIMENTAL DATA Rajender Parsad I.A.S.R.I., Library Aenue, New Delhi 0 0. Introduction In conentional setup the analysis of experimental data is based on the assumptions
More informationOn computing Gaussian curvature of some well known distribution
Theoretical Mathematics & Applications, ol.3, no.4, 03, 85-04 ISSN: 79-9687 (print), 79-9709 (online) Scienpress Ltd, 03 On computing Gaussian curature of some well known distribution William W.S. Chen
More informationMinimax MMSE Estimator for Sparse System
Proceedings of the World Congress on Engineering and Computer Science 22 Vol I WCE 22, October 24-26, 22, San Francisco, USA Minimax MMSE Estimator for Sparse System Hongqing Liu, Mandar Chitre Abstract
More informationEVALUATING SYMMETRIC INFORMATION GAP BETWEEN DYNAMICAL SYSTEMS USING PARTICLE FILTER
EVALUATING SYMMETRIC INFORMATION GAP BETWEEN DYNAMICAL SYSTEMS USING PARTICLE FILTER Zhen Zhen 1, Jun Young Lee 2, and Abdus Saboor 3 1 Mingde College, Guizhou University, China zhenz2000@21cn.com 2 Department
More informationSymmetric Functions and Difference Equations with Asymptotically Period-two Solutions
International Journal of Difference Equations ISSN 0973-532, Volume 4, Number, pp. 43 48 (2009) http://campus.mst.edu/ijde Symmetric Functions Difference Equations with Asymptotically Period-two Solutions
More informationModifying Voice Activity Detection in Low SNR by correction factors
Modifying Voice Activity Detection in Low SNR by correction factors H. Farsi, M. A. Mozaffarian, H.Rahmani Department of Electrical Engineering University of Birjand P.O. Box: +98-9775-376 IRAN hfarsi@birjand.ac.ir
More informationSparse Sensing for Statistical Inference
Sparse Sensing for Statistical Inference Model-driven and data-driven paradigms Geert Leus, Sundeep Chepuri, and Georg Kail ITA 2016, 04 Feb 2016 1/17 Power networks, grid analytics Health informatics
More informationCo-Prime Arrays and Difference Set Analysis
7 5th European Signal Processing Conference (EUSIPCO Co-Prime Arrays and Difference Set Analysis Usham V. Dias and Seshan Srirangarajan Department of Electrical Engineering Bharti School of Telecommunication
More informationStatistical Signal Processing Detection, Estimation, and Time Series Analysis
Statistical Signal Processing Detection, Estimation, and Time Series Analysis Louis L. Scharf University of Colorado at Boulder with Cedric Demeure collaborating on Chapters 10 and 11 A TT ADDISON-WESLEY
More informationJournal of Computational and Applied Mathematics. New matrix iterative methods for constraint solutions of the matrix
Journal of Computational and Applied Mathematics 35 (010 76 735 Contents lists aailable at ScienceDirect Journal of Computational and Applied Mathematics journal homepage: www.elseier.com/locate/cam New
More informationPosition in the xy plane y position x position
Robust Control of an Underactuated Surface Vessel with Thruster Dynamics K. Y. Pettersen and O. Egeland Department of Engineering Cybernetics Norwegian Uniersity of Science and Technology N- Trondheim,
More informationEstimation of the Optimum Rotational Parameter for the Fractional Fourier Transform Using Domain Decomposition
Estimation of the Optimum Rotational Parameter for the Fractional Fourier Transform Using Domain Decomposition Seema Sud 1 1 The Aerospace Corporation, 4851 Stonecroft Blvd. Chantilly, VA 20151 Abstract
More informationOn the Linear Threshold Model for Diffusion of Innovations in Multiplex Social Networks
On the Linear Threshold Model for Diffusion of Innoations in Multiplex Social Networks Yaofeng Desmond Zhong 1, Vaibha Sriastaa 2 and Naomi Ehrich Leonard 1 Abstract Diffusion of innoations in social networks
More informationPitch Estimation and Tracking with Harmonic Emphasis On The Acoustic Spectrum
Downloaded from vbn.aau.dk on: marts 31, 2019 Aalborg Universitet Pitch Estimation and Tracking with Harmonic Emphasis On The Acoustic Spectrum Karimian-Azari, Sam; Mohammadiha, Nasser; Jensen, Jesper
More informationA Tree Search Approach to Target Tracking in Clutter
12th International Conference on Information Fusion Seattle, WA, USA, July 6-9, 2009 A Tree Search Approach to Target Tracking in Clutter Jill K. Nelson and Hossein Roufarshbaf Department of Electrical
More informationDetecting Parametric Signals in Noise Having Exactly Known Pdf/Pmf
Detecting Parametric Signals in Noise Having Exactly Known Pdf/Pmf Reading: Ch. 5 in Kay-II. (Part of) Ch. III.B in Poor. EE 527, Detection and Estimation Theory, # 5c Detecting Parametric Signals in Noise
More informationBAYESIAN INFERENCE FOR MULTI-LINE SPECTRA IN LINEAR SENSOR ARRAY
BAYESIA IFERECE FOR MULTI-LIE SPECTRA I LIEAR SESOR ARRAY Viet Hung Tran Wenwu Wang Yuhui Luo Jonathon Chambers CVSSP, University of Surrey, UK ational Physical Laboratory (PL), UK ewcastle University,
More informationMMSE Dimension. snr. 1 We use the following asymptotic notation: f(x) = O (g(x)) if and only
MMSE Dimension Yihong Wu Department of Electrical Engineering Princeton University Princeton, NJ 08544, USA Email: yihongwu@princeton.edu Sergio Verdú Department of Electrical Engineering Princeton University
More informationProgress In Electromagnetics Research C, Vol. 11, , 2009
Progress In Electromagnetics Research C, Vol. 11, 171 182, 2009 DISTRIBUTED PARTICLE FILTER FOR TARGET TRACKING IN SENSOR NETWORKS H. Q. Liu, H. C. So, F. K. W. Chan, and K. W. K. Lui Department of Electronic
More informationDS-GA 1002 Lecture notes 11 Fall Bayesian statistics
DS-GA 100 Lecture notes 11 Fall 016 Bayesian statistics In the frequentist paradigm we model the data as realizations from a distribution that depends on deterministic parameters. In contrast, in Bayesian
More informationLecture 3: More on regularization. Bayesian vs maximum likelihood learning
Lecture 3: More on regularization. Bayesian vs maximum likelihood learning L2 and L1 regularization for linear estimators A Bayesian interpretation of regularization Bayesian vs maximum likelihood fitting
More informationAsymptotic Normality of an Entropy Estimator with Exponentially Decaying Bias
Asymptotic Normality of an Entropy Estimator with Exponentially Decaying Bias Zhiyi Zhang Department of Mathematics and Statistics Uniersity of North Carolina at Charlotte Charlotte, NC 28223 Abstract
More informationVariance Reduction for Stochastic Gradient Optimization
Variance Reduction for Stochastic Gradient Optimization Chong Wang Xi Chen Alex Smola Eric P. Xing Carnegie Mellon Uniersity, Uniersity of California, Berkeley {chongw,xichen,epxing}@cs.cmu.edu alex@smola.org
More information4. A Physical Model for an Electron with Angular Momentum. An Electron in a Bohr Orbit. The Quantum Magnet Resulting from Orbital Motion.
4. A Physical Model for an Electron with Angular Momentum. An Electron in a Bohr Orbit. The Quantum Magnet Resulting from Orbital Motion. We now hae deeloped a ector model that allows the ready isualization
More informationML ESTIMATION AND CRB FOR NARROWBAND AR SIGNALS ON A SENSOR ARRAY
2014 IEEE International Conference on Acoustic, Speech and Signal Processing ICASSP ML ESTIMATION AND CRB FOR NARROWBAND AR SIGNALS ON A SENSOR ARRAY Langford B White School of Electrical and Electronic
More informationStatistical and Adaptive Signal Processing
r Statistical and Adaptive Signal Processing Spectral Estimation, Signal Modeling, Adaptive Filtering and Array Processing Dimitris G. Manolakis Massachusetts Institute of Technology Lincoln Laboratory
More informationA simple one sweep algorithm for optimal APP symbol decoding of linear block codes
A simple one sweep algorithm for optimal APP symbol decoding of linear block codes Johansson, Thomas; Zigangiro, Kamil Published in: IEEE Transactions on Information Theory DOI: 10.1109/18.737541 Published:
More informationState estimation of linear dynamic system with unknown input and uncertain observation using dynamic programming
Control and Cybernetics vol. 35 (2006) No. 4 State estimation of linear dynamic system with unknown input and uncertain observation using dynamic programming by Dariusz Janczak and Yuri Grishin Department
More informationarxiv: v1 [physics.comp-ph] 17 Jan 2014
An efficient method for soling a correlated multi-item inentory system Chang-Yong Lee and Dongu Lee The Department of Industrial & Systems Engineering, Kongu National Uniersity, Kongu 34-70 South Korea
More informationInformation-Preserving Transformations for Signal Parameter Estimation
866 IEEE SIGNAL PROCESSING LETTERS, VOL. 21, NO. 7, JULY 2014 Information-Preserving Transformations for Signal Parameter Estimation Manuel Stein, Mario Castañeda, Amine Mezghani, and Josef A. Nossek Abstract
More informationEvolution Analysis of Iterative LMMSE-APP Detection for Coded Linear System with Cyclic Prefixes
Eolution Analysis of Iteratie LMMSE-APP Detection for Coded Linear System with Cyclic Prefixes Xiaoun Yuan Qinghua Guo and Li Ping Member IEEE Department of Electronic Engineering City Uniersity of Hong
More informationinforms DOI /moor.xxxx.xxxx c 20xx INFORMS
MATHEMATICS OF OPERATIONS RESEARCH Vol. 00, No. 0, Xxxxxx 20xx, pp. xxx xxx ISSN 0364-765X EISSN 526-547 xx 0000 0xxx informs DOI 0.287/moor.xxxx.xxxx c 20xx INFORMS A Distributional Interpretation of
More informationA spectral Turán theorem
A spectral Turán theorem Fan Chung Abstract If all nonzero eigenalues of the (normalized) Laplacian of a graph G are close to, then G is t-turán in the sense that any subgraph of G containing no K t+ contains
More informationEstimation for Nonlinear Dynamical Systems over Packet-Dropping Networks
Estimation for Nonlinear Dynamical Systems over Packet-Dropping Networks Zhipu Jin Chih-Kai Ko and Richard M Murray Abstract Two approaches, the extended Kalman filter (EKF) and moving horizon estimation
More informationEquivalence Probability and Sparsity of Two Sparse Solutions in Sparse Representation
IEEE TRANSACTIONS ON NEURAL NETWORKS, VOL. 19, NO. 12, DECEMBER 2008 2009 Equivalence Probability and Sparsity of Two Sparse Solutions in Sparse Representation Yuanqing Li, Member, IEEE, Andrzej Cichocki,
More informationIntroduction p. 1 Fundamental Problems p. 2 Core of Fundamental Theory and General Mathematical Ideas p. 3 Classical Statistical Decision p.
Preface p. xiii Acknowledgment p. xix Introduction p. 1 Fundamental Problems p. 2 Core of Fundamental Theory and General Mathematical Ideas p. 3 Classical Statistical Decision p. 4 Bayes Decision p. 5
More informationInsights into Cross-validation
Noname manuscript No. (will be inserted by the editor) Insights into Cross-alidation Amit Dhurandhar Alin Dobra Receied: date / Accepted: date Abstract Cross-alidation is one of the most widely used techniques,
More informationSensitivity Considerations in Compressed Sensing
Sensitivity Considerations in Compressed Sensing Louis L. Scharf, 1 Edwin K. P. Chong, 1,2 Ali Pezeshki, 2 and J. Rockey Luo 2 1 Department of Mathematics, Colorado State University Fort Collins, CO 8523,
More informationNONLINEAR STATISTICAL SIGNAL PROCESSING: A PARTI- CLE FILTERING APPROACH
NONLINEAR STATISTICAL SIGNAL PROCESSING: A PARTI- CLE FILTERING APPROACH J. V. Candy (tsoftware@aol.com) University of California, Lawrence Livermore National Lab. & Santa Barbara Livermore CA 94551 USA
More informationReview of Matrices and Vectors 1/45
Reiew of Matrices and Vectors /45 /45 Definition of Vector: A collection of comple or real numbers, generally put in a column [ ] T "! Transpose + + + b a b a b b a a " " " b a b a Definition of Vector
More informationOptimal Strategies for Communication and Remote Estimation with an Energy Harvesting Sensor
1 Optimal Strategies for Communication and Remote Estimation with an Energy Harvesting Sensor A. Nayyar, T. Başar, D. Teneketzis and V. V. Veeravalli Abstract We consider a remote estimation problem with
More informationI. INTRODUCTION 338 IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS VOL. 33, NO. 1 JANUARY 1997
VI. CONCLUSION We have shown that dilution of precision terms for relative positioning, using double-difference processing of GPS satellite signals, are bounded by the corresponding dilution of precision
More informationMath 425 Lecture 1: Vectors in R 3, R n
Math 425 Lecture 1: Vectors in R 3, R n Motiating Questions, Problems 1. Find the coordinates of a regular tetrahedron with center at the origin and sides of length 1. 2. What is the angle between the
More informationParticle Filtering for Large-Dimensional State Spaces With Multimodal Observation Likelihoods Namrata Vaswani
IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. 56, NO. 10, OCTOBER 2008 4583 Particle Filtering for Large-Dimensional State Spaces With Multimodal Observation Likelihoods Namrata Vaswani Abstract We study
More informationMultiple Bits Distributed Moving Horizon State Estimation for Wireless Sensor Networks. Ji an Luo
Multiple Bits Distributed Moving Horizon State Estimation for Wireless Sensor Networks Ji an Luo 2008.6.6 Outline Background Problem Statement Main Results Simulation Study Conclusion Background Wireless
More informationOptimum Joint Detection and Estimation
20 IEEE International Symposium on Information Theory Proceedings Optimum Joint Detection and Estimation George V. Moustakides Department of Electrical and Computer Engineering University of Patras, 26500
More informationMarkov chain Monte Carlo methods for visual tracking
Markov chain Monte Carlo methods for visual tracking Ray Luo rluo@cory.eecs.berkeley.edu Department of Electrical Engineering and Computer Sciences University of California, Berkeley Berkeley, CA 94720
More informationSparse Bayesian Logistic Regression with Hierarchical Prior and Variational Inference
Sparse Bayesian Logistic Regression with Hierarchical Prior and Variational Inference Shunsuke Horii Waseda University s.horii@aoni.waseda.jp Abstract In this paper, we present a hierarchical model which
More informationAcoustic MIMO Signal Processing
Yiteng Huang Jacob Benesty Jingdong Chen Acoustic MIMO Signal Processing With 71 Figures Ö Springer Contents 1 Introduction 1 1.1 Acoustic MIMO Signal Processing 1 1.2 Organization of the Book 4 Part I
More informationBayesian Models in Machine Learning
Bayesian Models in Machine Learning Lukáš Burget Escuela de Ciencias Informáticas 2017 Buenos Aires, July 24-29 2017 Frequentist vs. Bayesian Frequentist point of view: Probability is the frequency of
More informationRate of Convergence of Learning in Social Networks
Rate of Convergence of Learning in Social Networks Ilan Lobel, Daron Acemoglu, Munther Dahleh and Asuman Ozdaglar Massachusetts Institute of Technology Cambridge, MA Abstract We study the rate of convergence
More informationEconometrics II - EXAM Outline Solutions All questions have 25pts Answer each question in separate sheets
Econometrics II - EXAM Outline Solutions All questions hae 5pts Answer each question in separate sheets. Consider the two linear simultaneous equations G with two exogeneous ariables K, y γ + y γ + x δ
More informationNon-Surjective Finite Alphabet Iterative Decoders
IEEE ICC 2016 - Communications Theory Non-Surjectie Finite Alphabet Iteratie Decoders Thien Truong Nguyen-Ly, Khoa Le,V.Sain, D. Declercq, F. Ghaffari and O. Boncalo CEA-LETI, MINATEC Campus, Grenoble,
More information