Conditional Approximate Message Passing with Side Information

Dror Baron, Dept. of Electrical & Comp. Eng., NC State University; Anna Ma, Inst. of Math. Sciences, Claremont Graduate University; Deanna Needell, Dept. of Mathematics, UCLA; Cynthia Rush, Dept. of Statistics, Columbia University; Tina Woolf, Inst. of Math. Sciences, Claremont Graduate University

Abstract—In information theory, side information (SI) is often used to increase the efficiency of communication systems. This work lays the framework for a class of Bayes-optimal signal recovery algorithms, referred to as conditional approximate message passing (CAMP), that make use of available SI. CAMP involves a linear inverse problem, where noisy linear measurements acquire an unknown input vector using a measurement matrix with independent and identically distributed entries, and the SI vector obeys a symbol-wise dependence with the input. Despite having a simple and straightforward derivation, our CAMP algorithm obtains lower mean squared error than other signal recovery algorithms that have been proposed to incorporate SI. The good performance of CAMP is due to its Bayes-optimality properties, which are not present in previous approaches to SI-aided signal recovery.

I. INTRODUCTION

The core focus of research in many disciplines is on accurately recovering a high-dimensional, unknown input signal from a limited number of noisy linear measurements by exploiting probabilistic characteristics and structure of the input. We consider the following model for this task. For an unknown input signal x ∈ R^N,

y = Ax + z,    (1)

where y ∈ R^M are the noisy measurements, A ∈ R^{M×N} is the measurement matrix, and z ~ N(0, σ_z² I) is measurement noise. The objective of signal recovery is to recover or estimate x from knowledge of only y and A, and possible statistical knowledge about x and z.

In information theory [4], when separate communication systems share side information (SI) in the form of additional information about the signal, overall communication often becomes more efficient. For example, three-dimensional (3D) video acquisition could be performed by acquiring each frame of video, which is a 2D image, independently of other frames using a single pixel camera [13]. While recovering the current frame, one is likely simultaneously recovering adjacent frames, which can be used as SI. Another example is channel estimation in wireless systems, where the channel structure often varies slowly and the estimate from the previous time batch can be used as SI for the current batch. Our recovery algorithm has access to SI, allowing for improvements in recovery quality.

Approximate message passing, or AMP [6], [7], [10], is a low-complexity algorithmic framework for efficiently solving high-dimensional regression tasks (1). AMP algorithms are derived as Gaussian or quadratic approximations of loopy belief propagation algorithms (e.g., min-sum, sum-product) on the dense factor graph corresponding to (1). AMP has the attractive feature that, under suitable conditions on A and x, its performance can be tracked accurately with a scalar iteration called state evolution (SE) [1], [12]. In particular, performance measures such as the ℓ1- or ℓ2-error in the algorithm's iterations concentrate to constants predicted by SE. AMP algorithms work by iteratively updating estimates of the unknown signal. These AMP updates rely on appropriately-chosen denoisers, which reduce the noise in each iteration. Assuming that the measurement matrix A has independent and identically distributed (i.i.d.) N(0, 1/M) entries and the entries of the signal x are i.i.d. ~ f(x), where f(x) is the probability density function (pdf) of the signal, one useful feature of AMP is that the input to the denoiser at any iteration t is approximately equal to the true signal x plus i.i.d. Gaussian noise with variance λ_t, a constant value given by the SE equations. Owing to these favorable properties, η_t in iteration t is often the minimum mean squared error (MMSE) denoiser based on the pdf of x:

η_t(a) = E[X | X + √λ_t Z = a],    (2)

where Z ~ N(0, 1), and X ~ f(x) is a random variable (RV) whose pdf is the same as that of x.

In this paper, we set the foundations for a novel Bayes-optimal algorithmic framework that uses AMP for signal recovery from noisy linear measurements where SI is available. In our problem setting, the matrix A is i.i.d. Gaussian, and the SI takes the form of a sequence x̃ ∈ R^N, with each symbol of x̃ statistically dependent on the corresponding symbol of x through some joint pdf f(x, x̃). While other types of dependence between x and x̃ are also possible, we leave these for future

research. For such symbol-by-symbol dependencies, we propose a conditional denoiser,

η_t(a, b) = E[X | X + √λ_t Z₁ = a, X̃ = b],    (3)

which provides the MMSE estimate of the signal when SI is available to the denoiser. Applying this conditional denoiser (3) within AMP, we obtain our conditional approximate message passing (CAMP) approach.

Although our conditional denoiser (3) can be applied to various linear regression tasks, in this paper we focus on underdetermined systems, i.e., M < N, where x is approximately K-sparse (meaning that x has K nonzero entries on average) for K ≪ N. This framework has been extensively studied in the compressed sensing literature [2], [5]. We show that despite having a simple and straightforward derivation, our CAMP algorithm outperforms other signal recovery algorithms that have been proposed to incorporate SI. This improved recovery performance is likely due to the Bayes-optimal nature of CAMP, which is discussed in Section II-B.

Paper Outline: In Section II we introduce the CAMP algorithm and lay the framework for its theoretical analysis in the case of Gaussian measurement noise and SI with additive white Gaussian noise (AWGN). While our analysis in this paper focuses on the case where the SI is the input with AWGN for two specific signal priors, CAMP can be used in any setting where a conditional denoiser (3) can be employed within AMP iterations. In Section III we study two example pdfs for our input signal x, the nearly least favorable signal and the Bernoulli-Gaussian signal. For these examples we demonstrate how one derives the conditional denoiser (3) for SI with AWGN. Finally, Section IV presents numerical results demonstrating the good performance of CAMP for the two examples compared to other signal recovery algorithms making use of SI.

Comparison to previous work: While integrating SI into signal recovery algorithms is not new [3], [8], [9], [11], [15], CAMP proposes a unified framework within AMP for tackling this problem that supports any per-symbol dependence between the SI and signal pair. Prior work using SI has been either heuristic, limited in focus to specific applications, or outside the AMP framework. For example, Wang and Liang [15] integrate SI into AMP for a specific signal prior density, but the method lacks Bayes-optimality properties and is difficult to generalize. The algorithmic framework we lay out in Section II overcomes these limitations.

II. CONDITIONAL AMP FRAMEWORK

A. Conditional AMP introduction

AMP algorithm: The standard AMP algorithm [6] iteratively updates estimates of the unknown input signal, with x^t ∈ R^N being the estimate at iteration t. The algorithm is given by the following set of updates. Assume x^0 = 0, the all-zero vector, and update for t ≥ 0:

r^t = y − Ax^t + (r^{t−1}/δ) ⟨η'_{t−1}(x^{t−1} + A^T r^{t−1})⟩,    (4)
x^{t+1} = η_t(x^t + A^T r^t).    (5)

Note that η_t : R → R is the denoising function at the t-th iteration and δ = M/N is the measurement rate. The denoising function acts element-wise on its input and has derivative η'_t(w) = ∂η_t(w)/∂w. Moreover, ⟨w⟩ = (1/N) Σ_{i=1}^N w_i is the empirical mean, where w ∈ R^N.

State evolution (SE): As noted earlier, it has been shown [1], [12] that the pseudo-data, x^t + A^T r^t, which is the input to the denoiser η_t(·), is approximately equal in distribution to the true input x plus independent Gaussian noise, i.e., x + √λ_t Z, where Z ~ N(0, I) ∈ R^N, and the noise variance λ_t can be predicted by SE, introduced below. These favorable statistical properties of the pseudo-data are due to the presence of the Onsager term, (r^{t−1}/δ) ⟨η'_{t−1}(x^{t−1} + A^T r^{t−1})⟩, used in the residual step (4) of the AMP updates. Let λ_0 = σ_z² + E[X²]/δ and, for t > 0,

λ_t = σ_z² + (1/δ) E[(η_{t−1}(X + √λ_{t−1} Z) − X)²],    (6)

where X ~ f(x) is independent of Z ~ N(0, 1). CAMP, introduced below, integrates SI into AMP.
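To make the updates (4)-(5) and the role of the Onsager term concrete, the following is a minimal NumPy sketch of a standard AMP loop. It is illustrative only: the soft-threshold denoiser is a stand-in for the MMSE denoiser (2), and the function names and the empirical estimate of λ_t are our own choices rather than anything specified in the paper.

```python
import numpy as np

def soft_threshold(w, theta):
    """Placeholder element-wise denoiser eta_t (not the MMSE denoiser of (2))."""
    return np.sign(w) * np.maximum(np.abs(w) - theta, 0.0)

def soft_threshold_deriv(w, theta):
    """Element-wise derivative eta_t'(w), needed for the Onsager term."""
    return (np.abs(w) > theta).astype(float)

def amp(y, A, num_iters=30):
    """Standard AMP updates (4)-(5): x^{t+1} = eta_t(x^t + A^T r^t)."""
    M, N = A.shape
    delta = M / N                      # measurement rate delta = M/N
    x = np.zeros(N)                    # x^0 = 0
    r = np.copy(y)                     # r^0 = y - A x^0
    lam = np.var(y)                    # crude initial stand-in for lambda_0
    for _ in range(num_iters):
        pseudo_data = x + A.T @ r      # approx. x + sqrt(lambda_t) * Z by SE
        theta = np.sqrt(lam)           # threshold tied to the noise level (heuristic)
        x_new = soft_threshold(pseudo_data, theta)
        onsager = (r / delta) * np.mean(soft_threshold_deriv(pseudo_data, theta))
        r = y - A @ x_new + onsager    # residual step (4) with the Onsager correction
        x = x_new
        lam = np.mean(r ** 2)          # empirical proxy for the SE variance lambda_t
    return x
```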
CAMP for SI with AWGN: Conditional AMP, or CAMP, uses the same updates as in (4)-(5), but incorporates SI using the conditional denoiser given in (3). In the following, we discuss the proposed conditional denoiser (3) for two sparse pdfs, where it is assumed that the signal, X, and SI, X̃, are related by

X̃ = X + N(0, σ² I),    (7)

which follows the model presented in [15]. While this model of SI is quite limited, it is practical for an initial investigation of the CAMP framework. It should be noted that CAMP easily supports other SI models. In what follows, we use lower case letters to denote realizations of the RVs denoted by capital letters. Under the assumption (7), the CAMP denoiser takes the form

η_t(a, b) = E[X | X + √λ_t Z₁ = a, X + σZ₂ = b],    (8)

for independent standard Gaussian RVs Z₁ and Z₂, where λ_t is given by the SE equations for CAMP: let λ_0 = σ_z² + E[X²]/δ and, for t > 0,

λ_t = σ_z² + (1/δ) E[(η_{t−1}(X + √λ_{t−1} Z₁, X + σZ₂) − X)²],    (9)

where X ~ f(x) is independent of Z₁ and Z₂.
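Algorithmically, CAMP is the same loop as above with the conditional denoiser (8) in place of η_t; the SI vector x̃ is simply passed to the denoiser alongside the pseudo-data. A hedged sketch, assuming user-supplied functions `cond_denoiser` and `cond_denoiser_deriv` (hypothetical names) that evaluate (8) and its derivative with respect to the first argument for a given prior:

```python
import numpy as np

def camp(y, A, x_tilde, sigma2, cond_denoiser, cond_denoiser_deriv, num_iters=30):
    """CAMP: AMP updates (4)-(5) with the conditional denoiser (8).

    cond_denoiser(a, b, lam, sigma2) should return, element-wise,
    E[X | X + sqrt(lam) Z1 = a, X + sigma Z2 = b], and cond_denoiser_deriv
    its derivative with respect to a.
    """
    M, N = A.shape
    delta = M / N
    x = np.zeros(N)
    r = np.copy(y)
    lam = np.var(y)                              # stand-in for lambda_0 = sigma_z^2 + E[X^2]/delta
    for _ in range(num_iters):
        a = x + A.T @ r                          # pseudo-data, approx. x + sqrt(lambda_t) Z1
        x_new = cond_denoiser(a, x_tilde, lam, sigma2)
        onsager = (r / delta) * np.mean(cond_denoiser_deriv(a, x_tilde, lam, sigma2))
        r = y - A @ x_new + onsager
        x = x_new
        lam = np.mean(r ** 2)                    # empirical proxy for the CAMP SE variance (9)
    return x
```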

B. Bayes-optimality properties of CAMP

When the SI is the true input with AWGN, as in (7), we will show that the fixed points of the CAMP SE (9) coincide with the fixed points of the standard AMP SE (6) with effective measurement rate δ_eff = αδ and effective measurement noise variance σ_eff² = σ_z²/α, where the enhancement factor α ≥ 1 depends on the pdf of the signal and the SI noise variance σ² given in (7). The enhancement factor implies that for SI with AWGN, the CAMP algorithm emulates standard (no SI) signal recovery with more measurements and reduced measurement noise variance in the linear regression problem (1).

Before demonstrating the aforementioned Bayes-optimality property of CAMP, we use matched filter arguments to provide a simplified representation of the conditional denoiser of (3) for SI with AWGN. In calculating the CAMP denoiser (3), we want to estimate X from two noisy observations, the pseudo-data, X + √λ_t Z₁ = a, and the SI, X + σZ₂ = b, where Z₁ and Z₂ are standard Gaussian RVs. We define signal and noise vectors as s = [1 1]^T and v = [√λ_t Z₁  σZ₂]^T, respectively, where [·]^T is the transpose operator. The matched filter estimates the unknown X by computing the inner product between

[a  b]^T = [X + √λ_t Z₁  X + σZ₂]^T = sX + v,

and a matched filter h ∈ R². An optimal h that maximizes the signal-to-noise ratio, normalized so that h^T s = 1, is computed by inverting R_v = E[vv^T], the autocovariance matrix of v, and is given by h = (R_v)^{−1} s / (s^T (R_v)^{−1} s). It can be shown that h = [σ²  λ_t]^T / (σ² + λ_t), and the inner product is defined as μ_t = μ_t(a, b):

μ_t(a, b) = ⟨[a  b]^T, h⟩ = (aσ² + bλ_t)/(σ² + λ_t).    (10)

Note that

μ_t(X + √λ_t Z₁, X + σZ₂) = ((X + √λ_t Z₁)σ² + (X + σZ₂)λ_t)/(σ² + λ_t) =_d X + τ_t Z,

where Z is standard Gaussian, =_d denotes equality in distribution, and the variance term (τ_t)² is

(τ_t)² = (λ_t(σ²)² + σ²(λ_t)²)/(σ² + λ_t)² = σ²λ_t/(σ² + λ_t).    (11)

The above provides us with the following simplification of the CAMP denoiser (8) for SI with AWGN,

η_t(a, b) = E[X | X + τ_t Z = μ_t],    (12)

where μ_t = μ_t(a, b) and τ_t are defined in (10) and (11). We note that μ_t is a function of (a, b), but for brevity we drop this dependence in the following.

Using the representation in (12), we analyze the SE equations in (9). Considering (9) and (12), note that

η_t(X + √λ_t Z₁, X + σZ₂) = E[X | X + τ_t Z].    (13)

We can therefore simplify the SE equations given in (9) using (13) and the definition of τ_t in (11). Let λ_0 = σ_z² + E[X²]/δ and, for t > 0,

λ_t = σ_z² + (1/δ) E[(E[X | X + √(σ²λ_{t−1}/(σ² + λ_{t−1})) Z] − X)²].    (14)

The results in (12) and (14) provide a simplified way to calculate the conditional denoiser of (3) and the SE when the signal and the SI are related through Gaussian noise. Moreover, at the stationary point of (14) we have

λ = σ_z² + (1/δ) E[(E[X | X + √(σ²λ/(σ² + λ)) Z] − X)²],    (15)

where λ is the scalar channel variance. Comparing (6) (SE without SI) and (15), we denote the variance in the conditional expectation by λ̄ = σ²λ/(σ² + λ). Note that 0 ≤ λ̄ ≤ λ, because σ²/(σ² + λ) ≤ 1, and we can rewrite the λ̄ above as

λ̄ = (λ̄/λ)σ_z² + (1/δ)(λ̄/λ) E[(E[X | X + √λ̄ Z] − X)²].    (16)

We therefore see that the CAMP SE (9) has fixed points coinciding with the fixed points of the standard AMP SE (6) with effective measurement rate δ_eff = δ(σ² + λ)/σ² and effective measurement noise variance σ_eff² = (σ²/(σ² + λ))σ_z², where σ² is the noise variance in the SI given in (7), λ is the stationary point of (14), and the enhancement factor is α = (σ² + λ)/σ². This effective change in δ and σ_z² implies that the incorporation of SI with AWGN via the CAMP algorithm gives us signal recovery for a standard (no SI) linear regression problem (1) with more measurements and/or reduced measurement noise variance than our own, and the effect becomes more pronounced, i.e., the enhancement factor α increases, as the noise variance in the SI, σ², gets small.
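The reduction in (10)-(11) and the resulting effective parameters are straightforward to compute numerically. The helper functions below (our own names, not from the paper) mirror those formulas: `combine_observations` implements the matched-filter combination, and `effective_parameters` returns δ_eff and σ_eff² for a given fixed point λ.

```python
import numpy as np

def combine_observations(a, b, lam, sigma2):
    """Matched-filter combination (10)-(11): returns (mu, tau2), where
    mu =_d X + tau*Z and tau2 = sigma2*lam/(sigma2 + lam)."""
    mu = (a * sigma2 + b * lam) / (sigma2 + lam)
    tau2 = sigma2 * lam / (sigma2 + lam)
    return mu, tau2

def effective_parameters(delta, sigma_z2, lam_star, sigma2):
    """Effective rate and noise variance implied by the fixed-point analysis:
    alpha = (sigma2 + lam_star)/sigma2, delta_eff = alpha*delta,
    sigma_eff^2 = sigma_z^2/alpha."""
    alpha = (sigma2 + lam_star) / sigma2
    return alpha * delta, sigma_z2 / alpha
```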
The above analysis relies on the fact that, for the conditional expectation denoiser in standard (no SI) AMP (4)-(5), the corresponding SE equation (6) in its convergent states coincides with Tanaka's fixed point equation [14], ensuring that if AMP runs until it converges, the result provides the best possible mean square error (MSE) achieved by any algorithm under certain conditions. (These conditions on δ and ε, while outside the scope of this paper, ensure that there is a single solution to Tanaka's fixed point equation, since multiple solutions may create a disparity between the MSE of AMP and

the MMSE [7], [16], implying that CAMP might be sub-optimal in such cases.) However, the above analysis relies heavily on the Gaussianity of the SI noise, and its generalization is left for future work.

III. TWO EXAMPLE CAMP DERIVATIONS

Having discussed our CAMP estimation framework, we derive the CAMP denoiser functions of (8) for two pdfs for the signal.

A. Nearly least favorable signal

The first is the nearly least favorable (NLF) signal, as studied in [15]. The NLF signal x is entry-wise i.i.d., where each individual entry x_i of x obeys

X_i ~ (1 − ε)δ_0 + (ε/2)δ_h + (ε/2)δ_{−h},    (17)

where δ_u is a Dirac delta function at u, h is a constant that can be computed as in [15], and ε = K/N is the expected sparsity rate. With such a prior, the elements of x take only three values: 0 with probability 1 − ε and ±h, each with probability ε/2. Considering (12), we can compute the denoiser easily via Bayes' rule using the values μ_t and (τ_t)² defined in (10) and (11),

η_t(a, b) = E[X | X + τ_t Z = μ_t]
          = h P(X = h | X + τ_t Z = μ_t) − h P(X = −h | X + τ_t Z = μ_t)
          = [h(ε/2) f_Z((μ_t − h)/τ_t) − h(ε/2) f_Z((μ_t + h)/τ_t)] / [(ε/2) f_Z((μ_t + h)/τ_t) + (ε/2) f_Z((μ_t − h)/τ_t) + (1 − ε) f_Z(μ_t/τ_t)],

where f_Z represents the standard Gaussian pdf, i.e., f_Z(z) = (1/√(2π)) exp(−z²/2). Then for the NLF signal, using the above derivation for the denoiser, we can easily find the SE using the result of (14).

B. Bernoulli-Gaussian signal

Next, we derive the CAMP denoiser (8) for a Bernoulli-Gaussian (BG) signal pdf,

X_i ~ ε (1/√(2π)) exp(−x_i²/2) + (1 − ε)δ_0,    (18)

meaning that each element of the signal takes the value 0 with probability 1 − ε, and otherwise is sampled from a standard Gaussian pdf. Again considering (12), we can compute the denoiser easily via Bayes' rule,

η_t(a, b) = E[X | X + τ_t Z = μ_t] = P_t E[X | X + τ_t Z = μ_t, X ≠ 0].    (19)

In the above, P_t = P(X ≠ 0 | X + τ_t Z = μ_t) is the probability that the individual entry being denoised is nonzero, and it can be computed using Bayes' rule:

P_t = [(ε/√(1 + (τ_t)²)) f_Z(μ_t/√(1 + (τ_t)²))] / [((1 − ε)/τ_t) f_Z(μ_t/τ_t) + (ε/√(1 + (τ_t)²)) f_Z(μ_t/√(1 + (τ_t)²))].

We can simplify the expectation in (19) as follows:

E[X | X + τ_t Z = μ_t, X ≠ 0] = ∫ x f_Z((μ_t − x)/τ_t) f_{X|X≠0}(x) dx / ∫ f_Z((μ_t − x)/τ_t) f_{X|X≠0}(x) dx = E[Z̃],

where Z̃ ~ N(μ_t/(1 + (τ_t)²), (τ_t)²/(1 + (τ_t)²)). Plugging the above into (19), using the definitions in (10) and (11), we see that

η_t(a, b) = P_t (aσ² + bλ_t)/(σ² + λ_t + σ²λ_t).    (20)

The second term in (20) can be interpreted as a Wiener filter based on a and b. Then for the BG signal, using (20), we can easily find the SE using the result of (14).

IV. NUMERICAL RESULTS

We now present simulation results for CAMP using our proposed conditional denoiser, and compare to the generalized elastic net prior (GENP) AMP algorithm of Wang and Liang [15], a previously proposed AMP algorithm for incorporating SI. The algorithm developed by Wang and Liang uses a (sub-optimal) soft-threshold denoiser.

In our first experiment, we utilize the same setup as in [15]. The signal, generated using the NLF distribution (17), has dimension N = 2,000. The entries of the measurement matrix A are i.i.d. Gaussian, with unit norm columns, and δ = M/N = 0.1. We set ρ = ε/δ = K/M as in [15]. The measurement noise z in (1) has standard Gaussian entries, and the Gaussian noise in x̃ is N(0, σ²), where σ² ∈ {1, 2, 4}. For each of 20 trials, we calculate the MSE as (1/N)‖x̂ − x‖², where x̂ is the output of the corresponding algorithm; the reported result is the average MSE over all 20 trials. The results in Table I show that the MSE of the proposed approach is lower than that of GENP-AMP in each case.

TABLE I: MSE of GENP-AMP and the proposed method for NLF signals, δ = 0.1. (Columns: ρ, MSE (GENP-AMP), MSE (Proposed).)
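For concreteness, the BG conditional denoiser (19)-(20) can be coded directly from μ_t, (τ_t)², P_t, and the Wiener-filter factor. The sketch below is our own rendering of that derivation, not the authors' code; the two scaled Gaussian densities correspond to the variances (τ_t)² under X = 0 and 1 + (τ_t)² under X ≠ 0. With the earlier `camp` sketch, this function (together with a numerical derivative) could serve as the `cond_denoiser` argument.

```python
import numpy as np

def bg_cond_denoiser(a, b, lam, sigma2, eps):
    """Conditional MMSE denoiser (20) for the Bernoulli-Gaussian prior (18)."""
    mu = (a * sigma2 + b * lam) / (sigma2 + lam)        # matched-filter output (10)
    tau2 = sigma2 * lam / (sigma2 + lam)                # its noise variance (11)
    # Gaussian densities of mu under X = 0 (variance tau2) and X != 0 (variance 1 + tau2).
    pdf_zero = np.exp(-mu**2 / (2 * tau2)) / np.sqrt(2 * np.pi * tau2)
    pdf_nonzero = np.exp(-mu**2 / (2 * (1 + tau2))) / np.sqrt(2 * np.pi * (1 + tau2))
    p_nonzero = eps * pdf_nonzero / ((1 - eps) * pdf_zero + eps * pdf_nonzero)   # P_t
    wiener = (a * sigma2 + b * lam) / (sigma2 + lam + sigma2 * lam)  # second factor in (20)
    return p_nonzero * wiener
```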

Fig. 1: Empirical MSE performance of CAMP vs. the SE prediction. (N = 5,000, M = 2,500, K = 750, σ² = 0.1, and σ_z² = 0.01.)

Next, we repeat the same experiment for the BG signal defined in (18). Results for GENP-AMP and the proposed approach are given in Table II. Again, CAMP outperforms GENP-AMP in each case.

Finally, Figure 1 compares the empirical MSE performance of CAMP and the performance predicted using SE. To do so, we averaged over 20 trials of a BG recovery problem where N = 5,000, M = 2,500, K = 750, σ² = 0.1, and σ_z² = 0.01. It can be seen that the empirical MSE tracks that predicted by SE well.

TABLE II: MSE of GENP-AMP and the proposed method for BG signals, δ = 0.1. (Columns: ρ, MSE (GENP-AMP), MSE (Proposed).)

REFERENCES

[1] M. Bayati and A. Montanari. The dynamics of message passing on dense graphs, with applications to compressed sensing. IEEE Trans. Inf. Theory, 57(2):764–785, Feb. 2011.
[2] E. Candès. Compressive sampling. In Proc. Int. Congress of Mathematicians, Madrid, Spain, August 2006.
[3] M. Chen, F. Renna, and M. Rodrigues. On the design of linear projections for compressive sensing with side information. In Proc. IEEE Int. Symp. Inf. Theory (ISIT), Barcelona, Spain, July 2016.
[4] T. Cover and J. Thomas. Elements of Information Theory. New York, NY, USA: Wiley-Interscience, 2006.
[5] D. Donoho. Compressed sensing. IEEE Trans. Inf. Theory, 52(4):1289–1306, 2006.
[6] D. Donoho, A. Maleki, and A. Montanari. Message passing algorithms for compressed sensing. Proc. Natl. Acad. Sci., 106(45):18914–18919, 2009.
[7] F. Krzakala, M. Mézard, F. Sausset, Y. Sun, and L. Zdeborová. Probabilistic reconstruction in compressed sensing: Algorithms, phase diagrams, and threshold achieving matrices. J. Stat. Mech. Theory Exp., 2012(08):P08009, Aug. 2012.
[8] H. Van Luong, J. Seiler, A. Kaup, S. Forchhammer, and N. Deligiannis. Measurement bounds for sparse signal reconstruction with multiple side information. ArXiv preprint, Jan. 2017.
[9] J. Mota, N. Deligiannis, and M. Rodrigues. Compressed sensing with prior information: strategies, geometry, and bounds. IEEE Trans. Inf. Theory, 63(7), 2017.
[10] S. Rangan. Generalized approximate message passing for estimation with random linear mixing. ArXiv preprint, Oct. 2010.
[11] F. Renna, L. Wang, X. Yuan, J. Yang, G. Reeves, A. Calderbank, L. Carin, and M. Rodrigues. Classification and reconstruction of high-dimensional signals from low-dimensional features in the presence of side information. IEEE Trans. Inf. Theory, 62(11), 2016.
[12] C. Rush and R. Venkataramanan. Finite sample analysis of approximate message passing. In Proc. IEEE Int. Symp. Inf. Theory, June 2015. Full version: https://arxiv.org/abs/
[13] D. Takhar, J. Laska, M. Wakin, M. Duarte, D. Baron, S. Sarvotham, K. Kelly, and R. Baraniuk. A new compressive imaging camera architecture using optical-domain compression. In Electronic Imaging 2006. International Society for Optics and Photonics, 2006.
[14] T. Tanaka. A statistical-mechanics approach to large-system analysis of CDMA multiuser detectors. IEEE Trans. Inf. Theory, 48(11):2888–2910, Nov. 2002.
[15] X. Wang and J. Liang. Approximate message passing-based compressed sensing reconstruction with generalized elastic net prior. Signal Processing: Image Communication, 37:19–33, 2015.
[16] J. Zhu and D. Baron. Performance regions in compressed sensing from noisy measurements. In Proc. Conf. Inf. Sciences and Systems, pages 1–6. IEEE, 2013.

ACKNOWLEDGMENTS

The authors thank Junan Zhu for insightful conversation, Yanting Ma for having the insight that led to the representation of the CAMP denoiser given in Section II-B, specifically that of (16), and Joe Zhou for helping us improve our manuscript.
In addition, Needell acknowledges support from NSF CAREER #1348721 and NSF BIGDATA #1740325, and Baron acknowledges support from an NSF EECS grant.
