Clock Synchronization in WSN: from Traditional Estimation Theory to Distributed Signal Processing
1 Clock Synchronization in WSN: from Traditional Estimation Theory to Distributed Signal Processing. Yik-Chung WU, The University of Hong Kong. Email: ycwu@eee.hku.hk, Webpage:
2 Applications require clock synchronization: event detection, data fusion, sleep and wake-up cycles for power management, TDD transmission schedules.
3 Challenges? In the ideal case, a node simply tells its neighbor "My time is 4:28:12pm". In reality, delays exist: the delay between the radio chip interrupt and the CPU responding, the time for the radio chip to transform the message into an EM wave, the time for converting the received EM wave back into the original message, and finally the signal to the CPU that reception is completed.
4 System model. Node A has the ideal clock (slope = 1); node B's clock has slope α. Clock model: c(t) = αt + θ. Two-way message exchange is used to establish the clock relationship between two nodes: A sends at t_{1,n}, B receives at t_{2,n}; B replies at t_{3,n}, A receives at t_{4,n}. In real time,

real_time_{2,n} = real_time_{1,n} + d + w_{1,n}
real_time_{3,n} = real_time_{4,n} − d − w_{2,n}

where d is the fixed portion of the delay and w_{1,n}, w_{2,n} are the random portions. In clock readings these become

(1/α)[c_B(t_{2,n}) − θ] = c_A(t_{1,n}) + d + w_{1,n}
(1/α)[c_B(t_{3,n}) − θ] = c_A(t_{4,n}) − d − w_{2,n}
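The two-way exchange above is easy to simulate. The sketch below (my own function and variable names, not from the slides) generates the four time-stamp sequences under the clock model c_B(t) = αt + θ, with fixed delay d and Gaussian random delays:

```python
import numpy as np

def two_way_exchange(alpha, theta, d, sigma_w, N, rng):
    """Simulate N rounds of two-way time-stamp exchange.

    Node A is the reference (ideal clock); node B reads
    c_B(t) = alpha * t + theta.  d is the fixed delay and
    w ~ N(0, sigma_w^2) are the random portions of the delay.
    Returns (t1, c2, c3, t4): t1, t4 read on A's clock,
    c2, c3 read on B's clock.
    """
    t1_real = np.arange(1.0, N + 1.0)                           # A sends
    t2_real = t1_real + d + sigma_w * rng.standard_normal(N)    # B receives
    t3_real = t2_real + 0.5                                     # B replies after a pause
    t4_real = t3_real + d + sigma_w * rng.standard_normal(N)    # A receives
    clock_B = lambda t: alpha * t + theta
    return t1_real, clock_B(t2_real), clock_B(t3_real), t4_real

rng = np.random.default_rng(0)
t1, c2, c3, t4 = two_way_exchange(alpha=1.01, theta=0.5, d=0.2,
                                  sigma_w=0.0, N=5, rng=rng)
# With zero random delay, (c2 - theta)/alpha - t1 equals d = 0.2 exactly.
print((c2 - 0.5) / 1.01 - t1)
```

With sigma_w = 0 the uplink and downlink relations hold with equality, which is a convenient sanity check before adding noise.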
5 Pairwise Synchronization: Gaussian case [1]. Synchronize node 2 to node 1 (assume node 1 is the reference):

(1/α)[c_2(t_{2,n}) − θ] = c_1(t_{1,n}) + d + w_{1,n}
(1/α)[c_2(t_{3,n}) − θ] = c_1(t_{4,n}) − d − w_{2,n}

Approach 1: MLE. Assume N rounds of time-stamp exchange; stacking the 2N equations gives the linear model

t = Tθ + d·g + z

where t collects the node-1 time-stamps c_1(t_{1,n}) and c_1(t_{4,n}), the rows of T are [c_2(t_{2,n}), −1] and [c_2(t_{3,n}), −1], θ = [1/α, θ/α]^T, g is a known ±1 sign vector, and z collects the Gaussian random delays.
6 The log-likelihood function ln f(t | T, θ, d) is quadratic in θ, so θ is linear in the data and can be easily estimated:

θ̂(d) = (T^H T)^{-1} T^H (t − d·g)

Putting this back into the log-likelihood function, d can be obtained by maximizing the resulting profile likelihood. Differentiating this function w.r.t. d and setting it to zero gives

d̂ = g^H P t / (g^H P g),  where  P = I − T(T^H T)^{-1} T^H

Finally, the clock parameters are recovered from

α̂ = 1/[θ̂(d̂)]_1,  θ̂ = [θ̂(d̂)]_2 / [θ̂(d̂)]_1
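The profile-likelihood steps above can be sketched in a few lines. The stacking convention below (sign of g, row ordering) is my own assumption, chosen so that it is consistent with the two exchange equations:

```python
import numpy as np

def gaussian_mle(t1, c2, c3, t4):
    """Profile-likelihood estimator of (alpha, theta, d) for the
    Gaussian two-way exchange model.  Stacking convention (an
    assumption of this sketch): t = T @ [1/alpha, theta/alpha] + d*g + z,
    with g = -1 on uplink rows and +1 on downlink rows."""
    N = len(t1)
    t = np.concatenate([t1, t4])                    # reference-node stamps
    T = np.column_stack([np.concatenate([c2, c3]),
                         -np.ones(2 * N)])          # rows [c_2 stamp, -1]
    g = np.concatenate([-np.ones(N), np.ones(N)])
    P = np.eye(2 * N) - T @ np.linalg.solve(T.T @ T, T.T)   # projector onto null(T^H)
    d_hat = (g @ P @ t) / (g @ P @ g)                       # profile max over d
    th = np.linalg.solve(T.T @ T, T.T @ (t - d_hat * g))    # theta_hat(d_hat)
    return 1.0 / th[0], th[1] / th[0], d_hat                # alpha, theta, d
```

On noiseless time-stamps the estimator recovers (α, θ, d) exactly, since the residual of the stacked linear model is then zero.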
7 Approach 2: Low complexity estimator. d appears in both system model equations, so we can eliminate it by adding the two equations:

(1/α)[c_2(t_{2,n}) + c_2(t_{3,n})] − 2θ/α = c_1(t_{1,n}) + c_1(t_{4,n}) + w_{1,n} − w_{2,n}

Putting the N rounds of messages into vector form gives t' = T'θ + z', where the rows of T' are [c_2(t_{2,n}) + c_2(t_{3,n}), −2] and t' collects c_1(t_{1,n}) + c_1(t_{4,n}). The MLE for this equation is

θ̂ = (T'^H T')^{-1} T'^H t'

This estimator is of lower complexity since there is no need to compute d. But since we are not estimating the unknowns from the original equations, there may be some loss in performance.
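The d-free estimator is a plain two-parameter least-squares fit. A minimal sketch (function name mine):

```python
import numpy as np

def low_complexity_estimate(t1, c2, c3, t4):
    """Low-complexity estimator: adding the two exchange equations
    eliminates d, leaving a 2-parameter least-squares fit of
    [1/alpha, theta/alpha].  Returns (alpha_hat, theta_hat)."""
    Tp = np.column_stack([c2 + c3, -2.0 * np.ones(len(t1))])
    tp = t1 + t4
    th = np.linalg.lstsq(Tp, tp, rcond=None)[0]   # [1/alpha, theta/alpha]
    return 1.0 / th[0], th[1] / th[0]
```

Note the fixed delay d cancels when the uplink and downlink equations are summed, which is why it never appears in the fit.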
8 Comparison. We have shown theoretically that the relative loss of the performance bound of the low complexity estimator w.r.t. the CRB is less than 1%.
9 Pairwise synchronization under exponential delays [2][3]. Approach 1: MLE. Revisit the message exchange equations (with β = 1/α, φ = θ/α):

β c_2(t_{2,n}) − φ − c_1(t_{1,n}) − d = w_{1,n}
c_1(t_{4,n}) − β c_2(t_{3,n}) + φ − d = w_{2,n}

If w_{1,n} and w_{2,n} are i.i.d. exponential R.V.s with rate λ, the likelihood function is

f ∝ λ^{2N} exp( −λ Σ_{n=1}^{N} [ β(c_2(t_{2,n}) − c_2(t_{3,n})) + (c_1(t_{4,n}) − c_1(t_{1,n})) − 2d ] )
  × 1[β c_2(t_{2,n}) − φ − c_1(t_{1,n}) − d ≥ 0, ∀n] × 1[c_1(t_{4,n}) − β c_2(t_{3,n}) + φ − d ≥ 0, ∀n] × 1[d ≥ 0]
10 A closed-form estimate of λ can be obtained by differentiating the above equation w.r.t. λ and setting it to zero. Putting λ̂ back into the likelihood function, we can show that the MLE maximizes the profile likelihood:

[β*, φ*, d*] = argmax_{β, φ, d} Σ_{n=1}^{N} [ β(c_2(t_{3,n}) − c_2(t_{2,n})) + 2d ]
subject to  β c_2(t_{2,n}) − φ − c_1(t_{1,n}) − d ≥ 0, n = 1, …, N
            c_1(t_{4,n}) − β c_2(t_{3,n}) + φ − d ≥ 0, n = 1, …, N,  d ≥ 0

This is a linear programming problem, and can be solved using existing solvers, but the worst-case complexity is at least O(N³). We have also proposed a low complexity algorithm for solving this problem, whose worst case only takes O(N).
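The LP can be handed to a generic solver directly. A sketch using scipy's linprog (the variable ordering x = [β, φ, d] and the sign conventions are my assumptions; maximizing the profile likelihood is equivalent to minimizing the total delay sum Σ(w1 + w2), which is linear in x):

```python
import numpy as np
from scipy.optimize import linprog

def exp_mle_lp(t1, c2, c3, t4):
    """Exponential-delay MLE as a linear program.
    Variables x = [beta, phi, d] with beta = 1/alpha, phi = theta/alpha.
    Minimize sum(w1 + w2) subject to every delay being nonnegative."""
    N = len(t1)
    # w1_n = beta*c2_n - phi - t1_n - d >= 0
    # w2_n = t4_n - beta*c3_n + phi - d >= 0
    c = np.array([np.sum(c2 - c3), 0.0, -2.0 * N])   # coefficients of sum(w1+w2)
    A_ub = np.vstack([np.column_stack([-c2, np.ones(N), np.ones(N)]),
                      np.column_stack([c3, -np.ones(N), np.ones(N)])])
    b_ub = np.concatenate([-t1, t4])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(1e-9, None), (None, None), (0, None)])
    beta, phi, d = res.x
    return 1.0 / beta, phi / beta, d
```

With noiseless delays (all w = 0) the optimum pins all constraints active, so the true (α, θ, d) is recovered exactly; this is a convenient solver sanity check.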
11 Approach 2: Iterative weighted median. Add the two message exchange equations:

β[c_2(t_{2,n}) + c_2(t_{3,n})] − 2φ − [c_1(t_{1,n}) + c_1(t_{4,n})] = w_{1,n} − w_{2,n}

Denote T_{s,n} = c_2(t_{2,n}) + c_2(t_{3,n}) and T_{r,n} = c_1(t_{1,n}) + c_1(t_{4,n}). The difference w_{1,n} − w_{2,n} becomes a Laplacian R.V. with location parameter 0 and scale parameter 1/λ. The log-likelihood function is

ln f({T_{s,n}, T_{r,n}} | β, φ) = N ln(λ/2) − λ Σ_{n=1}^{N} |β T_{s,n} − 2φ − T_{r,n}|

An estimate of β and φ can be obtained by minimizing the second term:

min_{β, φ} Σ_{n=1}^{N} |β T_{s,n} − 2φ − T_{r,n}|
12 Now, consider two sub-problems. When β is fixed, the problem is

min_φ Σ_{n=1}^{N} 2 |0.5(β T_{s,n} − T_{r,n}) − φ|

whose solution is the median value of the sequence {0.5(β T_{s,n} − T_{r,n})}_{n=1}^{N}. When φ is fixed, the problem is

min_β Σ_{n=1}^{N} T_{s,n} |β − (T_{r,n} + 2φ)/T_{s,n}|

This is a weighted median problem for the data set {(T_{r,n} + 2φ)/T_{s,n}}_{n=1}^{N} with weights {T_{s,n}}; a simple procedure exists to compute it. The two steps are iteratively updated. Since the objective function is convex, the iteration converges to the global optimal solution.
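The two alternating sub-problems can be sketched as below (my own convention: residual β·Ts − 2φ − Tr, with Ts > 0 since they are sums of clock readings). Each step exactly minimizes one coordinate, so the objective is guaranteed non-increasing:

```python
import numpy as np

def weighted_median(values, weights):
    """Smallest v minimizing sum(w_n * |v - values_n|)."""
    order = np.argsort(values)
    v, w = np.asarray(values)[order], np.asarray(weights)[order]
    csum = np.cumsum(w)
    return v[np.searchsorted(csum, 0.5 * csum[-1])]

def iterative_weighted_median(Ts, Tr, iters=20):
    """Alternating minimization of sum_n |beta*Ts_n - 2*phi - Tr_n|.
    When beta is fixed, phi is a plain median; when phi is fixed,
    beta is a weighted median with weights Ts_n."""
    beta, phi = 1.0, 0.0
    for _ in range(iters):
        phi = np.median(0.5 * (beta * Ts - Tr))            # beta fixed
        beta = weighted_median((Tr + 2 * phi) / Ts, Ts)    # phi fixed
    return beta, phi
```

Both updates are O(N log N) at most (sorting dominates), which is what makes the method attractive despite the performance loss discussed on the next slide.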
13 Comparison. The iterative weighted median has a significant loss w.r.t. the optimal solution. The main reason is that the iterative weighted median method adds the two message exchange equations together before estimation. This is in contrast to the Gaussian setting, where this operation does not lead to significant loss.
14 The proposed low-complexity algorithm has the same performance as the LP solver, while having the lowest complexity.
15 Network-wide synchronization. How to extend pairwise algorithms to work for network-wide synchronization? Tree based or cluster based approaches: need overhead to build and maintain the tree or cluster structure; error accumulation is quick as the number of layers increases; vulnerable if a gateway node dies.
16 Fully Distributed Algorithms. Approach 1: Coordinate Descent [4]. Add the two-way message exchange equations to first eliminate d:

[c_i(t_{1,n}) + c_i(t_{4,n}) − 2θ_i]/α_i − [c_j(t_{2,n}) + c_j(t_{3,n}) − 2θ_j]/α_j = w_{i,n} − w_{j,n}

Assume each node performs N rounds of two-way message exchange with each of its direct neighbors. The log-likelihood is

LL({α_i, θ_i}) = −(1/2σ²) Σ_{i=1}^{M} Σ_{j∈N(i)} Σ_{n=1}^{N} ( [c_i(t_{1,n}) + c_i(t_{4,n}) − 2θ_i]/α_i − [c_j(t_{2,n}) + c_j(t_{3,n}) − 2θ_j]/α_j )²

But this is non-convex w.r.t. the unknowns.
17 With the transformation β_i = 1/α_i and φ_i = θ_i/α_i,

LL({β_i, φ_i}) = −(1/2σ²) Σ_{i=1}^{M} Σ_{j∈N(i)} Σ_{n=1}^{N} ( [β_i T_{s,n}^{(i,j)} − 2φ_i] − [β_j T_{r,n}^{(i,j)} − 2φ_j] )²

where T_{s,n}^{(i,j)} and T_{r,n}^{(i,j)} are the summed time-stamps recorded at nodes i and j for their n-th exchange. This is convex w.r.t. the β's and φ's, and we can alternately minimize this LL. Differentiating the LL w.r.t. β_i and setting it to zero (and likewise w.r.t. φ_i) gives two coupled iterative equations: at iteration m, each node updates β̂_i^{(m)} and φ̂_i^{(m)} in closed form from its own time-stamps and the current estimates of its direct neighbors.
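The coordinate-descent loop can be sketched as below. This is my own minimal rendering, not the paper's exact update equations: instead of the closed-form coupled updates, each node refits its pair (β_i, φ_i) by a small least squares against its neighbors' current estimates, which minimizes the same convex LL one block at a time; the data layout S[(i, j)] is also an assumption of the sketch:

```python
import numpy as np

def coordinate_descent(S, ref=0, rounds=300):
    """Network-wide coordinate descent on the convex LL, as a sketch.
    S[(i, j)][n] holds node i's summed time-stamps for its n-th
    exchange with neighbor j.  Node i keeps (beta_i, phi_i); the
    real-time sum it implies is beta_i * S - 2 * phi_i.  Node `ref`
    is fixed at (1, 0) as the reference."""
    nodes = sorted({i for (i, j) in S})
    est = {i: np.array([1.0, 0.0]) for i in nodes}       # [beta_i, phi_i]
    nbrs = {i: [j for (a, j) in S if a == i] for i in nodes}
    for _ in range(rounds):
        for i in nodes:
            if i == ref:
                continue
            A_rows, targets = [], []
            for j in nbrs[i]:
                bj, pj = est[j]
                m = bj * S[(j, i)] - 2 * pj              # neighbor's current view
                A_rows.append(np.column_stack([S[(i, j)],
                                               -2 * np.ones(len(m))]))
                targets.append(m)
            # block minimization over (beta_i, phi_i): a 2-unknown LS fit
            est[i] = np.linalg.lstsq(np.vstack(A_rows),
                                     np.concatenate(targets), rcond=None)[0]
    return est
```

Since each block update exactly minimizes a strictly convex quadratic in its own two unknowns, the sweep converges to the global minimizer, which on noiseless data is the true clock parameters.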
18 Approach 2: Belief Propagation [5]. The two-way time-stamp message exchange equations (between nodes i and j) can be put into matrix form:

A^{(i)} β_i − A^{(j)} β_j = z_{i,j}

where β_i = [1/α_i, θ_i/α_i]^T collects node i's clock parameters, the matrices A stack the summed time-stamps next to a column of −2's, and z_{i,j} stacks the random delay terms. The marginalized posterior distribution at node i is

g(β_i) ∝ ∫ Π_k p(β_k) Π_{(i,j)∈E} p(A^{(i)}, A^{(j)} | β_i, β_j) dβ_1 … dβ_{i−1} dβ_{i+1} … dβ_M

This is computationally demanding and needs centralized processing.
19 Express the joint posterior distribution using a factor graph. Factor nodes carry the local likelihood functions or prior distributions:

f_{i,j} = p(A^{(i)}, A^{(j)} | β_i, β_j) = N(A^{(i)} β_i − A^{(j)} β_j; 0, σ² I),   f_i = p(β_i)

Variable nodes carry the clock parameters β_i. The marginal distribution at each node can be obtained by message passing on the factor graph. Message from variable node to factor node:

m^{(l)}_{i → f_{i,j}}(β_i) ∝ Π_{f ∈ B(i) \ f_{i,j}} m^{(l)}_{f → i}(β_i)
20 Message from factor node to variable node:

m^{(l)}_{f_{i,j} → i}(β_i) = ∫ f_{i,j} m^{(l−1)}_{j → f_{i,j}}(β_j) dβ_j

In each iteration, messages are updated in parallel with the information from direct neighbouring nodes. Each node computes its belief locally:

b^{(l)}(β_i) ∝ p(β_i) Π_{f ∈ B(i)} m^{(l)}_{f → i}(β_i)

As the likelihood function is Gaussian, the messages involved in this algorithm keep the Gaussian form, each fully described by a mean vector v^{(l)}_{f → i} and a covariance matrix C^{(l)}_{f → i}.
21 In practice, each real node computes both kinds of messages. The information passed between real nodes is set as the factor-to-variable message, and it inherits the properties: still Gaussian, so only the mean and covariance need to be exchanged; updated in parallel by local computation with the received mean and covariance messages from neighboring nodes. The belief computed at node i, with the received messages from all direct neighbouring nodes, is still Gaussian:

b(β_i) ~ N(β_i; μ_i, P_i),  where  P_i^{-1} = Σ_j [C^{(l)}_{f → i}]^{-1},  μ_i = P_i Σ_j [C^{(l)}_{f → i}]^{-1} v^{(l)}_{f → i}
22 The estimate at node i is the mean of its belief: β̂_i = ∫ β_i b^{(l)}(β_i) dβ_i = μ_i^{(l)}. The θ_i and α_i can be recovered from β̂_i after convergence. It is generally known that if the factor graph contains cycles, messages can flow many times around the graph, leading to the possibility of divergence of the BP algorithm. Two properties: BP in this application converges regardless of network topology, even under asynchronous message update; and the converged solution can also be proved to be equal to the centralized ML solution.
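Since all messages stay Gaussian, each one is just a (mean, precision) pair. The sketch below is a deliberately simplified scalar analogue of the slides' vector BP: it estimates only offsets x_i from pairwise measurements z[(i, j)] ≈ x_i − x_j (the function, its signature, and the measurement model are my own illustration, not the paper's algorithm):

```python
import numpy as np

def gaussian_bp_offsets(z, sigma2, M, ref=0, iters=10):
    """Scalar Gaussian BP sketch: estimate offsets x_i from pairwise
    measurements z[(i, j)] = x_i - x_j + noise (variance sigma2).
    Messages are Gaussian, so each is a (mean, precision) pair.
    Node `ref` is the reference (offset 0, near-infinite precision)."""
    z = dict(z)
    for (i, j), v in list(z.items()):        # symmetrize: z[(j, i)] = -z[(i, j)]
        z[(j, i)] = -v
    nbrs = {i: set() for i in range(M)}
    for (i, j) in z:
        nbrs[i].add(j)
    prior_prec = {i: (1e12 if i == ref else 0.0) for i in range(M)}
    msg = {(j, i): (0.0, 0.0) for (j, i) in z}       # (mean, precision), j -> i
    for _ in range(iters):
        new = {}
        for (j, i) in msg:
            # combine node j's prior with messages from all neighbors but i
            prec = prior_prec[j] + sum(msg[(k, j)][1] for k in nbrs[j] - {i})
            mean = 0.0
            if prec > 0:
                mean = sum(msg[(k, j)][1] * msg[(k, j)][0]
                           for k in nbrs[j] - {i}) / prec
            # pass through the factor x_i = x_j + z[(i, j)] + noise
            var = (1.0 / prec if prec > 0 else np.inf) + sigma2
            new[(j, i)] = (mean + z[(i, j)], 1.0 / var)
        msg = new                                     # parallel (synchronous) update
    belief = {}
    for i in range(M):
        prec = prior_prec[i] + sum(msg[(j, i)][1] for j in nbrs[i])
        belief[i] = sum(msg[(j, i)][1] * msg[(j, i)][0] for j in nbrs[i]) / prec
    return belief
```

On a tree the beliefs converge to the exact marginal means within a number of iterations equal to the graph diameter; the vector case on the slides follows the same mechanics with matrix precisions.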
23 Comparison. Simulation setting: 25 nodes, d ∈ [8, 12], θ ∈ [−5.5, 5.5], α ∈ [0.955, 1.055], 1000 topologies, σ = 0.1, initialization = [1, 0], with one node fixed as the reference node.
24 BP converges much faster than CD, as second-order information is included in the messages. The BP algorithm also works under asynchronous message exchange.
25 Distributed tracking with DKF [4]. Clock parameters may stay constant within a short period of time, but they will change over time. We can either redo synchronization (throwing away previous estimates), or we can do tracking. If the change is slow, tracking is preferred. Re-representation of the clock model: c(t) = θ + ∫_0^t α(τ) dτ, where the instantaneous skew contains a random component due to phase noise, α(t) = α + p·B'(t), with B(t) a Brownian motion. After sampling, the clock reading at instant l involves the accumulated skew plus an extra term due to phase noise.
26 Writing the accumulated skew and offset in recursive forms:

α(l) = α(l−1) + p[B'(l) − B'(l−1)],   θ(l) = θ(l−1) + α(l)

Clock parameter evolution model: x_i(l) = A x_i(l−1) + u_i(l), where x_i(l) stacks the skew and offset of node i and u_i(l) is the process noise (Gaussian with zero mean, with variance determined by the phase noise intensity p). Measurement equations based on two-way message exchange, built from the summed time-stamps T_{r,n}(l) and T_{s,n}(l) and gathered over all neighbors of node i, give

z_i(l) = C_i(l) x(l) + v_i(l)
27 If all information is gathered in a single place, the optimal solution is the centralized KF. But the KF cannot be implemented in a distributed way, since the Kalman gain matrix contains correlations among nodes. Solution: impose a block diagonal structure on the Kalman gain matrix:

x̂(l_k | l_{k−1}) = A x̂(l_{k−1} | l_{k−1}) + b
x̂(l_k | l_k) = x̂(l_k | l_{k−1}) + K(l_k) ( z(l_k) − C(l_k) x̂(l_k | l_{k−1}) )
K(l_k) = argmin_K Tr P(l_k | l_k)   s.t.  K(l_k) = blkdiag(K_1(l_k), …, K_M(l_k))

We can solve for K_i(l_k) in closed form. Each round of tracking includes time-stamp exchange and message exchange for the KF update.
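The predict/update cycle inside each tracking round is the standard Kalman recursion. The sketch below is the plain single-node KF (the distributed version additionally constrains the gain to be block diagonal); the state ordering x = [skew, offset] and the direct-offset measurement are assumptions of this illustration:

```python
import numpy as np

def kalman_track(z_seq, A, C, Q, R, x0, P0):
    """Basic Kalman predict/update cycle used inside the tracking scheme.
    x = [skew, offset]; the offset integrates the skew, and both carry
    process noise.  This sketch is the plain (centralized) KF."""
    x, P = x0.copy(), P0.copy()
    for z in z_seq:
        # predict
        x = A @ x
        P = A @ P @ A.T + Q
        # update: gain, correction, covariance shrink
        S = C @ P @ C.T + R
        K = P @ C.T @ np.linalg.inv(S)
        x = x + K @ (z - C @ x)
        P = (np.eye(len(x)) - K @ C) @ P
    return x, P

A = np.array([[1.0, 0.0],
              [1.0, 1.0]])        # offset(l) = offset(l-1) + skew(l-1)
C = np.array([[0.0, 1.0]])        # a noisy offset-like reading is observed
Q = 1e-6 * np.eye(2)
R = np.array([[1e-2]])
```

The block-diagonal constraint of the distributed version changes only how K is computed; the predict and correction steps keep exactly this shape at each node.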
28 Initialization: x(0|0) = [1, 0]^T, P(0|0) = I works, but it takes a long time to converge. Alternative: CD + bootstrap for covariance estimation (5 rounds of initial time-stamp exchange).
29 Start with 25 nodes (B=5). If a node fails during tracking, we simply remove it from the equations. If the node later resumes working, we will use its previously stored clock parameter estimates and covariance matrix. If a new node suddenly joins in, we will use x(0|0) = [1, 0]^T, P(0|0) = I.
30 Distributed algorithm under exponential delays [6]. The pairwise LP problem can be easily extended to the network-wide setting:

[x*, d*] = argmax_{x,d} Σ_{(i,j)∈E} Σ_{n=1}^{N} [ β_i(c_i(t_{1,n}) − c_i(t_{4,n})) + β_j(c_j(t_{3,n}) − c_j(t_{2,n})) + 2 d_{i,j} ]
subject to  [β_j c_j(t_{2,n}) − φ_j] − [β_i c_i(t_{1,n}) − φ_i] − d_{i,j} ≥ 0,
            [β_i c_i(t_{4,n}) − φ_i] − [β_j c_j(t_{3,n}) − φ_j] − d_{i,j} ≥ 0,  (i,j) ∈ E, n = 1, …, N

where x collects the {β_i, φ_i} of all nodes. This is an LP problem, but very large in size: the centralized solution is computationally expensive and has a large communication overhead.
31 Challenge of solving this problem in a distributed way: the constraints couple parameters at different nodes. By introducing slack variables w (turning the inequality constraints into equalities) and auxiliary replica variables z (local copies of the quantities shared on each edge), we can transform the problem into

min_{x, d, w, z} Σ_{i=1}^{M} a_i^T x_i   s.t.  d ≥ 0,  w ≥ 0,
B^{(i,j)} x_i = z_1^{(i,j)},  E^{(i,j)} x_j = z_2^{(i,j)},  2 d_{i,j} = z_3^{(i,j)},  w_{i,j} = z_4^{(i,j)},
Σ_{q=1}^{4} z_q^{(i,j)} = 0,  (i,j) ∈ E

This problem can be solved by ADMM (iteratively minimizing the augmented Lagrangian function w.r.t. the unknowns x, d, w, z, and updating the Lagrange multipliers).
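The ADMM mechanics referred to above (alternate minimization of the augmented Lagrangian, then a dual ascent step) can be illustrated on a much smaller problem. This is not the paper's clock LP, just a minimal consensus example of the same update pattern, with all names my own:

```python
import numpy as np

def admm_consensus(a, rho=1.0, iters=100):
    """Minimal illustration of ADMM mechanics (not the clock LP):
    solve min sum_i (x_i - a_i)^2 s.t. x_i = z for all i, whose
    solution is the average of a.  Each agent updates x_i locally;
    the replica z and the scaled duals u couple the agents."""
    M = len(a)
    x = np.zeros(M)
    u = np.zeros(M)
    z = 0.0
    for _ in range(iters):
        x = (2 * a + rho * (z - u)) / (2 + rho)   # local x-minimization
        z = np.mean(x + u)                        # replica (consensus) update
        u = u + x - z                             # dual ascent
    return z

print(admm_consensus(np.array([1.0, 2.0, 6.0])))  # converges to the average
```

In the clock LP the same pattern holds: each x-update is a local closed-form step at a node, the z-update enforces the edge-wise coupling, and the multiplier update is a simple local addition.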
32 Properties: the resultant algorithm only involves local computation at each node and communications with its direct neighbors; closed-form expressions are available for each update step; and it converges to the centralized ML solution. Contrast to existing applications of ADMM: direct application of ADMM to the original LP would not result in a distributed algorithm; most existing applications that result in distributed algorithms are for Gaussian likelihoods; and most existing works consider a single (or a small set of) common parameter(s).
33 Simulation results on a 25-node network, K=5. CD as initialization helps to speed up convergence.
34 Conclusions. We discussed clock synchronization in wireless sensor networks. We started with pairwise synchronization: the Gaussian case and the exponential case. Then we discussed network-wide synchronization: CD, BP, ADMM. We also discussed tracking using a distributed KF. Future works: BP based distributed algorithms for exponential delays? How about arbitrarily distributed delays, or asymmetric delays?
35 References
[1] Mei Leng and Yik-Chung Wu, "On Clock Synchronization Algorithms for Wireless Sensor Networks under Unknown Delay," IEEE Trans. on Vehicular Technology, vol. 59, no. 1, Jan 2010.
[2] Mei Leng and Yik-Chung Wu, "Low Complexity Maximum Likelihood Estimators for Clock Synchronization of Wireless Sensor Nodes under Exponential Delays," IEEE Trans. on Signal Processing, vol. 59, no. 10, Oct 2011.
[3] Mei Leng and Yik-Chung Wu, "On joint synchronization of clock offset and skew for Wireless Sensor Networks under exponential delay," Proceedings of the IEEE ISCAS 2010, Paris, France, May 2010.
[4] Bin Luo and Yik-Chung Wu, "Distributed Clock Parameters Tracking in Wireless Sensor Network," IEEE Trans. on Wireless Communications, vol. 12, no. 12, Dec 2013.
[5] Jian Du and Yik-Chung Wu, "Distributed Clock Skew and Offset Estimation in Wireless Sensor Networks: Asynchronous Algorithm and Convergence Analysis," IEEE Trans. on Wireless Communications, vol. 12, Nov.
[6] Bin Luo, Lei Cheng, and Yik-Chung Wu, "Fully-distributed Clock Synchronization in Wireless Sensor Networks Under Exponential Delays," Signal Processing, vol. 125, Aug.
Further related readings: Yik-Chung Wu, Qasim M. Chaudhari and Erchin Serpedin, "Clock Synchronization of Wireless Sensor Networks," IEEE Signal Processing Magazine, vol. 28, no. 1, pp. 124-138, Jan 2011.
36 Jun Zheng and Yik-Chung Wu, "Joint Time Synchronization and Localization of an Unknown Node in Wireless Sensor Networks," IEEE Trans. on Signal Processing, vol. 58, no. 3, Mar 2010. Mei Leng and Yik-Chung Wu, "Distributed Clock Synchronization for Wireless Sensor Networks using Belief Propagation," IEEE Trans. on Signal Processing, vol. 59, no. 11, Nov 2011.
More informationDynamic Programming. Preview. Dynamic Programming. Dynamic Programming. Dynamic Programming (Example: Fibonacci Sequence)
/24/27 Prevew Fbonacc Sequence Longest Common Subsequence Dynamc programmng s a method for solvng complex problems by breakng them down nto smpler sub-problems. It s applcable to problems exhbtng the propertes
More informationImage Processing for Bubble Detection in Microfluidics
Image Processng for Bubble Detecton n Mcrofludcs Introducton Chen Fang Mechancal Engneerng Department Stanford Unverst Startng from recentl ears, mcrofludcs devces have been wdel used to buld the bomedcal
More informationProblem Set 9 Solutions
Desgn and Analyss of Algorthms May 4, 2015 Massachusetts Insttute of Technology 6.046J/18.410J Profs. Erk Demane, Srn Devadas, and Nancy Lynch Problem Set 9 Solutons Problem Set 9 Solutons Ths problem
More informationInvariant deformation parameters from GPS permanent networks using stochastic interpolation
Invarant deformaton parameters from GPS permanent networks usng stochastc nterpolaton Ludovco Bag, Poltecnco d Mlano, DIIAR Athanasos Dermans, Arstotle Unversty of Thessalonk Outlne Startng hypotheses
More informationDynamic Systems on Graphs
Prepared by F.L. Lews Updated: Saturday, February 06, 200 Dynamc Systems on Graphs Control Graphs and Consensus A network s a set of nodes that collaborates to acheve what each cannot acheve alone. A network,
More informationLecture 12: Discrete Laplacian
Lecture 12: Dscrete Laplacan Scrbe: Tanye Lu Our goal s to come up wth a dscrete verson of Laplacan operator for trangulated surfaces, so that we can use t n practce to solve related problems We are mostly
More informationLow default modelling: a comparison of techniques based on a real Brazilian corporate portfolio
Low default modellng: a comparson of technques based on a real Brazlan corporate portfolo MSc Gulherme Fernandes and MSc Carlos Rocha Credt Scorng and Credt Control Conference XII August 2011 Analytcs
More informationCIE4801 Transportation and spatial modelling Trip distribution
CIE4801 ransportaton and spatal modellng rp dstrbuton Rob van Nes, ransport & Plannng 17/4/13 Delft Unversty of echnology Challenge the future Content What s t about hree methods Wth specal attenton for
More informationSingle-Facility Scheduling over Long Time Horizons by Logic-based Benders Decomposition
Sngle-Faclty Schedulng over Long Tme Horzons by Logc-based Benders Decomposton Elvn Coban and J. N. Hooker Tepper School of Busness, Carnege Mellon Unversty ecoban@andrew.cmu.edu, john@hooker.tepper.cmu.edu
More informationLecture Notes on Linear Regression
Lecture Notes on Lnear Regresson Feng L fl@sdueducn Shandong Unversty, Chna Lnear Regresson Problem In regresson problem, we am at predct a contnuous target value gven an nput feature vector We assume
More informationU.C. Berkeley CS294: Beyond Worst-Case Analysis Luca Trevisan September 5, 2017
U.C. Berkeley CS94: Beyond Worst-Case Analyss Handout 4s Luca Trevsan September 5, 07 Summary of Lecture 4 In whch we ntroduce semdefnte programmng and apply t to Max Cut. Semdefnte Programmng Recall that
More informationMaximizing Overlap of Large Primary Sampling Units in Repeated Sampling: A comparison of Ernst s Method with Ohlsson s Method
Maxmzng Overlap of Large Prmary Samplng Unts n Repeated Samplng: A comparson of Ernst s Method wth Ohlsson s Method Red Rottach and Padrac Murphy 1 U.S. Census Bureau 4600 Slver Hll Road, Washngton DC
More informationCIS526: Machine Learning Lecture 3 (Sept 16, 2003) Linear Regression. Preparation help: Xiaoying Huang. x 1 θ 1 output... θ M x M
CIS56: achne Learnng Lecture 3 (Sept 6, 003) Preparaton help: Xaoyng Huang Lnear Regresson Lnear regresson can be represented by a functonal form: f(; θ) = θ 0 0 +θ + + θ = θ = 0 ote: 0 s a dummy attrbute
More informationA Particle Filter Algorithm based on Mixing of Prior probability density and UKF as Generate Importance Function
Advanced Scence and Technology Letters, pp.83-87 http://dx.do.org/10.14257/astl.2014.53.20 A Partcle Flter Algorthm based on Mxng of Pror probablty densty and UKF as Generate Importance Functon Lu Lu 1,1,
More informationCombining Constraint Programming and Integer Programming
Combnng Constrant Programmng and Integer Programmng GLOBAL CONSTRAINT OPTIMIZATION COMPONENT Specal Purpose Algorthm mn c T x +(x- 0 ) x( + ()) =1 x( - ()) =1 FILTERING ALGORITHM COST-BASED FILTERING ALGORITHM
More informationStatistical inference for generalized Pareto distribution based on progressive Type-II censored data with random removals
Internatonal Journal of Scentfc World, 2 1) 2014) 1-9 c Scence Publshng Corporaton www.scencepubco.com/ndex.php/ijsw do: 10.14419/jsw.v21.1780 Research Paper Statstcal nference for generalzed Pareto dstrbuton
More informationLecture 12: Classification
Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna
More informationYong Joon Ryang. 1. Introduction Consider the multicommodity transportation problem with convex quadratic cost function. 1 2 (x x0 ) T Q(x x 0 )
Kangweon-Kyungk Math. Jour. 4 1996), No. 1, pp. 7 16 AN ITERATIVE ROW-ACTION METHOD FOR MULTICOMMODITY TRANSPORTATION PROBLEMS Yong Joon Ryang Abstract. The optmzaton problems wth quadratc constrants often
More informationTests of Single Linear Coefficient Restrictions: t-tests and F-tests. 1. Basic Rules. 2. Testing Single Linear Coefficient Restrictions
ECONOMICS 35* -- NOTE ECON 35* -- NOTE Tests of Sngle Lnear Coeffcent Restrctons: t-tests and -tests Basc Rules Tests of a sngle lnear coeffcent restrcton can be performed usng ether a two-taled t-test
More informationSome Comments on Accelerating Convergence of Iterative Sequences Using Direct Inversion of the Iterative Subspace (DIIS)
Some Comments on Acceleratng Convergence of Iteratve Sequences Usng Drect Inverson of the Iteratve Subspace (DIIS) C. Davd Sherrll School of Chemstry and Bochemstry Georga Insttute of Technology May 1998
More informationChecking Pairwise Relationships. Lecture 19 Biostatistics 666
Checkng Parwse Relatonshps Lecture 19 Bostatstcs 666 Last Lecture: Markov Model for Multpont Analyss X X X 1 3 X M P X 1 I P X I P X 3 I P X M I 1 3 M I 1 I I 3 I M P I I P I 3 I P... 1 IBD states along
More informationLow Complexity Soft-Input Soft-Output Hamming Decoder
Low Complexty Soft-Input Soft-Output Hammng Der Benjamn Müller, Martn Holters, Udo Zölzer Helmut Schmdt Unversty Unversty of the Federal Armed Forces Department of Sgnal Processng and Communcatons Holstenhofweg
More informationStatistical Circuit Optimization Considering Device and Interconnect Process Variations
Statstcal Crcut Optmzaton Consderng Devce and Interconnect Process Varatons I-Jye Ln, Tsu-Yee Lng, and Yao-Wen Chang The Electronc Desgn Automaton Laboratory Department of Electrcal Engneerng Natonal Tawan
More informationConjugacy and the Exponential Family
CS281B/Stat241B: Advanced Topcs n Learnng & Decson Makng Conjugacy and the Exponental Famly Lecturer: Mchael I. Jordan Scrbes: Bran Mlch 1 Conjugacy In the prevous lecture, we saw conjugate prors for the
More informationDr. Shalabh Department of Mathematics and Statistics Indian Institute of Technology Kanpur
Analyss of Varance and Desgn of Experment-I MODULE VII LECTURE - 3 ANALYSIS OF COVARIANCE Dr Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur Any scentfc experment s performed
More informationLagrange Multipliers Kernel Trick
Lagrange Multplers Kernel Trck Ncholas Ruozz Unversty of Texas at Dallas Based roughly on the sldes of Davd Sontag General Optmzaton A mathematcal detour, we ll come back to SVMs soon! subject to: f x
More informationParameter Estimation for Dynamic System using Unscented Kalman filter
Parameter Estmaton for Dynamc System usng Unscented Kalman flter Jhoon Seung 1,a, Amr Atya F. 2,b, Alexander G.Parlos 3,c, and Klto Chong 1,4,d* 1 Dvson of Electroncs Engneerng, Chonbuk Natonal Unversty,
More informationADVANCED MACHINE LEARNING ADVANCED MACHINE LEARNING
1 ADVANCED ACHINE LEARNING ADVANCED ACHINE LEARNING Non-lnear regresson technques 2 ADVANCED ACHINE LEARNING Regresson: Prncple N ap N-dm. nput x to a contnuous output y. Learn a functon of the type: N
More informationCoarse-Grain MTCMOS Sleep
Coarse-Gran MTCMOS Sleep Transstor Szng Usng Delay Budgetng Ehsan Pakbazna and Massoud Pedram Unversty of Southern Calforna Dept. of Electrcal Engneerng DATE-08 Munch, Germany Leakage n CMOS Technology
More information