Lec 07 Transforms and Quantization II
- Dana Weaver
- 6 years ago
Outline
CS/EE 5590 / ENG 401 Special Topics (17804, 17815, 17803), Lec 07 Transforms and Quantization II
- Lecture 06 re-cap
- Scalar quantization
- Vector quantization
Zhu Li. Course Web: http://l.web.umkc.edu/lizhu/teaching/2016sp.video-communication/main.html
Z. Li, Multimedia Communication, 2016 Spring, p.1

Unitary Transforms
- y = Ax, with x, y in R^d and A a d x d matrix; A = [a_1^T; a_2^T; ...; a_d^T].
- A is unitary if A^(-1) = A^T, i.e., A A^T = I_d (for real A).
- Equivalently, the basis vectors (rows) of A are orthogonal to each other and have unit norm: <a_i, a_j> = 0 for i != j, and <a_i, a_i> = 1.

Unitary Transform Properties
- Preserve energy: ||y||^2 = ||Ax||^2 = ||x||^2.
- Preserve inner products, <Ax_1, Ax_2> = <x_1, x_2>, so the angles between vectors are preserved.
- Example: [cos θ, -sin θ; sin θ, cos θ] is a unitary transform: it rotates a vector in R^2, i.e., rotates the basis coordinates.

DoF of Unitary Transforms
- k-dimensional orthonormal projections in d-dimensional space have kd - k(k+1)/2 degrees of freedom.
- Above example (k = d = 2): 2*2 - 3 = 1; the normal points to the unit sphere in R^2.
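The properties above are easy to verify numerically. A minimal Python/NumPy sketch (the course exercises use Matlab, but the check is identical): a 2-D rotation matrix satisfies A A^T = I and preserves both norms and inner products.

```python
import numpy as np

# 2-D rotation: a simple unitary (orthogonal) transform
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

x1 = np.array([3.0, 4.0])
x2 = np.array([-1.0, 2.0])
y1, y2 = A @ x1, A @ x2

# A^(-1) = A^T, i.e., A A^T = I
print(np.allclose(A @ A.T, np.eye(2)))                      # True
# Energy is preserved: ||y|| = ||x||
print(np.isclose(np.linalg.norm(y1), np.linalg.norm(x1)))   # True
# Inner products (hence angles) are preserved
print(np.isclose(y1 @ y2, x1 @ x2))                         # True
```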
Energy Compaction and De-correlation
- Energy compaction: many common unitary transforms tend to pack a large fraction of the signal energy into just a few transform coefficients.
- De-correlation: highly correlated input elements -> quite uncorrelated output coefficients.
- Covariance matrix: R_x = E[(x - E[x])(x - E[x])^T].
- DCT example: y = dct(x), with x = (x_1, ..., x_16) in R^16 taken from columns of image pixels; compare R_x with R_y (display scale: linear g, and log(1 + abs(g))).
- Question: is there an optimal transform that does best in this?

Karhunen-Loeve Transform (KLT)
- The unitary transform with the basis vectors in A being the orthonormalized eigenvectors of R_x: R_x a_i = λ_i a_i, i = 1, ..., d; A^T = [a_1, a_2, ..., a_d].
- Assume real input; write A^T instead of A^H; denote the inverse transform matrix as A, with A A^T = I.
- R_x is symmetric for real input, Hermitian for complex input, i.e., R_x^T = R_x, R_x^H = R_x.
- R_x is nonnegative definite, i.e., it has real non-negative eigenvalues.
- Attributions: Kari Karhunen 1947, Michel Loeve 1948; a.k.a. the Hotelling transform (Harold Hotelling, discrete formulation); a.k.a. Principal Component Analysis (PCA; estimate R_x from samples).

Properties of the K-L Transform
- De-correlation by construction: R_y = A R_x A^T = diag(λ_1, ..., λ_d).
- Energy compaction: minimizes the error under limited-coefficient reconstruction.
- Basis restriction: keep only a subset of m transform coefficients and then perform the inverse transform (1 <= m <= N); keep the coefficients w.r.t. the eigenvectors of the first m largest eigenvalues (an indication of energy).
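A small sketch of the KLT/PCA construction, assuming an AR(1)-like correlated Gaussian source (the specific rho = 0.9, d = 4 setup is illustrative, not from the slides): the eigenvector basis of the sample covariance de-correlates the coefficients exactly and packs most of the energy into the leading ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated 4-D Gaussian source with AR(1)-style covariance, rho = 0.9
rho, d, n = 0.9, 4, 20000
Rx = rho ** np.abs(np.subtract.outer(np.arange(d), np.arange(d)))
X = rng.multivariate_normal(np.zeros(d), Rx, size=n)   # rows are samples

# KLT basis: orthonormal eigenvectors of the (sample) covariance
R_hat = np.cov(X, rowvar=False)
eigvals, V = np.linalg.eigh(R_hat)        # ascending eigenvalues
V = V[:, ::-1]                            # largest-eigenvalue basis first
Y = X @ V                                 # transform coefficients

# De-correlation by construction: R_y = V^T R_x V is diagonal
Ry = np.cov(Y, rowvar=False)
off_diag = Ry - np.diag(np.diag(Ry))
print(np.max(np.abs(off_diag)) < 0.05)

# Energy compaction: the first 2 of 4 coefficients carry most of the energy
energy = np.var(Y, axis=0)
print(energy[:2].sum() / energy.sum() > 0.8)
```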
Transform on 2-D Signals
- Given an m x n image block, how to compute its 2-D transform? By applying the 1-D DCT to the rows and then to the columns (the transform is separable).
- The 2-D DCT transform matrix is a Kronecker product of the 1-D DCT basis. (Figure: the N = 8 point 1-D DCT basis functions u_0, ..., u_7, and the corresponding 8-pt 2-D DCT basis.)
- Matlab exercise: SVD, PCA, and DCT approximation.
- In compression:
  - DCT: not data dependent, motivated by the DFT; no need to signal the basis.
  - PCA: data driven, obtained from a class of signals; need to signal the basis per class.
  - SVD: directly approximates from the signal; need to signal the basis per image block.
  - Question: can we encode the basis better?

DNA Sequence Compression
- Sequencing data in the real world: many reads that are aligned, with mutations/errors, each with a quality (confidence) score.
- Reads + quality (confidence) scores + labeling add up to roughly 2.5 TB per data set.
- Question: how to compress the sequence reads (lossless) and the confidence scores (lossy)?
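The separability claim can be checked directly. A minimal Python/SciPy sketch (Matlab's `dct2` does the same thing): applying the 1-D DCT to rows and then columns equals the matrix form C X C^T, and flattening shows the 2-D basis is the Kronecker product C ⊗ C.

```python
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(1)
block = rng.standard_normal((8, 8))

# 2-D DCT computed separably: 1-D DCT on the rows, then on the columns
rows = dct(block, type=2, norm='ortho', axis=1)
sep2d = dct(rows, type=2, norm='ortho', axis=0)

# Same thing via the transform matrix: Y = C X C^T,
# where C is the 8-point orthonormal 1-D DCT-II matrix
C = dct(np.eye(8), type=2, norm='ortho', axis=0)
mat2d = C @ block @ C.T
print(np.allclose(sep2d, mat2d))   # True

# Equivalently, vec(Y) = (C kron C) vec(X): the 2-D basis is a Kronecker product
K = np.kron(C, C)
print(np.allclose(K @ block.reshape(-1), mat2d.reshape(-1)))   # True
```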
FastQ and SAM
- Current solutions for sequencing data; reminiscent of the zig-zag scan and run-level coding.

Outline
- Lecture 06 re-cap
- Scalar quantization: uniform quantization, non-uniform quantization
- Vector quantization

Rate-Distortion Encoder and Decoder
- Encoder: represent a sequence X^n = {X_1, X_2, ..., X_n} by an index f_n(X^n) in {1, 2, ..., 2^(nR)}.
- Decoder: map f_n(X^n) to a reconstruction sequence X̂^n.
- Scalar quantizer: n = 1, quantize each sample individually.
- Vector quantizer: n > 1, quantize a group of samples jointly.

Scalar Quantization
- Encoder: partition the real line into M disjoint intervals I_i = [b_(i-1), b_i), i = 1, ..., M.
  - I_i: quantization bins; i: index of a quantization bin; b_i: decision boundaries; y_i: reconstruction levels.
- The encoder sends the (fixed-length) code word of each interval/bin index to the decoder.
- Decoder: represents all values in an interval by a reconstruction level.
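The encoder/decoder split above can be sketched in a few lines of Python/NumPy. The boundary and level values are illustrative, and `encode`/`decode` are hypothetical helper names; the point is only that the encoder transmits bin indices and the decoder maps them back to reconstruction levels.

```python
import numpy as np

# A generic scalar quantizer: inner decision boundaries b_i and
# one reconstruction level y_i per bin (6 bins here).
boundaries = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
levels     = np.array([-2.5, -1.5, -0.5, 0.5, 1.5, 2.5])

def encode(x):
    """Map each sample to its bin index (the transmitted symbol)."""
    return np.digitize(x, boundaries)

def decode(idx):
    """Map each bin index back to its reconstruction level."""
    return levels[idx]

x = np.array([-3.2, -0.4, 0.0, 0.7, 2.9])
q = encode(x)
print(q.tolist())           # [0, 2, 3, 3, 5]
print(decode(q).tolist())   # [-2.5, -0.5, 0.5, 0.5, 2.5]
```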
Rate-Distortion Tradeoff
- Things to be determined: number of bins, decision boundaries, reconstruction levels, codewords for the bin indexes.
- The design of quantization is a tradeoff between rate and distortion: to reduce the size of the encoded bits, we need to reduce the number of bins -> more distortion.
- The performance is governed by rate-distortion theory. (Figure: distortion vs. rate curve, with operating points A and B.)

Model of Quantization
- Quantization: q = A(x), map x to an index.
- Inverse quantization: x̂ = B(q) = B(A(x)) = Q(x).
- B() is not exactly the inverse function of A(); the quantization error is e(x) = x̂ - x.
- Combining the quantizer and de-quantizer: x̂ = x + e(x).

Measure of Distortion
- Mean squared error (MSE) for quantization: the average quantization error over all input values; we need to know the probability distribution f(x) of the input.
- Number of bins: M; decision boundaries: b_i, i = 0, ..., M; reconstruction levels: y_i, i = 1, ..., M; reconstruction: x̂ = y_i iff b_(i-1) <= x < b_i.
- MSE = E[e(x)^2] = ∫ (x̂ - x)^2 f(x) dx = Σ_i ∫_{b_(i-1)}^{b_i} (y_i - x)^2 f(x) dx.

Uniform Midrise Quantizer
- All bins have the same size, except possibly for the two outer intervals; b_i and y_i are spaced evenly, with spacing Δ (the step size) for the inner intervals.
- Even number of reconstruction levels; 0 is not a reconstruction level.
- For finite Xmax and Xmin (Xmin = -Xmax): e.g., 6 bins covering [-Xmax, Xmax].
- For infinite Xmax and Xmin: M bins; the two outer-most reconstruction levels are still one step size away from the inner ones.
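A minimal Python/NumPy sketch of the midrise quantizer described above (the function name and the clamping-to-a-finite-range choice are my own): bin edges sit at multiples of Δ and reconstruction levels at bin midpoints, so the level count is even and 0 is never a level.

```python
import numpy as np

def uniform_midrise(x, delta, n_levels):
    """Uniform midrise quantizer with an even number of levels.

    Bin edges are multiples of delta; reconstruction levels sit at the
    bin midpoints (odd multiples of delta/2), so 0 is never a level."""
    assert n_levels % 2 == 0
    q = np.floor(x / delta)                            # bin index
    q = np.clip(q, -n_levels // 2, n_levels // 2 - 1)  # clamp to finite range
    return (q + 0.5) * delta                           # midpoint reconstruction

delta = 1.0
xs = np.array([-2.7, -0.2, 0.0, 0.2, 2.7])
out = uniform_midrise(xs, delta, 6)
print(out.tolist())   # [-2.5, -0.5, 0.5, 0.5, 2.5]  (note: 0 is not a level)
```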
Uniform Midtread Quantizer
- Odd number of reconstruction levels; 0 is a reconstruction level, which is desired in image/video coding.
- For finite Xmax and Xmin (Xmin = -Xmax): e.g., 5 bins; for infinite Xmax and Xmin: M bins.
- Quantization mapping (the output is an index): q = A(x) = sign(x) * floor(|x| / Δ + 0.5).
  - Example: Δ = 1, x = 1.8 -> q = 2.
- De-quantization mapping: x̂ = B(q) = qΔ.
  - Example: q = 2, Δ = 1 -> x̂ = 2.

Quantization of a Uniformly Distributed Source
- Input X: uniformly distributed in [-Xmax, Xmax]: f(x) = 1 / (2 Xmax).
- Number of bins: M (even, for the midrise quantizer); step size: Δ = 2 Xmax / M.
- The quantization error e(x) = x̂ - x is uniformly distributed in [-Δ/2, Δ/2].
- MSE: prove that d = Δ^2 / 12.
  - Proof: the error pdf is f(e) = 1/Δ on [-Δ/2, Δ/2], so
    d = ∫_{-Δ/2}^{Δ/2} e^2 (1/Δ) de = Δ^2 / 12.
- How to choose M, the number of bins, such that the distortion is less than a desired level D?
  - Δ^2 / 12 <= D with Δ = 2 Xmax / M gives M >= Xmax / sqrt(3 D).
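As a quick sanity check, a Python/NumPy sketch of the midtread mapping above, plus an empirical verification of d = Δ²/12 for a uniform source (the sample size is arbitrary; the check is statistical, not exact):

```python
import numpy as np

def midtread_quantize(x, delta):
    # q = sign(x) * floor(|x|/delta + 0.5): rounds to the nearest bin index
    return np.sign(x) * np.floor(np.abs(x) / delta + 0.5)

def midtread_dequantize(q, delta):
    return q * delta

delta = 1.0
print(midtread_quantize(np.array([1.8]), delta)[0])   # 2.0, as in the slide

# Empirical check of d = delta^2 / 12 for a uniform source
rng = np.random.default_rng(2)
xmax = 4.0
x = rng.uniform(-xmax, xmax, 200000)
xhat = midtread_dequantize(midtread_quantize(x, delta), delta)
mse = np.mean((xhat - x) ** 2)
print(abs(mse - delta**2 / 12) < 1e-3)   # True: MSE is close to 1/12
```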
Signal-to-Noise Ratio (SNR)
- Variance of a random variable uniformly distributed in [-L/2, L/2]:
  σ^2 = (1/L) ∫_{-L/2}^{L/2} x^2 dx = L^2 / 12.
- Signal X uniform in [-Xmax, Xmax]: σ_x^2 = (2 Xmax)^2 / 12; noise e uniform in [-Δ/2, Δ/2]: σ_e^2 = Δ^2 / 12.
- Let M = 2^R, so each bin index can be represented by R bits; then Δ = 2 Xmax / 2^R and
  SNR(dB) = 10 log10(σ_x^2 / σ_e^2) = 10 log10(2^(2R)) = 20 R log10(2) ≈ 6.02 R dB,
  i.e., each additional bit buys about 6 dB of SNR (cf. PSNR for 8-bit images: 10 log10(255^2 / MSE)).

Outline
- Lecture 06 re-cap
- Scalar quantization: uniform quantization, non-uniform quantization
- Vector quantization

Non-uniform Quantization
- The uniform quantizer is not optimal if the source is not uniformly distributed.
- For a given M, to reduce the MSE we want narrow bins where f(x) is high and wide bins where f(x) is low.

Lloyd-Max Scalar Quantizer
- Also known as the pdf-optimized quantizer. Given M, the optimal b_i and y_i that minimize the MSE satisfy the Lagrangian conditions ∂d/∂b_i = 0 and ∂d/∂y_i = 0, which give
  y_i = ∫_{b_(i-1)}^{b_i} x f(x) dx / ∫_{b_(i-1)}^{b_i} f(x) dx = E[X | X in I_i],
  i.e., y_i is the centroid of the interval [b_(i-1), b_i] (the conditional mean).
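The 6.02 dB/bit rule can be observed empirically. A minimal Python/NumPy sketch, assuming a uniform source matched to the quantizer range (the rate sweep 2..8 bits is arbitrary): each extra bit halves Δ, so the MSE drops by a factor of 4 and the SNR rises by about 6 dB.

```python
import numpy as np

rng = np.random.default_rng(3)
xmax = 1.0
x = rng.uniform(-xmax, xmax, 500000)

def snr_db(rate):
    m = 2 ** rate                     # number of bins, M = 2^R
    delta = 2 * xmax / m              # step size
    # uniform midrise quantizer over [-xmax, xmax]
    q = np.clip(np.floor(x / delta), -m // 2, m // 2 - 1)
    xhat = (q + 0.5) * delta
    return 10 * np.log10(np.var(x) / np.mean((x - xhat) ** 2))

snrs = [snr_db(r) for r in range(2, 9)]
gains = np.diff(snrs)
print(np.all(np.abs(gains - 6.02) < 0.2))   # roughly 6 dB per extra bit
```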
Lloyd-Max Scalar Quantizer (cont.)
- Setting ∂d/∂b_i = 0 gives (b_i - y_i)^2 f(b_i) = (b_i - y_(i+1))^2 f(b_i), so b_i is the midpoint of y_i and y_(i+1) (a nearest-neighbor quantizer).
- Summary of the Lloyd-Max conditions:
  b_i = (y_i + y_(i+1)) / 2,
  y_i = ∫_{b_(i-1)}^{b_i} x f(x) dx / ∫_{b_(i-1)}^{b_i} f(x) dx.

A Special Case: Relationship to the Uniform Quantizer
- If f(x) = c (uniform), the Lloyd-Max quantizer reduces to the uniform quantizer:
  y_i = ∫ x c dx / ∫ c dx = (b_(i-1) + b_i) / 2.
- If the rate is high, f(x) is close to constant in each bin, and we also have y_i ≈ (b_(i-1) + b_i) / 2: the Lloyd-Max quantizer reduces to the uniform quantizer.

Example
- For the triangular pdf f(x) = 1 - |x| on [-1, 1], design the optimal 2-level midrise quantizer.
- Solution: by symmetry, b_0 = -1, b_1 = 0, b_2 = 1, and
  y_2 = ∫_0^1 x (1 - x) dx / ∫_0^1 (1 - x) dx = (1/6) / (1/2) = 1/3, so y_1 = -1/3.

Lloyd-Max Scalar Quantizer: Finding the Solution
- Given the b_i, we can find the corresponding optimal y_i; given the y_i, we can find the corresponding optimal b_i.
- How to find the optimal b_i and y_i simultaneously? A deadlock:
  - Reconstruction levels depend on decision levels.
  - Decision levels depend on reconstruction levels.
- Solution: an iterative method!
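The centroid in the worked example can be verified by numerical integration. A small Python/SciPy sketch for the triangular pdf f(x) = 1 - |x| on [-1, 1]:

```python
import numpy as np
from scipy.integrate import quad

# Centroid condition: the optimal level for the bin [0, 1] is E[X | 0 <= X < 1]
f = lambda x: 1 - abs(x)
num, _ = quad(lambda x: x * f(x), 0, 1)   # ∫ x f(x) dx = 1/6
den, _ = quad(f, 0, 1)                    # ∫ f(x) dx   = 1/2
y2 = num / den
print(abs(y2 - 1/3) < 1e-9)   # True: y2 = 1/3, matching the slide
```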
Lloyd Algorithm (with known f(x))
If the pdf f(x) is known:
1. Initialize all y_i; let j = 0 and d_0 = ∞ (distortion).
2. Update all decision levels: b_i = (y_i + y_(i+1)) / 2.
3. Update all reconstruction levels: y_i = ∫_{b_(i-1)}^{b_i} x f(x) dx / ∫_{b_(i-1)}^{b_i} f(x) dx.
4. Compute the MSE: d_j = Σ_i ∫_{b_(i-1)}^{b_i} (x - y_i)^2 f(x) dx.
5. If (d_(j-1) - d_j) / d_(j-1) < ε, stop; otherwise set j = j + 1 and go to step 2.

Performance of the Lloyd-Max Scalar Quantizer
- Recall the upper and lower bounds for the D(R) function:
  P 2^(-2R) <= D(R) <= σ^2 2^(-2R), where P = (1 / (2πe)) 2^(2 h(X)) is the entropy power (always <= σ^2).
- Entropy power of common distributions: uniform: (6 / (πe)) σ^2 ≈ 0.7 σ^2; Laplacian: (e / π) σ^2 ≈ 0.865 σ^2; Gaussian: σ^2.

Performance of the Lloyd-Max Scalar Quantizer (cont.)
- Let X take values in [-V, V] with pdf f(x) and variance σ^2. If X is quantized into M bins by the Lloyd-Max quantizer, it can be shown that when M is large, the minimal MSE is
  d ≈ (1 / (12 M^2)) ( ∫_{-V}^{V} f(x)^(1/3) dx )^3.
- Significance: a direct estimate of the quantization error in terms of the pdf and the number of bins.
- Proof can be found in: Panter, Dite, "Quantization Distortion in Pulse-Count Modulation with Nonuniform Spacing of Levels," Proceedings of the IRE, 1951.
- Notes: 1. This is only good for a finite range V. 2. The formula is exact for piecewise-constant pdfs.

Rate-Distortion Performance
- With M = 2^R: d ≈ 2^(-2R) (1/12) ( ∫_{-V}^{V} f(x)^(1/3) dx )^3.
- Example: if X has a uniform distribution in [-V, V], then f(x) = 1/(2V) and
  d ≈ (1/12) ( ∫_{-V}^{V} (2V)^(-1/3) dx )^3 2^(-2R) = (1/12) (2V)^2 2^(-2R) = (V^2 / 3) 2^(-2R) = σ^2 2^(-2R),
  i.e., the Lloyd-Max quantizer reduces to the uniform quantizer.
- Comparing with the bounds: the Lloyd-Max quantizer only achieves the upper bound.
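The five steps above can be sketched directly in Python/SciPy. This is a minimal version that runs a fixed number of iterations instead of the relative-distortion stopping test of step 5 (a simplifying assumption of mine); on the triangular pdf it converges to the levels of the worked example.

```python
import numpy as np
from scipy.integrate import quad

def lloyd_max(f, lo, hi, m, iters=50):
    """Lloyd algorithm for a known pdf f on [lo, hi] with m bins (a sketch)."""
    y = np.linspace(lo, hi, 2 * m + 1)[1::2]   # step 1: levels at bin midpoints
    for _ in range(iters):
        # step 2: decision levels are midpoints of adjacent recon levels
        b = np.concatenate(([lo], (y[:-1] + y[1:]) / 2, [hi]))
        # step 3: recon levels are the centroids of their bins
        y = np.array([quad(lambda x: x * f(x), b[i], b[i + 1])[0] /
                      quad(f, b[i], b[i + 1])[0] for i in range(m)])
    return b, y

f = lambda x: 1 - abs(x)            # triangular pdf on [-1, 1]
b, y = lloyd_max(f, -1.0, 1.0, 2)
print(np.allclose(y, [-1/3, 1/3], atol=1e-6))   # matches the worked example
print(np.allclose(b, [-1, 0, 1], atol=1e-6))
```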
Outline
- Lecture 06 re-cap
- Scalar quantization
- Vector quantization

Vector Quantization
- Encoder: represent a sequence X^n = {X_1, X_2, ..., X_n} by an index f_n(X^n) in {1, 2, ..., 2^(nR)}.
- Decoder: map f_n(X^n) to a reconstruction sequence (codeword) X̂^n.
- Codebook: the collection of all codewords.
- Questions to be addressed:
  - Why is this better than a scalar quantizer?
  - How to generate the codebook?
  - How to find the best codeword for each input block?

VQ Induced from a Scalar Quantizer
- Consider the quantization of two neighboring samples of a source at a bit rate of 3 bits/sample, i.e., 8 quantization bins per sample.
- If uniform scalar quantization is used for each sample, the 2-D sampling space is partitioned into 64 rectangular regions (Voronoi regions).

Deficiencies of the Scalar Quantizer
1. All codewords are distributed in a cube: not efficient for most distributions. The optimal codeword arrangement should depend on the pdf: assign codewords to the typical region.
   - Compare the high-probability region of an AR(1) source with that of an i.i.d. Gaussian: VQ has better performance even for i.i.d. sources.
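The "VQ induced from a scalar quantizer" picture can be reproduced in a few lines of Python/NumPy (the uniform-source setup here is just for illustration): quantizing each of two samples independently at 3 bits yields exactly 8 x 8 = 64 rectangular cells, regardless of the source's joint pdf.

```python
import numpy as np

rng = np.random.default_rng(4)

# Scalar-quantize two neighboring samples independently at 3 bits/sample.
bits, xmax = 3, 1.0
m = 2 ** bits                 # 8 bins per sample
delta = 2 * xmax / m

pairs = rng.uniform(-xmax, xmax, size=(100000, 2))
# Per-sample uniform midrise quantizer index; each pair of indices
# identifies one rectangular 2-D Voronoi cell.
idx = np.clip(np.floor(pairs / delta), -m // 2, m // 2 - 1).astype(int)
cells = set(map(tuple, idx))
print(len(cells))   # 64 distinct cells, one per (index_1, index_2) pair
```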
Deficiencies of the Scalar Quantizer (cont.)
2. The Voronoi regions induced from SQ are always cubic. Cubic region vs. spherical region: given the same volume, the granular error of the sphere is the smallest among the different shapes.
- Given the same volumes (i.e., the same rate R), the MSE of the spherical Voronoi region is the minimum among all shapes:
  lim_{n -> ∞} MSE_cube / MSE_sphere = πe / 6 ≈ 1.42, or about 1.53 dB loss for cubic Voronoi regions (the same for all pdfs).

Linde-Buzo-Gray (LBG) Algorithm
An algorithm to select codewords from a training set; also known as the Generalized Lloyd Algorithm (GLA):
1. Start from an initial set of reconstruction values {Y_i}, i = 1, ..., M, and a set of training vectors {X_n}, n = 1, ..., N.
2. For each training vector X_n, find the reconstruction value that is closest to it: Q(X_n) = Y_j iff d(X_n, Y_j) <= d(X_n, Y_i) for all i != j.
3. Compute the average distortion.
4. If the distortion is small enough, stop. Otherwise, replace each reconstruction value by the average of all vectors in its quantization region, and go to step 2.

Matlab Implementation
% kmeans(): desired rate
R = 8;
[ind, vq_codebook] = kmeans(X, 2^R);
% kd-tree implementation
[kdt.ind, kdt.leafs, kdt.mbox] = buildvisualwordlist(X, 2^R);
[node, prefix_code] = searchvisualwordlist(q, kdt.ind, kdt.leafs);

Summary
- Transforms:
  - A unitary transform preserves energy and angles, and has limited DoF.
  - KLT/PCA: energy compaction and de-correlation.
  - DCT: a good KLT/PCA approximation.
  - A bit of intro to genome info compression; more to come.
- Scalar quantization:
  - If the signal is uniform, what is the expected quantization error?
  - Non-uniform signal distribution: optimal quantization design (Lloyd-Max).
- Vector quantization:
  - More efficient; fast algorithms exist, like kd-tree based search.
  - A special case of transform: over-complete basis, very sparse coefficients (only one non-zero entry).
  - Shall revisit with the coupled-dictionary approach in super resolution.
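Alongside the Matlab `kmeans` call, the LBG/GLA steps can be sketched in Python/NumPy. This is a minimal version (fixed iteration count instead of the step-4 distortion test, random codebook initialization, and the distortion comparison below are my own illustrative choices):

```python
import numpy as np

def lbg(train, m, iters=50):
    """Generalized Lloyd / LBG codebook design (a minimal sketch).

    train: (N, d) training vectors; m: codebook size. Returns (m, d) codebook."""
    rng = np.random.default_rng(0)
    codebook = train[rng.choice(len(train), m, replace=False)]  # step 1
    for _ in range(iters):
        # step 2: nearest-codeword assignment for every training vector
        d2 = ((train[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        nearest = d2.argmin(axis=1)
        # step 4: move each codeword to the centroid of its quantization region
        for j in range(m):
            members = train[nearest == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook

def distortion(train, cb):
    d2 = ((train[:, None, :] - cb[None, :, :]) ** 2).sum(-1)
    return d2.min(axis=1).mean()   # step 3: average distortion

rng = np.random.default_rng(5)
train = rng.standard_normal((2000, 2))
cb = lbg(train, 16)
rand_cb = rng.standard_normal((16, 2))   # an untrained random codebook
print(distortion(train, cb) < distortion(train, rand_cb))   # True
```

With 2000 two-dimensional Gaussian training vectors and a 16-word codebook, the trained codebook gives a clearly lower average distortion than an untrained random one, which is the whole point of step 4's centroid update.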
More informationn α j x j = 0 j=1 has a nontrivial solution. Here A is the n k matrix whose jth column is the vector for all t j=0
MODULE 2 Topcs: Lnear ndependence, bass and dmenson We have seen that f n a set of vectors one vector s a lnear combnaton of the remanng vectors n the set then the span of the set s unchanged f that vector
More information1 Derivation of Point-to-Plane Minimization
1 Dervaton of Pont-to-Plane Mnmzaton Consder the Chen-Medon (pont-to-plane) framework for ICP. Assume we have a collecton of ponts (p, q ) wth normals n. We want to determne the optmal rotaton and translaton
More informationLearning Theory: Lecture Notes
Learnng Theory: Lecture Notes Lecturer: Kamalka Chaudhur Scrbe: Qush Wang October 27, 2012 1 The Agnostc PAC Model Recall that one of the constrants of the PAC model s that the data dstrbuton has to be
More informationECE559VV Project Report
ECE559VV Project Report (Supplementary Notes Loc Xuan Bu I. MAX SUM-RATE SCHEDULING: THE UPLINK CASE We have seen (n the presentaton that, for downlnk (broadcast channels, the strategy maxmzng the sum-rate
More informationCorrelation and Regression. Correlation 9.1. Correlation. Chapter 9
Chapter 9 Correlaton and Regresson 9. Correlaton Correlaton A correlaton s a relatonshp between two varables. The data can be represented b the ordered pars (, ) where s the ndependent (or eplanator) varable,
More informationLecture 3: Shannon s Theorem
CSE 533: Error-Correctng Codes (Autumn 006 Lecture 3: Shannon s Theorem October 9, 006 Lecturer: Venkatesan Guruswam Scrbe: Wdad Machmouch 1 Communcaton Model The communcaton model we are usng conssts
More informationFeature Selection: Part 1
CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?
More informationProbability-Theoretic Junction Trees
Probablty-Theoretc Juncton Trees Payam Pakzad, (wth Venkat Anantharam, EECS Dept, U.C. Berkeley EPFL, ALGO/LMA Semnar 2/2/2004 Margnalzaton Problem Gven an arbtrary functon of many varables, fnd (some
More informationKernels in Support Vector Machines. Based on lectures of Martin Law, University of Michigan
Kernels n Support Vector Machnes Based on lectures of Martn Law, Unversty of Mchgan Non Lnear separable problems AND OR NOT() The XOR problem cannot be solved wth a perceptron. XOR Per Lug Martell - Systems
More informationfind (x): given element x, return the canonical element of the set containing x;
COS 43 Sprng, 009 Dsjont Set Unon Problem: Mantan a collecton of dsjont sets. Two operatons: fnd the set contanng a gven element; unte two sets nto one (destructvely). Approach: Canoncal element method:
More informationQuadratic speedup for unstructured search - Grover s Al-
Quadratc speedup for unstructured search - Grover s Al- CS 94- gorthm /8/07 Sprng 007 Lecture 11 001 Unstructured Search Here s the problem: You are gven a boolean functon f : {1,,} {0,1}, and are promsed
More informationCSE 252C: Computer Vision III
CSE 252C: Computer Vson III Lecturer: Serge Belonge Scrbe: Catherne Wah LECTURE 15 Kernel Machnes 15.1. Kernels We wll study two methods based on a specal knd of functon k(x, y) called a kernel: Kernel
More informationU.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016
U.C. Berkeley CS94: Spectral Methods and Expanders Handout 8 Luca Trevsan February 7, 06 Lecture 8: Spectral Algorthms Wrap-up In whch we talk about even more generalzatons of Cheeger s nequaltes, and
More information: Numerical Analysis Topic 2: Solution of Nonlinear Equations Lectures 5-11:
764: Numercal Analyss Topc : Soluton o Nonlnear Equatons Lectures 5-: UIN Malang Read Chapters 5 and 6 o the tetbook 764_Topc Lecture 5 Soluton o Nonlnear Equatons Root Fndng Problems Dentons Classcaton
More informationprinceton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg
prnceton unv. F 17 cos 521: Advanced Algorthm Desgn Lecture 7: LP Dualty Lecturer: Matt Wenberg Scrbe: LP Dualty s an extremely useful tool for analyzng structural propertes of lnear programs. Whle there
More informationSalmon: Lectures on partial differential equations. Consider the general linear, second-order PDE in the form. ,x 2
Salmon: Lectures on partal dfferental equatons 5. Classfcaton of second-order equatons There are general methods for classfyng hgher-order partal dfferental equatons. One s very general (applyng even to
More informationIntro to Visual Recognition
CS 2770: Computer Vson Intro to Vsual Recognton Prof. Adrana Kovashka Unversty of Pttsburgh February 13, 2018 Plan for today What s recognton? a.k.a. classfcaton, categorzaton Support vector machnes Separable
More informationLINEAR REGRESSION ANALYSIS. MODULE IX Lecture Multicollinearity
LINEAR REGRESSION ANALYSIS MODULE IX Lecture - 31 Multcollnearty Dr. Shalabh Department of Mathematcs and Statstcs Indan Insttute of Technology Kanpur 6. Rdge regresson The OLSE s the best lnear unbased
More informationSupport Vector Machines
CS 2750: Machne Learnng Support Vector Machnes Prof. Adrana Kovashka Unversty of Pttsburgh February 17, 2016 Announcement Homework 2 deadlne s now 2/29 We ll have covered everythng you need today or at
More information14 Lagrange Multipliers
Lagrange Multplers 14 Lagrange Multplers The Method of Lagrange Multplers s a powerful technque for constraned optmzaton. Whle t has applcatons far beyond machne learnng t was orgnally developed to solve
More informationGrover s Algorithm + Quantum Zeno Effect + Vaidman
Grover s Algorthm + Quantum Zeno Effect + Vadman CS 294-2 Bomb 10/12/04 Fall 2004 Lecture 11 Grover s algorthm Recall that Grover s algorthm for searchng over a space of sze wors as follows: consder the
More informationFeb 14: Spatial analysis of data fields
Feb 4: Spatal analyss of data felds Mappng rregularly sampled data onto a regular grd Many analyss technques for geophyscal data requre the data be located at regular ntervals n space and/or tme. hs s
More informationThe exam is closed book, closed notes except your one-page cheat sheet.
CS 89 Fall 206 Introducton to Machne Learnng Fnal Do not open the exam before you are nstructed to do so The exam s closed book, closed notes except your one-page cheat sheet Usage of electronc devces
More informationIRO0140 Advanced space time-frequency signal processing
IRO4 Advanced space tme-frequency sgnal processng Lecture Toomas Ruuben Takng nto account propertes of the sgnals, we can group these as followng: Regular and random sgnals (are all sgnal parameters determned
More information2E Pattern Recognition Solutions to Introduction to Pattern Recognition, Chapter 2: Bayesian pattern classification
E395 - Pattern Recognton Solutons to Introducton to Pattern Recognton, Chapter : Bayesan pattern classfcaton Preface Ths document s a soluton manual for selected exercses from Introducton to Pattern Recognton
More informationLecture 2 Solution of Nonlinear Equations ( Root Finding Problems )
Lecture Soluton o Nonlnear Equatons Root Fndng Problems Dentons Classcaton o Methods Analytcal Solutons Graphcal Methods Numercal Methods Bracketng Methods Open Methods Convergence Notatons Root Fndng
More information