Flexible Quantization
1 Flexible Quantization. Bastiaan Kleijn, KTH School of Electrical Engineering, Stockholm. (wb 06/02/21)
2 Overview: motivation for coding technologies; basic quantization and coding; high-rate quantization theory.
3 Digital Representations. A digital representation of a signal is a sequence of samples with finite precision. It is robust against distortion and facilitates processing. Basic rates for audio signals: 48 kHz audio, 16 bits, stereo: 1,536,000 bits/second; 8 kHz speech, 16 bits: 128,000 bits/second.
4 High Rate is Expensive. Transmission: wired links (the last mile), packet networks (switching video), wireless links (WiFi; mobile telephony, where coding was the enabling technology), secure communication. Storage: portable audio/video players, output of surveillance cameras.
5 High Quality Now Less Natural. Conventional circuit-switched networks: virtually no bit errors, no loss. Mobile networks: reasonable cost and delay implies bit errors. Packet networks: reasonable cost and delay implies packet loss.
6 Networks More Diverse. How it was: a single-paradigm network end-to-end, one service. How it is: many paradigms in one composite network (circuit-switched, packet, wireless circuit-switched, wireless packet) and many types of service: a range of quality-cost trade-offs, streaming, one-on-one communication.
7 How We Designed Coders. A new application (particular network, storage) appeared; study the application requirements; design a coder for those requirements; hold a competition between coder designs under the conditions of the application; select the best coder. No vision of an integrated network.
8 More On Design Conditions. Attributes of a coder: rate; quality (subjective), including signal bandwidth; delay; robustness to bit errors and packet loss; computational complexity. Designs were selected for one configuration of attributes, associated with one network paradigm. Design effort was irrelevant.
9 Adapting to the New Environment. Implications of old-school design in the new world: coders implicitly unable to adapt (codebooks); transcoding; performance unclear when applied to other conditions (e.g., the GSM coder applied to packet networks). New-school design goal: coders that can adapt in real time to network conditions and quality requirements, near-optimal over a large range of conditions; employ high-rate quantization theory and more modeling.
10 Overview: motivation for coding technologies; basic quantization and coding; high-rate quantization theory.
11 Quantization. Quantization is a non-invertible mapping from Euclidean space R^k to a countable set of points C = {c_i} that is a subset of R^k. Quantization cell: V_i = {x \in R^k : Q(x) = c_i}. "Inverse quantization" is a misnomer.
12 Example: Scalar Quantizer (staircase plot of Q(x) versus x).
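A minimal numeric sketch of such a staircase quantizer (the step size 0.5 is an arbitrary choice, not from the slides):

```python
def quantize(x, step=0.5):
    """Uniform mid-tread scalar quantizer: map x to the centroid of its cell."""
    return step * round(x / step)

# Every x in the same cell maps to the same centroid: Q is non-invertible.
print(quantize(0.6), quantize(0.7), quantize(1.4))  # 0.5 0.5 1.5
```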
13 Example: Vector Quantizer (figure).
14 Quantization Cells and Centroids. V_i = {x : Q(x) = c_i} is a cell, usually assumed convex (regular quantizers); the cell is a Voronoi region. The quantization index i specifies the cell and the reconstruction point c_i (often called the centroid). If the set of indices {i} is countable, the quantization index can generally be transmitted with a finite number of bits. (Diagram: x → encoder → network → decoder → c_i, where i is the index with x \in V_i.)
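A sketch of the index/centroid round trip for a toy 2-D codebook (the codebook values are hypothetical):

```python
def encode(x, codebook):
    """Return the index i of the Voronoi cell (nearest centroid) containing x."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(x, codebook[i])))

def decode(i, codebook):
    """'Inverse quantization' merely looks up the centroid; x itself is lost."""
    return codebook[i]

codebook = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
i = encode((0.9, 0.2), codebook)
print(i, decode(i, codebook))  # 1 (1.0, 0.0)
```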
15 Example: Vector Quantizer (figure).
16 Coding Principles. Is it smart to simply transmit the index? NO! (It is if the index probability is uniform.) Apply lossless (entropy) coding to the indices, as used to create .zip files. (Diagram: x → encoder → lossless encoder → w_i → network → lossless decoder → decoder → c_i.)
17 Minimum (Bit) Rate of Index. Code: the set of all codewords {w_i}. A uniquely decodable code can always be reconstructed. The approximate minimum codeword length for a uniquely decodable code is l(w_i) = -\log_2 p(i) (follows from the Kraft inequality). The entropy of the index,
H(I) = -\sum_i p_I(i) \log_2 p_I(i),
is approximately the minimum average rate needed for the index; more accurately, H(I) \le L < H(I) + 1.
18 Example. Index resembles fair coin flips: H(I) = -0.5 \log_2 0.5 - 0.5 \log_2 0.5 = 1 bit. Index resembles biased coin flips: H(I) = -0.25 \log_2 0.25 - 0.75 \log_2 0.75 \approx 0.811 bits; H(I) = -0.05 \log_2 0.05 - 0.95 \log_2 0.95 \approx 0.286 bits.
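These three numbers follow directly from the entropy formula; a quick check:

```python
import math

def entropy_bits(p):
    """H(I) = -sum_i p(i) log2 p(i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(round(entropy_bits([0.5, 0.5]), 3))    # 1.0
print(round(entropy_bits([0.25, 0.75]), 3))  # 0.811
print(round(entropy_bits([0.05, 0.95]), 3))  # 0.286
```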
19 Now Back to Quantization. To quantize, we need to know with respect to what. (1) Optimal trade-off of distortion versus number of indices: constrained-resolution. Assumes the codeword length is fixed; generally short delay; consistent with TDMA and FDMA, circuit-switched networks. The past. (2) Optimal trade-off of distortion versus average rate: constrained-entropy. Assumes only the average codeword length matters; often long delay; consistent with CDMA and packet-switched networks. The future!
20 Old-School, Any-Rate Quantization. Standard approach: constrained-resolution; stored codebooks; codebooks trained with data; (Generalized) Lloyd algorithm (GLA), Bell Labs, 1958, also known as the k-means algorithm.
21 Old-School, Any-Rate Quantization is constrained-resolution. (Diagram: x → encoder → network → decoder → c_i.)
22 Lloyd Algorithm. Note: encoder = partition = {Voronoi regions}; decoder = codebook = {centroids}. Lloyd algorithm: start with an initial encoder and decoder; optimize the encoder; optimize the decoder; repeat until done; output the final encoder and decoder. Optimize = minimize the mean distortion E[\min_i d(X, c_i)]. The result is locally optimal.
23 Outcome of Lloyd for a Vector Quantizer (figure).
24 Practical (Discrete) Lloyd Algorithm. Have a database {x_m}, m = 1, \dots, M. Encoder = partition = {V_j} with \cup_j V_j = {x_m}; decoder = codebook C = {c_j}. Iterate: optimize {V_j} given C; optimize C given {V_j}; stop when done. Optimize = minimize the overall distortion \sum_{m=1}^{M} \min_i d(x_m, c_i).
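A minimal sketch of this discrete Lloyd iteration on scalar data (the toy database and initial codebook are made up for illustration):

```python
def lloyd(data, codebook, iters=20):
    """Alternate the two optimizations: nearest-centroid partition, then cell means."""
    for _ in range(iters):
        # Optimize encoder: assign each sample to its nearest centroid (Voronoi partition).
        cells = [[] for _ in codebook]
        for x in data:
            j = min(range(len(codebook)), key=lambda i: (x - codebook[i]) ** 2)
            cells[j].append(x)
        # Optimize decoder: each centroid becomes the mean of its cell (squared error).
        codebook = [sum(c) / len(c) if c else codebook[j] for j, c in enumerate(cells)]
    return codebook

data = [0.9, 1.1, 1.0, 4.9, 5.1, 5.0]
print(sorted(lloyd(data, [0.0, 6.0])))  # two centroids, near 1.0 and 5.0
```

Each iteration can only decrease the overall distortion, so the procedure converges, but only to a local optimum that depends on the initial codebook.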
25 Old-School, Any-Rate Quantization. It is in your cell phone. Constrained resolution (fixed number of cells/centroids). Works even at low rates. Locally optimal; distortion decreases each step. Training is computationally expensive: not in real time, iterative solutions only. Many variants: multi-stage, tree, constrained-entropy version (around 1990).
26 High-Rate Quantization. Assume the data density is (approximately) constant within a cell (Bennett, 1948). Assume the notion of a density of centroids is meaningful. Problem formulation: given the data density, a distortion criterion, and a constraint, find the centroid density (the "quantizer"). Advantage of the approach: the optimal quantizer can be computed analytically, and this can be done in real time.
27 Distortion and Geometry: SQ. Distortion of cell i under the squared-error criterion:
D_i = \int_{V_i} f_X(x)\, d(x, Q(x))\, dx \approx f_X(c_i) \int_{V_i} (x - c_i)^2\, dx.
For a scalar cell of width \Delta centered on c_i,
\frac{1}{\Delta} \int_{-\Delta/2}^{\Delta/2} x^2\, dx = \frac{\Delta^2}{12}.
Scalar = cubic geometry.
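The \Delta^2/12 cell distortion can be checked numerically with a midpoint-rule integral (the value \Delta = 0.4 is arbitrary):

```python
delta = 0.4
n = 100_000
# Mean squared error over a centered cell of width delta:
# (1/delta) * integral of x^2 over [-delta/2, delta/2], midpoint rule.
mean_sq = sum((-delta / 2 + (i + 0.5) * delta / n) ** 2 for i in range(n)) / n
print(mean_sq, delta ** 2 / 12)  # both ~ 0.013333
```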
28 Distortion and Geometry: VQ. Mean distortion in cell i, r-th power criterion, per dimension (k dimensions):
D_i = \frac{\int_{V_i} f_X(x)\, d(x, Q(x))\, dx}{\int_{V_i} f_X(x)\, dx} \approx \frac{1}{k\, V_i} \int_{V_i} \|x - c_i\|^r\, dx = V_i^{r/k}\, C(k, r, G(i)),
where C(k, r, G(i)) is the inertial profile, the coefficient of quantization.
29 Quantization and Cell Geometry. Scalar case: cubic cells, C(k=1, r=2, G=optimal) = 1/12. Two dimensions: hexagonal cells tile the plane, C(k=2, r=2, G=optimal) = 5/(36\sqrt{3}) \approx 0.0802. As k \to \infty: spherical cells, C(k=\infty, r=2, G=optimal) = 1/(2\pi e) \approx 0.0585. VQ has a space-filling advantage: 1.53 dB (= 0.25 bit).
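The 1.53 dB figure is simply the ratio of the cubic and spherical coefficients expressed in decibels; at r = 2, each bit of rate is worth about 6.02 dB, which gives the 0.25-bit equivalent:

```python
import math

# Space-filling advantage of VQ over SQ under squared error:
# C(k=1, r=2) = 1/12 (cube) versus C(k -> inf, r=2) = 1/(2*pi*e) (sphere).
advantage_db = 10 * math.log10((1 / 12) / (1 / (2 * math.pi * math.e)))
advantage_bits = advantage_db / (20 * math.log10(2))  # 6.02 dB per bit at r = 2
print(round(advantage_db, 2), round(advantage_bits, 2))  # 1.53 0.25
```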
30 CE Quantizers in 2D. Two dimensions: square and hexagonal lattices (figure).
31 High-Rate Quantization. What we have done: related local geometry to local distortion. Next step: relate distortion, rate, and centroid density (and local geometry). Centroid density g(x): number of centroids per unit volume.
32 Reminder: Constrained-Entropy Coding. Apply lossless (entropy) coding to the indices, as used to create .zip files. The rate is the mean rate of the codewords. Consistent with CDMA, packet networks: the future. (Diagram: x → encoder → lossless encoder → w_i → network → lossless decoder → decoder → c_i.)
33 Constrained-Entropy Quantization I. Constraint on the index entropy (using V_i \approx 1/g(c_i)):
H(I) = -\sum_i p_I(i) \log p_I(i) \approx -\sum_i V_i f_X(c_i) \log(V_i f_X(c_i)) \approx -\int f_X(x) \log \frac{f_X(x)}{g(x)}\, dx = h(X) + \int f_X(x) \log g(x)\, dx.
Equivalent constraint: \int f_X(x) \log g(x)\, dx = \text{constant}.
34 Constrained-Entropy Quantization II. Distortion:
D = \sum_i p_I(i)\, D_i = \sum_i p_I(i)\, V_i^{r/k}\, C(k, r, G(i)) \approx C(k, r, G) \int f_X(x)\, g(x)^{-r/k}\, dx.
Add a Lagrange-multiplier term:
C(k, r, G) \int \left[ f_X(x)\, g(x)^{-r/k} + \lambda f_X(x) \log g(x) \right] dx.
Minimize; the Euler-Lagrange equation gives
-\frac{r}{k}\, C(k, r, G)\, f_X(x)\, g(x)^{-r/k-1} + \lambda\, \frac{f_X(x)}{g(x)} = 0 \;\Rightarrow\; g(x) = \text{constant}!
35 Moral of the Story. For constrained-entropy quantization the simplest quantizer is best: all cells have the same size and shape (the shape part is not proven here). This facilitates a low-computational-complexity quantizer, which can be computed for a given pdf and distortion. It does not mean the entire encoder is low complexity! Somewhat non-intuitive: an infinite number of cells/centroids, and the cell size is independent of the data density.
36 CE Quantizers in 2D. Two dimensions: square and hexagonal lattices (figure).
37 Constrained-Entropy Quantization IIa. Complete solution:
g(x) = \exp\!\big(H(I) - h(X)\big).
At a given distortion level the optimal centroid density increases with the mean index rate and decreases with the differential entropy of the data (= complexity of the data). The coder can be adjusted in real time!
38 Distortion-Rate Relation. Relation between distortion and rate (per dimension):
D = \sum_i p_I(i)\, D_i \approx C(k, r, G) \int f_X(x)\, g(x)^{-r/k}\, dx = C(k, r, G) \exp\!\left(-\frac{r}{k}\big(H(I) - h(X)\big)\right).
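A Monte Carlo sanity check of this relation for k = 1, r = 2, where C(1, 2, G) = 1/12: quantize unit-Gaussian samples with a uniform quantizer, measure the empirical index entropy and distortion, and compare with the formula (the step size, sample count, and seed are arbitrary choices):

```python
import math
import random
from collections import Counter

random.seed(0)
delta, n = 0.1, 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
idx = [round(x / delta) for x in xs]

# Empirical distortion and empirical index entropy (in bits).
D_emp = sum((x - delta * i) ** 2 for x, i in zip(xs, idx)) / n
counts = Counter(idx)
H_emp = -sum(c / n * math.log2(c / n) for c in counts.values())

# High-rate prediction: D = (1/12) * 2^{-2 (H(I) - h(X))} for k = 1, r = 2.
h = 0.5 * math.log2(2 * math.pi * math.e)  # differential entropy of N(0,1), bits
D_pred = (1 / 12) * 2 ** (-2 * (H_emp - h))
print(D_emp, D_pred)  # both close to delta**2 / 12 = 8.33e-4
```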
39 The Vector-Quantization Advantage. Divide the distortions of SQ and VQ at the same per-dimension rate (Gray & Lookabaugh, 1989):
\frac{D_{SQ}}{D_{VQ}} = \frac{C(1, r, G)}{C(k, r, G)} \exp\!\left( r \left( h(X_1) - \frac{1}{k} h(X) \right) \right).
Space-filling advantage: C(1, r, G) / C(k, r, G). Memory advantage (due to redundancy): \exp(r\rho) with \rho = h(X_1) - \frac{1}{k} h(X).
40 Constrained-Entropy Quantization is Easy. A uniform quantizer is simple to implement; there is a small additional advantage from using the best lattice (somewhat more complicated). The lossless coding is not easy: it does not even exist in old-school quantization, and one must know the data density.
41 Some Notes on Lossless Coding. Lossless coding tries to reduce the rate to the index entropy. Huffman code: a table of codewords w_i based on the probability distribution; works on a per-variable basis, so high overhead; simple to implement. Arithmetic code: computes codewords for a sequence of coefficients; tricky to program; low overhead; requires the cumulative mass function (cmf), which is often nontrivial to obtain; the preferred method.
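A compact sketch of Huffman codeword-length construction: repeatedly merge the two least probable nodes, adding one bit to every symbol inside the merged nodes (the dyadic distribution is an illustrative choice for which the average length equals the entropy exactly):

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given index probabilities."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, ids1 = heapq.heappop(heap)  # two least probable nodes
        p2, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:           # one more bit for every symbol in the merge
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, ids1 + ids2))
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
print(huffman_lengths(probs))  # [1, 2, 3, 3]; average length 1.75 bits = H(I)
```

Because every symbol gets a whole number of bits, Huffman coding can waste up to a bit per symbol on non-dyadic distributions; coding sequences with an arithmetic coder avoids that per-variable overhead, as the slide notes.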
42 Practical High-Rate CE Coding. No significant commercial implementations as yet. The quantizer and arithmetic coder are computed, hence flexible. (Diagram: x → estimate pdf → high-rate encoder → arithmetic encoder → codeword bits → network → arithmetic decoder → decoder → c_i.)
43 Example PDF Estimation. Difficult; simplify the problem. Model the density as a mixture: p_X(x) = \sum_m p_M(m)\, p_{X|M}(x \mid m). Interpretation: the data fall in one of a set of probability models. Each mixture component is Gaussian (usually); we know how to design a quantizer for a Gaussian, and by symmetry just one design procedure is needed for the cmf computation. Encode which component you select, then use the corresponding quantizer.
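A sketch of the mixture-density evaluation for scalar data (the component weights, means, and standard deviations are hypothetical, not from the slides):

```python
import math

def gmm_pdf(x, weights, means, stds):
    """p_X(x) = sum_m p_M(m) * N(x; mu_m, sigma_m^2) for a scalar Gaussian mixture."""
    return sum(w * math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, mu, s in zip(weights, means, stds))

weights, means, stds = [0.3, 0.7], [-2.0, 1.0], [0.5, 1.0]
print(gmm_pdf(-2.0, weights, means, stds))  # density is dominated by the first component here
```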
44 Gaussian mixture with four components (figure).
45 High-Rate Quantization. Not yet widely applied; real-time adaptation not used. Constrained entropy (a constraint on the average rate).
46 What Have We Learned. Problem: we have audio or video data (transformed or not) and need to encode it efficiently. Old-school solution: good performance but not flexible; constrained resolution; codebook (often computationally expensive); commonly used. New-world approach: good performance and can adapt in real time; constrained entropy, which requires a lossless coder (arithmetic coder); quantizer and arithmetic coder are computed = flexible; not yet ready.
47 Quantization Conclusions. The emphasis was on performance; the emphasis now is on flexibility (but with no loss of performance).
Introducton to Informaton Theory, Data Compresson, Codng Mehd Ibm Brahm, Laura Mnkova Aprl 5, 208 Ths s the augmented transcrpt of a lecture gven by Luc Devroye on the 3th of March 208 for a Data Structures
More informationClustering gene expression data & the EM algorithm
CG, Fall 2011-12 Clusterng gene expresson data & the EM algorthm CG 08 Ron Shamr 1 How Gene Expresson Data Looks Entres of the Raw Data matrx: Rato values Absolute values Row = gene s expresson pattern
More information9.913 Pattern Recognition for Vision. Class IV Part I Bayesian Decision Theory Yuri Ivanov
9.93 Class IV Part I Bayesan Decson Theory Yur Ivanov TOC Roadmap to Machne Learnng Bayesan Decson Makng Mnmum Error Rate Decsons Mnmum Rsk Decsons Mnmax Crteron Operatng Characterstcs Notaton x - scalar
More informationEdge Isoperimetric Inequalities
November 7, 2005 Ross M. Rchardson Edge Isopermetrc Inequaltes 1 Four Questons Recall that n the last lecture we looked at the problem of sopermetrc nequaltes n the hypercube, Q n. Our noton of boundary
More informationProbability-Theoretic Junction Trees
Probablty-Theoretc Juncton Trees Payam Pakzad, (wth Venkat Anantharam, EECS Dept, U.C. Berkeley EPFL, ALGO/LMA Semnar 2/2/2004 Margnalzaton Problem Gven an arbtrary functon of many varables, fnd (some
More informationLecture 12: Classification
Lecture : Classfcaton g Dscrmnant functons g The optmal Bayes classfer g Quadratc classfers g Eucldean and Mahalanobs metrcs g K Nearest Neghbor Classfers Intellgent Sensor Systems Rcardo Guterrez-Osuna
More informationAn Upper Bound on SINR Threshold for Call Admission Control in Multiple-Class CDMA Systems with Imperfect Power-Control
An Upper Bound on SINR Threshold for Call Admsson Control n Multple-Class CDMA Systems wth Imperfect ower-control Mahmoud El-Sayes MacDonald, Dettwler and Assocates td. (MDA) Toronto, Canada melsayes@hotmal.com
More informationAn Improved multiple fractal algorithm
Advanced Scence and Technology Letters Vol.31 (MulGraB 213), pp.184-188 http://dx.do.org/1.1427/astl.213.31.41 An Improved multple fractal algorthm Yun Ln, Xaochu Xu, Jnfeng Pang College of Informaton
More informationLec 12 Rate-Distortion Optimization (RDO) in Video Coding-II
Sprng 07: Multmeda Communcaton Lec ate-dstorton Optmzaton (DO) n Vdeo Codng-II Zhu L Course Web: http://l.web.umkc.edu/lzhu/ Z. L Multmeda Communcaton, Sprng 07 p. Outlne Lec ecap Lagrangan Method HW-3
More informationDynamic Programming. Preview. Dynamic Programming. Dynamic Programming. Dynamic Programming (Example: Fibonacci Sequence)
/24/27 Prevew Fbonacc Sequence Longest Common Subsequence Dynamc programmng s a method for solvng complex problems by breakng them down nto smpler sub-problems. It s applcable to problems exhbtng the propertes
More informationThe Feynman path integral
The Feynman path ntegral Aprl 3, 205 Hesenberg and Schrödnger pctures The Schrödnger wave functon places the tme dependence of a physcal system n the state, ψ, t, where the state s a vector n Hlbert space
More informationQuadratic speedup for unstructured search - Grover s Al-
Quadratc speedup for unstructured search - Grover s Al- CS 94- gorthm /8/07 Sprng 007 Lecture 11 001 Unstructured Search Here s the problem: You are gven a boolean functon f : {1,,} {0,1}, and are promsed
More informationIV. Performance Optimization
IV. Performance Optmzaton A. Steepest descent algorthm defnton how to set up bounds on learnng rate mnmzaton n a lne (varyng learnng rate) momentum learnng examples B. Newton s method defnton Gauss-Newton
More informationGrover s Algorithm + Quantum Zeno Effect + Vaidman
Grover s Algorthm + Quantum Zeno Effect + Vadman CS 294-2 Bomb 10/12/04 Fall 2004 Lecture 11 Grover s algorthm Recall that Grover s algorthm for searchng over a space of sze wors as follows: consder the
More informationProbability Theory. The nth coefficient of the Taylor series of f(k), expanded around k = 0, gives the nth moment of x as ( ik) n n!
8333: Statstcal Mechancs I Problem Set # 3 Solutons Fall 3 Characterstc Functons: Probablty Theory The characterstc functon s defned by fk ep k = ep kpd The nth coeffcent of the Taylor seres of fk epanded
More informationFor now, let us focus on a specific model of neurons. These are simplified from reality but can achieve remarkable results.
Neural Networks : Dervaton compled by Alvn Wan from Professor Jtendra Malk s lecture Ths type of computaton s called deep learnng and s the most popular method for many problems, such as computer vson
More information3.1 Expectation of Functions of Several Random Variables. )' be a k-dimensional discrete or continuous random vector, with joint PMF p (, E X E X1 E X
Statstcs 1: Probablty Theory II 37 3 EPECTATION OF SEVERAL RANDOM VARIABLES As n Probablty Theory I, the nterest n most stuatons les not on the actual dstrbuton of a random vector, but rather on a number
More informationPhysics 607 Exam 1. ( ) = 1, Γ( z +1) = zγ( z) x n e x2 dx = 1. e x2
Physcs 607 Exam 1 Please be well-organzed, and show all sgnfcant steps clearly n all problems. You are graded on your wor, so please do not just wrte down answers wth no explanaton! Do all your wor on
More informationAutomatic Object Trajectory- Based Motion Recognition Using Gaussian Mixture Models
Automatc Object Trajectory- Based Moton Recognton Usng Gaussan Mxture Models Fasal I. Bashr, Ashfaq A. Khokhar, Dan Schonfeld Electrcal and Computer Engneerng, Unversty of Illnos at Chcago. Chcago, IL,
More informationPHYS 705: Classical Mechanics. Calculus of Variations II
1 PHYS 705: Classcal Mechancs Calculus of Varatons II 2 Calculus of Varatons: Generalzaton (no constrant yet) Suppose now that F depends on several dependent varables : We need to fnd such that has a statonary
More information