Compression in the Real World


15-853: Algorithms in the Real World
Data Compression: Lectures 1 and 2

Generic File Compression
Files: gzip (LZ77), bzip (Burrows-Wheeler), BOA (PPM)
Archivers: ARC (LZW), PKZip (LZW+)
File systems: NTFS
Communication
Fax: ITU-T Group 3 (run-length + Huffman)
Modems: V.42bis protocol (LZW), MNP5 (run-length + Huffman)
Virtual connections

Compression in the Real World
Multimedia
Images: gif (LZW), jbig (context), jpeg-ls (residual), jpeg (transform + RL + arithmetic)
TV: HDTV (mpeg-4)
Sound: mp3
An example
Other structures
Indexes: google, lycos
Meshes (for graphics): edgebreaker
Graphs
Databases

Compression Outline
Introduction: Lossless vs. lossy, Model and coder, Benchmarks
Information Theory: Entropy, etc.
Probability Coding: Huffman + Arithmetic Coding
Applications of Probability Coding: PPM + others
Lempel-Ziv Algorithms: LZ77, gzip, compress, ...
Other Lossless Algorithms: Burrows-Wheeler
Lossy algorithms for images: JPEG, MPEG, ...
Compressing graphs and meshes: BBK

Encoding/Decoding
We will use "message" in the generic sense to mean the data to be compressed.
Input Message -> Encoder -> Compressed Message -> Decoder -> Output Message
The encoder and decoder need to understand a common compressed format.

Lossless vs. Lossy
Lossless: Input message = Output message
Lossy: Input message != Output message
Lossy does not necessarily mean loss of quality. In fact the output could be better than the input:
Drop random noise in images (dust on lens)
Drop background in music
Fix spelling errors in text; put it into better form
"Writing is the art of lossy text compression."

How much can we compress?
For lossless compression, assuming all input messages are valid, if even one string is compressed, some other must expand.

Model vs. Coder
To compress we need a bias on the probability of messages. The model determines this bias.
Messages -> Model -> Probs. -> Coder -> Bits
Example models:
Simple: character counts, repeated strings
Complex: models of a human face
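The pigeonhole claim above ("if even one string is compressed, some other must expand") can be checked by simple counting. Here is a tiny sketch of my own (not from the lecture) for length-8 inputs.

```python
# There are 2**n bit strings of length n, but only 2**n - 1 bit strings of
# length strictly less than n, so no injective (lossless) code can map every
# length-n input to a shorter output; at least one must stay the same or grow.
n = 8
inputs = 2 ** n
shorter_outputs = sum(2 ** k for k in range(n))  # lengths 0 .. n-1
print(inputs, shorter_outputs)  # 256 255
```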

Quality of Compression
Runtime vs. compression vs. generality
Several standard corpuses are used to compare algorithms, e.g. the Calgary Corpus: 2 books, 5 papers, 1 bibliography, 1 collection of news articles, 3 programs, 1 terminal session, 2 object files, 1 geophysical data set, 1 bitmap b/w image.
The Archive Comparison Test maintains a comparison of just about all algorithms publicly available.

Comparison of Algorithms
The slide tabulates Program, Algorithm, Time, BPC (bits per character), and Score:
RK: LZ + PPM
BOA: PPM Var.
PPMD: PPM
IMP: BW
BZIP: BW
GZIP: LZ77 Var.
LZ77: LZ77 (3.94 BPC)

Compression Outline
Introduction: Lossy vs. Lossless, Benchmarks, ...
Information Theory: Entropy, Conditional Entropy, Entropy of the English Language
Probability Coding: Huffman + Arithmetic Coding
Applications of Probability Coding: PPM + others
Lempel-Ziv Algorithms: LZ77, gzip, compress, ...
Other Lossless Algorithms: Burrows-Wheeler
Lossy algorithms for images: JPEG, MPEG, ...
Compressing graphs and meshes: BBK

Information Theory
An interface between modeling and coding.
Entropy: a measure of information content.
Conditional Entropy: information content based on a context.
Entropy of the English Language: how much information does each character in typical English text contain?

Entropy (Shannon 1948)
For a set of messages S with probabilities p(s), s ∈ S, the self-information of s is:
i(s) = log(1/p(s)) = -log p(s)
Measured in bits if the log is base 2. The lower the probability, the higher the information.
Entropy is the weighted average of self-information:
H(S) = Σ_{s∈S} p(s) log(1/p(s))

Entropy Example
p(S) = {.25, .25, .25, .125, .125}:  H(S) = 3 * .25 * log 4 + 2 * .125 * log 8 = 2.25
p(S) = {.5, .125, .125, .125, .125}:  H(S) = .5 * log 2 + 4 * .125 * log 8 = 2
p(S) = {.75, .0625, .0625, .0625, .0625}:  H(S) = .75 * log(4/3) + 4 * .0625 * log 16 ≈ 1.3

Conditional Entropy
The conditional probability p(s|c) is the probability of s in a context c. The conditional self-information is
i(s|c) = log(1/p(s|c))
The conditional information can be either more or less than the unconditional information.
The conditional entropy is the weighted average of the conditional self-information:
H(S|C) = Σ_{c∈C} p(c) Σ_{s∈S} p(s|c) log(1/p(s|c))

Example of a Markov Chain
Two states w and b with transition probabilities p(w|w) = .9, p(b|w) = .1, p(w|b) = .2, p(b|b) = .8.
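A minimal sketch of my own (not from the slides) that evaluates the entropy formula above and reproduces the three example values, then the conditional entropy of the two-state Markov chain. Weighting the contexts by the chain's stationary distribution is my assumption, since the slide does not say how p(c) is chosen.

```python
from math import log2

def entropy(probs):
    # H(S) = sum over s of p(s) * log2(1/p(s))
    return sum(p * log2(1 / p) for p in probs if p > 0)

print(entropy([.25, .25, .25, .125, .125]))        # 2.25
print(entropy([.5, .125, .125, .125, .125]))       # 2.0
print(entropy([.75, .0625, .0625, .0625, .0625]))  # about 1.3

# Conditional entropy of the chain p(w|w)=.9, p(b|w)=.1, p(w|b)=.2, p(b|b)=.8,
# weighting each context by the stationary distribution p(w)=2/3, p(b)=1/3
# (my assumption for p(c)).
h_sc = (2 / 3) * entropy([.9, .1]) + (1 / 3) * entropy([.2, .8])
print(h_sc)  # about 0.55 bits, below the roughly 0.92-bit unconditional entropy
```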

Entropy of the English Language
How can we measure the information per character?
ASCII code = 7
Entropy = 4.5 (based on character probabilities)
Huffman codes (average) = 4.7
Unix Compress = 3.5
Gzip = 2.6
Bzip = 1.9
Entropy = 1.3 (for "text compression test")
Must be less than 1.3 for the English language.

Shannon's experiment
Asked humans to predict the next character given the whole previous text, and used these as conditional probabilities to estimate the entropy of the English language.
He tabulated the probability that the correct character was found within 1, 2, 3, 4, 5, or more than 5 guesses.
From the experiment he estimated H(English) to be between .6 and 1.3 bits per character.

Compression Outline
Introduction: Lossy vs. Lossless, Benchmarks, ...
Information Theory: Entropy, etc.
Probability Coding: Prefix codes and relationship to Entropy, Huffman codes, Arithmetic codes
Applications of Probability Coding: PPM + others
Lempel-Ziv Algorithms: LZ77, gzip, compress, ...
Other Lossless Algorithms: Burrows-Wheeler
Lossy algorithms for images: JPEG, MPEG, ...
Compressing graphs and meshes: BBK

Assumptions and Definitions
Communication (or a file) is broken up into pieces called messages.
Each message comes from a message set S = {s1, ..., sn} with a probability distribution p(s). Probabilities must sum to 1. The set can be infinite.
Code C(s): a mapping from a message set to codewords, each of which is a string of bits.
Message sequence: a sequence of messages.
Note: adjacent messages might be of different types and come from different probability distributions.

Discrete or Blended
We will consider two types of coding:
Discrete: each message is a fixed set of bits (Huffman coding, Shannon-Fano coding)
Blended: bits can be shared among messages (Arithmetic coding)

Uniquely Decodable Codes
A variable-length code assigns a bit string (codeword) of variable length to every message value,
e.g. a = 1, b = 01, c = 101, d = 011.
What if you get the sequence of bits 1011? Is it aba, ca, or ad?
A uniquely decodable code is a variable-length code in which bit strings can always be uniquely decomposed into its codewords.

Prefix Codes
A prefix code is a variable-length code in which no codeword is a prefix of another word,
e.g. a = 0, b = 110, c = 111, d = 10.
All prefix codes are uniquely decodable.

Prefix Codes: as a tree
A prefix code can be viewed as a binary tree with message values at the leaves and 0s or 1s on the edges:
a = 0, b = 110, c = 111, d = 10.
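As a small illustration of prefix decoding (my own sketch, not from the slides), the following walks a bit string using the prefix code a = 0, b = 110, c = 111, d = 10 from the slide; because no codeword is a prefix of another, greedy matching is unambiguous.

```python
def decode_prefix(bits, code):
    # Invert the codeword table and consume bits greedily; with a prefix code
    # the first codeword that matches is the only one that can match.
    inverse = {w: m for m, w in code.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    assert buf == "", "bit string did not end on a codeword boundary"
    return "".join(out)

code = {"a": "0", "b": "110", "c": "111", "d": "10"}
print(decode_prefix("0110100111", code))  # abdac
```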

Some Prefix Codes for Integers
The slide tabulates, for small n, the binary, unary, and gamma codes.
Many other fixed prefix codes exist: Golomb, phased-binary, subexponential, ...

Average Length
For a code C with associated probabilities p(c), the average length is defined as
l_a(C) = Σ_{c∈C} p(c) l(c)
where l(c) = length of the codeword c (a positive integer).
We say that a prefix code C is optimal if for all prefix codes C', l_a(C) ≤ l_a(C').

Relationship to Entropy
Theorem (lower bound): For any probability distribution p(S) with associated uniquely decodable code C,
H(S) ≤ l_a(C)
Theorem (upper bound): For any probability distribution p(S) with associated optimal prefix code C,
l_a(C) ≤ H(S) + 1

Kraft-McMillan Inequality
Theorem (Kraft-McMillan): For any uniquely decodable code C,
Σ_{c∈C} 2^(-l(c)) ≤ 1
Also, for any set of lengths L such that Σ_{l∈L} 2^(-l) ≤ 1, there is a prefix code C such that l(c_i) = l_i (i = 1, ..., |L|).
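A short sketch (not from the slides) checking the Kraft-McMillan sum and the entropy bounds for the prefix code above; the probabilities are an illustrative choice of mine, picked as exact powers of two so the code meets the lower bound exactly.

```python
from math import log2

def kraft_sum(lengths):
    # Kraft-McMillan: sum of 2^(-l(c)) over all codewords; at most 1 for any
    # uniquely decodable code.
    return sum(2 ** -l for l in lengths)

def average_length(probs, lengths):
    # l_a(C) = sum of p(c) * l(c)
    return sum(p * l for p, l in zip(probs, lengths))

def entropy(probs):
    return sum(p * log2(1 / p) for p in probs if p > 0)

# Codeword lengths of a = 0, d = 10, b = 110, c = 111, with probabilities
# chosen (by me) as exact powers of two.
lengths = [1, 2, 3, 3]
probs = [.5, .25, .125, .125]
print(kraft_sum(lengths))              # 1.0
print(entropy(probs))                  # 1.75
print(average_length(probs, lengths))  # 1.75, so H(S) = l_a(C) here
```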

Proof of the Upper Bound (Part 1)
Assign each message a length: l(s) = ⌈log(1/p(s))⌉
We then have
Σ_s 2^(-l(s)) = Σ_s 2^(-⌈log(1/p(s))⌉) ≤ Σ_s 2^(-log(1/p(s))) = Σ_s p(s) = 1
So by the Kraft-McMillan inequality there is a prefix code with lengths l(s).

Proof of the Upper Bound (Part 2)
Now we can calculate the average length given l(s):
l_a(S) = Σ_s p(s) l(s) = Σ_s p(s) ⌈log(1/p(s))⌉ ≤ Σ_s p(s) (1 + log(1/p(s))) = 1 + Σ_s p(s) log(1/p(s)) = 1 + H(S)
And we are done.

Another property of optimal codes
Theorem: If C is an optimal prefix code for the probabilities {p_1, ..., p_n} then p_i > p_j implies l(c_i) ≤ l(c_j).
Proof (by contradiction): Assume l(c_i) > l(c_j). Consider switching codewords c_i and c_j. If l_a is the average length of the original code, the length of the new code is
l_a' = l_a + p_j (l(c_i) - l(c_j)) + p_i (l(c_j) - l(c_i)) = l_a + (p_j - p_i)(l(c_i) - l(c_j)) < l_a
This is a contradiction, since it implies l_a is not optimal.

Huffman Codes
Invented by Huffman as a class assignment in 1950.
Used in many, if not most, compression algorithms: gzip, bzip, jpeg (as option), fax compression, ...
Properties:
Generates optimal prefix codes
Cheap to generate codes
Cheap to encode and decode
l_a = H if probabilities are powers of 2
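A sketch of mine for the length assignment used in the upper-bound proof above: take l(s) = ⌈log2(1/p(s))⌉, confirm the Kraft sum stays at most 1, and confirm the resulting average length is within one bit of the entropy.

```python
from math import ceil, log2

def shannon_lengths(probs):
    # Length assignment from the proof: l(s) = ceil(log2(1/p(s))).
    return [ceil(log2(1 / p)) for p in probs]

probs = [.75, .0625, .0625, .0625, .0625]
lengths = shannon_lengths(probs)
H = sum(p * log2(1 / p) for p in probs)
l_a = sum(p * l for p, l in zip(probs, lengths))

print(lengths)                             # [1, 4, 4, 4, 4]
print(sum(2 ** -l for l in lengths) <= 1)  # True: Kraft holds, a prefix code exists
print(H <= l_a <= H + 1)                   # True: within one bit of the entropy
```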

Huffman Codes
Huffman Algorithm:
Start with a forest of trees, each consisting of a single vertex corresponding to a message s and with weight p(s).
Repeat until one tree is left:
Select the two trees with minimum weight roots p_1 and p_2.
Join them into a single tree by adding a root with weight p_1 + p_2.

Example
p(a) = .1, p(b) = .2, p(c) = .2, p(d) = .5
Step 1: join a(.1) and b(.2) into a tree of weight .3
Step 2: join (.3) and c(.2) into a tree of weight .5
Step 3: join (.5) and d(.5) into a tree of weight 1.0
Resulting code: a = 000, b = 001, c = 01, d = 1

Encoding and Decoding
Encoding: start at the leaf of the Huffman tree and follow the path to the root; reverse the order of the bits and send.
Decoding: start at the root of the Huffman tree and take a branch for each bit received; when at a leaf, output the message and return to the root.
There are even faster methods that can process 8 or 32 bits at a time.

Huffman codes are optimal
Theorem: The Huffman algorithm generates an optimal prefix code.
Proof outline: induction on the number of messages n. Consider a message set S with n+1 messages.
1. Can make it so the two least probable messages of S are neighbors in the Huffman tree.
2. Replace the two messages with one message with probability p(m_1) + p(m_2), making S'.
3. Show that if S' is optimal, then S is optimal.
4. S' is optimal by induction.
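A compact sketch of mine (not the lecture's code) for the merge loop above, using a heap. It only reports codeword lengths, because the exact 0/1 labels depend on tie-breaking; for the slide's example it reproduces the depths behind a = 000, b = 001, c = 01, d = 1.

```python
import heapq

def huffman_lengths(freqs):
    # Repeatedly merge the two minimum-weight trees; every symbol under the
    # new root gets one bit deeper, so its final depth is its codeword length.
    heap = [(w, i, [s]) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    depth = dict.fromkeys(freqs, 0)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, syms1 = heapq.heappop(heap)
        w2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            depth[s] += 1
        heapq.heappush(heap, (w1 + w2, counter, syms1 + syms2))
        counter += 1
    return depth

print(huffman_lengths({"a": .1, "b": .2, "c": .2, "d": .5}))
# {'a': 3, 'b': 3, 'c': 2, 'd': 1}
```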

Problem with Huffman Coding
Consider a message with probability .999. The self-information of this message is
-log(.999) = .00144
If we were to send 1000 such messages we might hope to use 1000 * .00144 = 1.44 bits.
Using Huffman codes we require at least one bit per message, so we would require 1000 bits.

Arithmetic Coding: Introduction
Allows blending of bits in a message sequence. Only requires 3 bits for the example above.
Can bound the total bits required based on the sum of self-information:
l < 2 + Σ_i s_i
Used in PPM, JPEG/MPEG (as option), DMM, ...
More expensive than Huffman coding, but an integer implementation is not too bad.

Arithmetic Coding: message intervals
Assign each probability distribution to an interval range from 0 (inclusive) to 1 (exclusive), with
f(i) = Σ_{j=1}^{i-1} p(j)
e.g. for p(a) = .2, p(b) = .5, p(c) = .3: f(a) = .0, f(b) = .2, f(c) = .7
The interval for a particular message will be called the message interval (e.g. for b the interval is [.2,.7)).

Arithmetic Coding: sequence intervals
Code a message sequence by composing intervals. For example, coding bac with the distribution above successively narrows [0,1) to [.2,.7), then [.2,.3), then [.27,.3).
The final interval [.27,.3) is called the sequence interval.
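A sketch of mine for composing message intervals as above for the sequence bac, with p(a) = .2, p(b) = .5, p(c) = .3 and f(a) = 0, f(b) = .2, f(c) = .7; each step keeps the bottom l and the size s of the current sequence interval.

```python
p = {"a": .2, "b": .5, "c": .3}
f = {"a": .0, "b": .2, "c": .7}

def sequence_interval(msg):
    # Compose intervals: the next message's interval is placed inside the
    # current one, so l grows by s*f and s shrinks by a factor of p.
    l, s = 0.0, 1.0
    for m in msg:
        l, s = l + s * f[m], s * p[m]
    return l, l + s

print(sequence_interval("bac"))  # approximately (0.27, 0.3)
```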

Arithmetic Coding: sequence intervals
To code a sequence of messages with probabilities p_i (i = 1..n) use the following:
l_1 = f_1, s_1 = p_1
l_i = l_{i-1} + s_{i-1} * f_i  (bottom of interval)
s_i = s_{i-1} * p_i  (size of interval)
Each message narrows the interval by a factor of p_i.
Final interval size: s_n = Π_{i=1}^{n} p_i

Warning
Three types of interval:
message interval: interval for a single message
sequence interval: composition of message intervals
code interval: interval for a specific code used to represent a sequence interval (discussed later)

Uniquely defining an interval
Important property: the sequence intervals for distinct message sequences of length n will never overlap.
Therefore: specifying any number in the final interval uniquely determines the sequence.
Decoding is similar to encoding, but on each step we need to determine what the message value is and then reduce the interval.

Arithmetic Coding: Decoding Example
Decoding the number .49, knowing the message is of length 3 (with p(a) = .2, p(b) = .5, p(c) = .3 as before):
.49 lies in b's interval [.2,.7), then in b's sub-interval of that, then in c's.
The message is bbc.
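A matching decoder sketch (mine): at each step, rescale the received number into the current sequence interval, find which message interval [f, f+p) contains it, output that message, and narrow.

```python
p = {"a": .2, "b": .5, "c": .3}
f = {"a": .0, "b": .2, "c": .7}

def decode(x, n):
    # Rescale x into the current interval, pick the message whose interval
    # contains it, then narrow exactly as the encoder did.
    l, s, out = 0.0, 1.0, []
    for _ in range(n):
        t = (x - l) / s
        m = next(m for m in p if f[m] <= t < f[m] + p[m])
        out.append(m)
        l, s = l + s * f[m], s * p[m]
    return "".join(out)

print(decode(.49, 3))  # bbc, as in the slide's example
```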

Representing Fractions
Binary fractional representation:
.75 = .11
1/3 = .0101...
11/16 = .1011
So how about just using the smallest binary fractional representation in the sequence interval?
e.g. [0,.33) = .01, [.33,.66) = .1, [.66,1) = .11
But what if you receive a 1? Should we wait for another 1?

Representing an Interval
Can view binary fractional numbers as intervals by considering all completions, e.g.
.11  -> min .110... = .75,  max .111... = 1.0,  interval [.750, 1.0)
.101 -> min .1010... = .625, max .1011... = .75, interval [.625, .75)
We will call this the code interval.

Code Intervals: example
[0,.33) = .01, [.33,.66) = .1, [.66,1) = .11
Note that if code intervals overlap then one code is a prefix of the other.
Lemma: If a set of code intervals do not overlap then the corresponding codes form a prefix code.

Selecting the Code Interval
To find a prefix code, find a binary fractional number whose code interval is contained in the sequence interval.
Can use the fraction l + s/2 truncated to ⌈-log(s/2)⌉ = 1 + ⌈-log s⌉ bits.
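A tiny sketch of mine for the code interval of a bit string, obtained by considering all completions as above; it reproduces the two rows of the slide's table.

```python
def code_interval(bits):
    # The minimum completion appends 0s forever and the maximum appends 1s, so
    # the code interval is [value(bits), value(bits) + 2**-len(bits)).
    low = sum(int(b) * 2 ** -(i + 1) for i, b in enumerate(bits))
    return low, low + 2 ** -len(bits)

print(code_interval("11"))   # (0.75, 1.0)
print(code_interval("101"))  # (0.625, 0.75)
```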

Selecting a code interval: example
[0,.33) = .001, [.33,.66) = .100, [.66,1) = .110
e.g. for [.33,.66): l = .33, s = .33
l + s/2 = .5 = .1000... truncated to 1 + ⌈-log s⌉ = 1 + ⌈-log(.33)⌉ = 3 bits is .100
Is this the best we can do for [0,.33)?

RealArith Encoding and Decoding
RealArithEncode: determine l and s using the original recurrences; code using l + s/2 truncated to 1 + ⌈-log s⌉ bits.
RealArithDecode: read bits as needed so the code interval falls within a message interval, and then narrow the sequence interval. Repeat until n messages have been decoded.

Bound on Length
Theorem: For n messages with self-information {s_1, ..., s_n}, RealArithEncode will generate at most 2 + Σ_{i=1}^{n} s_i bits.
Proof (with s = Π_i p_i the final interval size):
1 + ⌈-log s⌉ = 1 + ⌈-log(Π_{i=1}^{n} p_i)⌉ = 1 + ⌈Σ_{i=1}^{n} (-log p_i)⌉ = 1 + ⌈Σ_{i=1}^{n} s_i⌉ < 2 + Σ_{i=1}^{n} s_i

Integer Arithmetic Coding
The problem with RealArithCode is that operations on arbitrary-precision real numbers are expensive.
Key ideas of the integer version:
Keep integers in the range [0..R) where R = 2^k
Use rounding to generate an integer sequence interval
Whenever the sequence interval falls into the top, bottom or middle half, expand the interval by a factor of 2
This integer algorithm is an approximation of the real algorithm.
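A sketch of mine for the code-selection rule above: take l + s/2 and truncate it to 1 + ⌈-log2 s⌉ bits. With exact thirds it reproduces the .100 of the example; the second call picks a code for the [.27,.3) sequence interval of bac.

```python
from math import ceil, log2

def select_code(l, s):
    # Truncate l + s/2 to 1 + ceil(-log2(s)) bits; the resulting code interval
    # is narrow enough to sit inside the sequence interval [l, l + s).
    nbits = 1 + ceil(-log2(s))
    x, bits = l + s / 2, ""
    for _ in range(nbits):
        x *= 2
        bits += str(int(x))
        x -= int(x)
    return bits

print(select_code(1 / 3, 1 / 3))  # 100
print(select_code(.27, .03))      # 0100100
```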

Integer Arithmetic Coding
The probability distribution as integers, with R = 256:
Probabilities as counts, e.g. c(a) = 11, c(b) = 7, c(c) = 30
T is the sum of counts, e.g. T = 48 (11 + 7 + 30)
Partial sums f as before, e.g. f(a) = 0, f(b) = 11, f(c) = 18
Require that R > 4T so that probabilities do not get rounded to zero.

Integer Arithmetic (contracting)
l_0 = 0, s_0 = R
u_i = l_{i-1} + ⌊s_{i-1} (f_i + c_i) / T⌋ - 1
l_i = l_{i-1} + ⌊s_{i-1} f_i / T⌋
s_i = u_i - l_i + 1

Integer Arithmetic (scaling)
If l ≥ R/2 then (in top half):
Output 1 followed by m 0s; m = 0
Scale message interval by expanding by 2
If u < R/2 then (in bottom half):
Output 0 followed by m 1s; m = 0
Scale message interval by expanding by 2
If l ≥ R/4 and u < 3R/4 then (in middle half):
Increment m
Scale message interval by expanding by 2
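The slide gives only the three scaling rules; here is a sketch of mine for just that renormalization step on an integer interval [l, u] with R a power of two, where m counts the deferred middle-half expansions. It is not a complete coder.

```python
def renormalize(l, u, R, m, bits):
    # Expand the interval by 2 whenever it lies in the bottom, top, or middle
    # half, emitting (or deferring) one bit per expansion.
    while True:
        if u < R // 2:
            # Bottom half: output 0 followed by m 1s.
            bits.append(0)
            bits.extend([1] * m)
            m = 0
            l, u = 2 * l, 2 * u + 1
        elif l >= R // 2:
            # Top half: output 1 followed by m 0s.
            bits.append(1)
            bits.extend([0] * m)
            m = 0
            l, u = 2 * (l - R // 2), 2 * (u - R // 2) + 1
        elif l >= R // 4 and u < 3 * R // 4:
            # Middle half: defer the bit by incrementing m.
            m += 1
            l, u = 2 * (l - R // 4), 2 * (u - R // 4) + 1
        else:
            return l, u, m

bits = []
print(renormalize(140, 180, 256, 0, bits), bits)  # (48, 211, 0) [1, 0]
```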
