Signal Compression: Entropy Coding

Entropy coding is also known as zero-error coding, data compression, or lossless compression. Entropy coding is widely used in virtually all popular international multimedia compression standards, such as JPEG and MPEG. A complete entropy codec, which is an encoder/decoder pair, consists of the process of encoding or compressing a random source (typically quantized transform coefficients) and the process of decoding or decompressing the compressed signal to perfectly regenerate the original random source. In other words, there is no loss of information due to the process of entropy coding:

  Random Source → Entropy Encoding → Compressed Source
  Compressed Source → Entropy Decoding → Random Source

Thus, entropy coding does not introduce any distortion, and hence the combination of the entropy encoder and the entropy decoder faithfully reconstructs the input to the entropy encoder.

Therefore, any possible loss of information or distortion that may be introduced in a signal compression system is not due to entropy encoding/decoding. As we discussed previously, a typical image compression system, for example, includes a transform process, a quantization process, and an entropy coding stage. In such a system, the distortion is introduced by quantization. Moreover, for such a system, and from the perspective of the entropy encoder, the input random source to that encoder is the quantized transform coefficients:

  Random Source → Transform (e.g., KLT, DCT, wavelets) → Transform Coefficients → Quantization → Quantized Coefficients → Entropy Coding (e.g., Huffman, arithmetic) → Compressed Source

Code Design and Notations
In general, entropy coding (or source coding) is achieved by designing a code, C, which provides a one-to-one mapping from any possible outcome of a random variable X (the source) to a codeword. There are two alphabets in this case: one alphabet is the traditional alphabet of the random source X, and the second alphabet, B, is the one that is used for constructing the codewords. Based on this second alphabet, we can construct and define the set D*, which is the set of all finite-length strings of symbols drawn from the alphabet B.

The most common and popular codes are binary codes, where the alphabet of the codewords is simply the binary bits one and zero. Binary codes can be represented efficiently using binary trees. In this case, the first two branches of the root node represent the possible bit assigned to the first bit of a codeword. Once that first bit is known, and if the codeword has a second bit, then the second pair of branches represents the second bit, and so on.

For example, a random source X may have the alphabet A = {a, b, c, ...}, while the alphabet of code symbols is B = {0, 1}; the set of codewords is then a subset of D*, the finite-length binary strings. In this example, |B| = D = 2.

[Figure: binary tree representation of a binary (D-ary; D = 2) prefix code, showing the set of codewords as nodes of the tree and the alphabet of code symbols used to construct the codewords.]

Definition
A source code, C, is a mapping from a random variable (source) X with alphabet A to a finite-length string of symbols, where each string of symbols (codeword) is a member of the set D*:

  C : A → D*

Example
Let X be a random source with x ∈ {1, 2, 3, 4}. Let B = {0, 1}, and hence D = 2. Then:

  D* = {0, 1, 00, 01, 10, 11, 000, 001, ...}

The codewords in D* are formed from an alphabet B that has D elements: |B| = D. We say that we have a D-ary code, or that B is a D-ary alphabet. As discussed previously, the most common case is when the alphabet B is the set {0, 1}; in this case D = 2 and we have binary codewords.

We can define the code C as follows (the codewords shown are one possible choice):

  Outcome    Codeword        Length
  x = 1      C(1) = 0        l_1 = 1
  x = 2      C(2) = 10       l_2 = 2
  x = 3      C(3) = 110      l_3 = 3
  x = 4      C(4) = 1110     l_4 = 4
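A code in this sense is just a lookup table; a minimal sketch in Python, encoding a short source sequence with the illustrative codewords from the example above:

```python
# Encode a source sequence by concatenating the codeword of each outcome.

C = {1: "0", 2: "10", 3: "110", 4: "1110"}  # the example code above

def encode(seq, code):
    """Map a sequence of source outcomes to one bit string."""
    return "".join(code[s] for s in seq)

print(encode([1, 2, 1, 4], C))  # '01001110'
```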

Definition
For a random variable X with a p.m.f. {p_1, p_2, ..., p_m}, the expected length of a code C for X is:

  L(C) = Σ_{i=1}^{m} p_i · l_i,

where l_i is the length of the codeword C(x_i).

Code Types
The design of a good code follows the basic notion of entropy: for random outcomes with a high probability, a good code assigns short codewords, and vice versa. The overall objective is to have the average length L(C) be as small as possible.

In addition, we have to design codes that are uniquely decodable. In other words, if the source generates a sequence x_1, x_2, x_3, ... that is mapped into a sequence of codewords C(x_1), C(x_2), C(x_3), ..., then we should be able to recover the original source sequence x_1, x_2, x_3, ... from the codeword sequence C(x_1), C(x_2), C(x_3), ...

In general, and as a start, we are interested in codes that map each random outcome x_i into a unique codeword that differs from the codeword of any other outcome. For a random source with alphabet {1, 2, ..., m}, a non-singular code meets the following constraint:

  C(x_i) ≠ C(x_j)  for all  i ≠ j.
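The expected length is a direct weighted sum; a minimal sketch (the probabilities and lengths are illustrative):

```python
# Expected codeword length L(C) = sum_i p_i * l_i of a discrete source code.

def expected_length(probs, lengths):
    """Average codeword length for per-outcome probabilities and lengths."""
    if len(probs) != len(lengths):
        raise ValueError("one length per outcome is required")
    return sum(p * l for p, l in zip(probs, lengths))

# The code C(1)=0, C(2)=10, C(3)=110, C(4)=1110 has lengths 1, 2, 3, 4.
print(expected_length([0.5, 0.25, 0.125, 0.125], [1, 2, 3, 4]))  # 1.875
```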

Although a non-singular code is uniquely decodable for a single symbol, it does not guarantee unique decodability for a sequence of outcomes of X.

Example (a standard pair of code tables):

  Outcome    Code C1    Code C2
  x = 1      0          10
  x = 2      010        00
  x = 3      01         11
  x = 4      10         110

In the above example, the code C1 is non-singular; however, it is not uniquely decodable: the string 010, for instance, could be C1(2), or C1(1) followed by C1(4), or C1(3) followed by C1(1). Meanwhile, the code C2 is both non-singular and uniquely decodable. Therefore, not all non-singular codes are uniquely decodable; however, every uniquely decodable code is non-singular.

It is important to note that a uniquely decodable code may require the decoding of multiple codewords to uniquely identify the original source sequence. This is the case for the above code C2. (Can you give an example when the C2 decoder needs to wait for more codewords before being able to uniquely decode a sequence?)
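Unique decodability of whole sequences can be probed by brute-force parsing; a sketch, using a standard textbook pair of code tables (the specific codewords are illustrative assumptions):

```python
# Count the distinct ways a bit string can be split into codewords of a code.

def parse_count(code, s):
    """Number of distinct codeword sequences whose concatenation equals s."""
    if s == "":
        return 1
    return sum(parse_count(code, s[len(w):])
               for w in code.values() if s.startswith(w))

C1 = {1: "0", 2: "010", 3: "01", 4: "10"}   # non-singular, not uniquely decodable
C2 = {1: "10", 2: "00", 3: "11", 4: "110"}  # uniquely decodable, not prefix-free

print(parse_count(C1, "010"))  # 3 -> the string is ambiguous under C1
print(parse_count(C2, "110"))  # 1 -> exactly one parse under C2
```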

Therefore, it is highly desirable to design a uniquely decodable code that can be decoded instantaneously upon receiving each codeword. This type of code is known as an instantaneous, prefix-free, or simply prefix code. In a prefix code, a codeword cannot be used as a prefix of any other codeword.

Example: In the following code, no codeword is used as a prefix of any other codeword:

  C(1) = 0,  C(2) = 10,  C(3) = 110,  C(4) = 111

It should be rather intuitive that every prefix code is uniquely decodable, but the inverse is not always true. In summary, the three major types of codes (non-singular, uniquely decodable, and prefix codes) are nested as follows:

  All possible codes ⊃ Non-singular codes ⊃ Uniquely decodable codes ⊃ Prefix (instantaneous) codes
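The prefix condition itself is mechanical to verify; a minimal sketch:

```python
# A code is prefix-free iff no codeword is a prefix of another codeword.

def is_prefix_free(codewords):
    """True iff no codeword is a (proper or equal) prefix of another."""
    words = sorted(codewords)  # a prefix sorts immediately before its extensions
    return all(not words[i + 1].startswith(words[i])
               for i in range(len(words) - 1))

print(is_prefix_free(["0", "10", "110", "111"]))  # True  (the example above)
print(is_prefix_free(["10", "00", "11", "110"]))  # False ('11' prefixes '110')
```

Sorting makes any prefix pair adjacent, so checking neighbors suffices.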

Kraft Inequality
Based on the above discussion, it should be clear that uniquely decodable codes represent a subset of all possible codes. Also, prefix codes are a subset of uniquely decodable codes. Prefix codes meet a certain constraint, which is known as the Kraft inequality.

Theorem
For any prefix D-ary code C with codeword lengths l_1, l_2, ..., l_m, the following must be satisfied:

  Σ_{i=1}^{m} D^(−l_i) ≤ 1.

Conversely, given a set of codeword lengths that meet the inequality Σ_i D^(−l_i) ≤ 1, there exists a prefix code for this set of lengths.

Proof
A prefix code C can be represented by a D-ary tree. Below we illustrate the proof using a binary code and a corresponding binary tree. (The same principles apply to higher-order codes/trees.) For illustration purposes, let us consider the code:

  C(1) = 0,  C(2) = 10,  C(3) = 110,  C(4) = 111

This code can be represented as follows.

[Figure: binary tree representation of the binary (D-ary; D = 2) prefix code above, with the leaf nodes at the maximum depth that are associated with each codeword marked.]

An important attribute of the above tree representation of codes is the number of leaf nodes (at the maximum depth) that are associated with each codeword. For example, for the first codeword C(1) = 0, there are four leaf nodes that are associated with it. Similarly, the codeword C(2) = 10 has two leaf nodes. The last two codewords are leaf nodes themselves, and hence each of these is associated with a single leaf node (itself).

Note that for a prefix code, a codeword cannot be an ancestor of any other codeword. Let l_max be the maximum length among all codeword lengths of a prefix code. A codeword with length l_i is at depth l_i of the D-ary tree, and hence the total number of leaf nodes that are associated with (descendants of) that codeword at level l_max is D^(l_max − l_i). Furthermore, since each group of leaf nodes of a codeword with length l_i is disjoint from the group of leaf nodes of any other codeword l_j, and the full tree has D^(l_max) leaves at level l_max:

  Σ_{i=1}^{m} D^(l_max − l_i) ≤ D^(l_max),

which implies:

  Σ_{i=1}^{m} D^(−l_i) ≤ 1.

By similar arguments, one can construct a prefix code for any set of lengths that satisfies the constraint Σ_{i=1}^{m} D^(−l_i) ≤ 1. QED
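Both directions of the theorem can be exercised numerically; a sketch for binary (D = 2) codes, where the converse is realized by the usual greedy tree construction over sorted lengths:

```python
# Kraft inequality: sum_i D^(-l_i) <= 1 for any prefix code, and any lengths
# meeting it admit a prefix code, built by walking the code tree greedily.

def kraft_sum(lengths, D=2):
    return sum(D ** -l for l in lengths)

def prefix_code_from_lengths(lengths):
    """Binary codewords with the given lengths (assumes the Kraft sum <= 1)."""
    code, next_val, prev_len = [], 0, 0
    for l in sorted(lengths):
        next_val <<= (l - prev_len)            # descend to depth l in the tree
        code.append(format(next_val, "0{}b".format(l)))
        next_val += 1                          # skip the subtree just consumed
        prev_len = l
    return code

lengths = [1, 2, 3, 3]
print(kraft_sum(lengths))                 # 1.0 -> the inequality is tight
print(prefix_code_from_lengths(lengths))  # ['0', '10', '110', '111']
```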

Optimum Codes
Here we address the issue of finding minimum average length codes given the constraint imposed by the Kraft inequality. In particular, we are interested in finding codeword lengths that satisfy:

  min_{l_1, ..., l_m} L(C) = min_{l_1, ..., l_m} Σ_{i=1}^{m} p_i l_i   such that   Σ_{i=1}^{m} D^(−l_i) ≤ 1.

If we assume that equality is satisfied, Σ_i D^(−l_i) = 1, we can formulate the problem using Lagrange multipliers. Consequently, we can minimize the following objective function:

  J = Σ_i p_i l_i + λ Σ_i D^(−l_i).

Setting the derivative with respect to each l_i to zero:

  ∂J/∂l_i = p_i − λ D^(−l_i) ln D = 0,

which gives:

  D^(−l_i) = p_i / (λ ln D).

Using the constraint Σ_i D^(−l_i) = 1, we find λ = 1/ln D, and hence:

  D^(−l_i) = p_i,   i.e.,   l_i* = −log_D p_i.

Therefore, the average length of an optimum code can be expressed as:

  L*(C) = Σ_{i=1}^{m} p_i l_i* = −Σ_{i=1}^{m} p_i log_D p_i = H_D(X),

where H_D(X) is the entropy of the original source X (measured with a logarithmic base D). For a binary code, D = 2, and the average length is the same as the standard (base-2) entropy measured in bits.

Based on the above derivation, achieving an optimum prefix code C with an average length equal to the entropy H_D(X) is only possible when:

  l_i = −log_D p_i  for every i.

However, and in general, the probability distribution values p_i do not necessarily guarantee integer-valued lengths for the codewords.

Below, we state one of the most fundamental theorems in information theory, which relates the average length of any prefix code to the entropy of a random source with general distribution values p_i. This theorem, commonly known as the entropy bound theorem, shows that no code can have an average length smaller than the entropy of the random source.

Theorem (Entropy Bound)
The expected length L(C) of a prefix D-ary code C for a random source X with entropy H_D(X) satisfies the following inequality:

  L(C) ≥ H_D(X),

with equality if and only if D^(−l_i) = p_i for every i.

Observations from the Entropy Bound Theorem
The entropy bound theorem and its proof lead to important observations that we outline below. For random sources with distributions that satisfy p_i = D^(−l_i), where l_i is an integer for i = 1, 2, ..., m, there exists a prefix code that achieves the entropy H_D(X). Such distributions are known as D-adic. For the binary case, D = 2, we have a dyadic distribution (or a dyadic code). An example of a dyadic distribution is:

  p_1 = 1/2, p_2 = 1/4, p_3 = 1/8, p_4 = 1/8;  with lengths  l_1 = 1, l_2 = 2, l_3 = 3, l_4 = 3.

Entropy Coding Methods
Here, we will discuss leading examples of entropy coding methods that are broadly used in practice, and which have been adopted by leading international compression standards. In particular, we will discuss Huffman coding and arithmetic coding, both of which lead to optimal entropy coding.
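For a dyadic source the optimal lengths are integers and the bound is met exactly; a sketch using the dyadic distribution above:

```python
import math

# For a dyadic distribution p_i = 2^(-l_i), the integer lengths l_i = -log2(p_i)
# give a prefix code whose average length equals the entropy H(X) exactly.

def entropy(probs):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]            # the dyadic example above
lengths = [-math.log2(p) for p in probs]     # all integers for a dyadic source
avg_len = sum(p * l for p, l in zip(probs, lengths))

print(lengths)         # [1.0, 2.0, 3.0, 3.0]
print(entropy(probs))  # 1.75
print(avg_len)         # 1.75 -> the entropy bound holds with equality
```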

Key Properties of Optimum Prefix Codes
Here, we outline a few key properties of optimum prefix codes that will lead to the Huffman coding procedure. We adopt the notation C_i to represent the codeword, with length l_i, of a code C.

Property 1
If C_j and C_k are two codewords of an optimum prefix code C, then:

  p_j > p_k  ⇒  l_j ≤ l_k.

Property 2
Assuming p_1 ≥ p_2 ≥ ... ≥ p_{m−1} ≥ p_m, the largest codewords of an optimum code have the same length:

  l_{m−1} = l_m.

[Figure: the set of codewords drawn as a tree, with the two longest codewords C_{m−1} and C_m at the deepest level.]

[Figures: if l_m were strictly greater than l_{m−1}, the tree would contain an unused shorter codeword below the level of C_{m−1}; moving C_m up to that unused shorter codeword would reduce the average length, contradicting optimality.]

Property 3
There exists an optimum code C where the largest codewords are siblings (i.e., they differ in one bit).

Property 4
For a binary random source, the optimum prefix code has lengths l_1 = l_2 = 1.

The Huffman Entropy Coding Procedure
The above properties lead to the Huffman entropy coding procedure for generating prefix codes. A core notion in this procedure is the observation that optimizing a given code C is equivalent to optimizing a shortened version C' of it.

The Huffman coding procedure can be summarized by the following steps:
1. Sort the outcomes according to the probability distribution: p_1 ≥ p_2 ≥ ... ≥ p_{m−1} ≥ p_m.
2. Merge the two least probable outcomes into a single outcome, and assign a zero to one of them and a one to the other (treat them as a binary source, and use an optimum binary code for that pair).
3. Repeat the previous steps until we are left with a single merged outcome of probability one.

We now illustrate the Huffman procedure using a few examples.

Example
Find an optimum set of codewords C_1, C_2, C_3, C_4 for the distribution (used here for illustration):

  p_1 = 0.4,  p_2 = 0.2,  p_3 = 0.2,  p_4 = 0.2.

The optimum codewords must meet the following: l_3 = l_4, and C_3 and C_4 are siblings.

Combining the two least probable outcomes (p_3 and p_4) gives the shortened source:

  {0.4, 0.2, 0.2, 0.2} → {0.4, 0.2 + 0.2} = {0.4, 0.4, 0.2}.

Using the two least probable outcomes of the shortened source:

  {0.4, 0.4, 0.2} → {0.4, 0.4 + 0.2} = {0.4, 0.6}.

Now the final merge gives a probability of one; there is nothing else to merge:

  {0.4, 0.6} → {1.0}.

Assigning a zero/one at each merge and reading the bits back from the last merge to the first yields, for example, the codewords:

  C_1 = 0,  C_2 = 10,  C_3 = 110,  C_4 = 111.

What is the average length? L(C) = 0.4·1 + 0.2·2 + 0.2·3 + 0.2·3 = 2.0 bits/symbol.
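The merge-and-relabel steps can be automated with a priority queue; a sketch using the same kind of illustrative four-symbol distribution (tie-breaking among equal probabilities is arbitrary, so a run may produce a different, equally optimal, code than a hand-worked one):

```python
import heapq

# Huffman procedure: repeatedly merge the two least probable groups,
# prepending a 0 to one branch and a 1 to the other.

def huffman(probs):
    """probs: dict symbol -> probability; returns dict symbol -> codeword."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # least probable group -> bit 0
        p1, _, c1 = heapq.heappop(heap)   # next least probable  -> bit 1
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, count, merged))
        count += 1
    return heap[0][2]

probs = {1: 0.4, 2: 0.2, 3: 0.2, 4: 0.2}
code = huffman(probs)
avg = sum(p * len(code[s]) for s, p in probs.items())
print(code)
print(avg)  # 2.0 bits/symbol
```

With this particular tie-breaking the run yields an all-two-bit code; the hand-derived code {0, 10, 110, 111} has the same (minimum) average length.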

In some cases, we may encounter more than one choice for merging the probability distribution values. (This was the case in the above example: in the shortened source {0.4, 0.4, 0.2}, the outcome of probability 0.2 can be merged with either outcome of probability 0.4.) One important question is: what is the impact of selecting one choice for combining the probabilities versus the other? We illustrate this by selecting the alternative option for combining the probabilities, which yields:

  C_1 = 00,  C_2 = 01,  C_3 = 10,  C_4 = 11,  with  L(C) = 2·(0.4 + 0.2 + 0.2 + 0.2) = 2.0 bits/symbol.

As can be seen in the above example, the Huffman procedure can lead to different prefix codes (if multiple options for merging are encountered). Hence, an important question is: does one option provide a better code (in terms of providing a smaller average code length L(C))? It does not: every code produced by the Huffman procedure achieves the same, minimum, average length.

The Huffman procedure can also be used for the case when D > 2 (i.e., the code is not binary anymore). Care should be taken, though, when dealing with a non-binary code design.

Arithmetic Coding
Although Huffman codes are optimal on a symbol-by-symbol basis, there is still room for improvement in terms of achieving lower overhead. For example, a binary source with entropy H(X) < 1 still requires one bit per symbol when using a Huffman code. Hence, if, for example, H(X) = 0.5, then a Huffman code spends double the amount of bits per symbol (relative to the true optimum limit of H(X) = 0.5).

Arithmetic coding is an approach that addresses the overhead issue by coding a continuous sequence of source symbols while trying to approach the entropy limit H(X). Arithmetic coding has roots in a coding approach proposed by Shannon, Fano, and Elias, and its precursor is hence sometimes called the Shannon-Fano-Elias (SFE) code. Therefore, we first outline the principles and procedures of SFE codes, and then describe arithmetic coding.

Shannon-Fano-Elias Coding
The SFE coding procedure is based on using the cumulative distribution function (CDF) F(x) of a random source X: F(x) = Pr(X ≤ x). The CDF provides a unique one-to-one mapping for the possible outcomes of any random source X.
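The symbol-by-symbol overhead is easy to quantify; a sketch (p = 0.11 is an illustrative skew that puts H(X) near 0.5):

```python
import math

# A binary Huffman code must spend at least 1 bit per symbol, while the
# entropy of a skewed binary source can be far below 1 bit per symbol.

def binary_entropy(p):
    """Entropy H(X) in bits of a binary source with Pr(X = 1) = p."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.11                    # illustrative skew
H = binary_entropy(p)
print(round(H, 3))          # ~0.5 bits of actual information per symbol
print(round(1 / H, 2))      # ~2.0: Huffman spends about twice the limit
```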

In other words, if we denote the alphabet of a discrete random source X by the integer index set {1, 2, ..., m}, then it is well known that:

  F(i) ≠ F(j)  for  i ≠ j

(assuming every outcome has nonzero probability). This can be illustrated by the following example of a typical CDF of a discrete random source.

[Figure: staircase CDF F(x) of a discrete random source with four outcomes.]

One important characteristic of the CDF of a discrete random source is that the CDF defines a set of non-overlapping intervals in its range of possible values between zero and one. (Recall that the CDF provides a measure of probability, and hence it is always confined between zero and one.) Based on the above CDF example, we can have a well-defined set of non-overlapping intervals, as shown in the next figure.

[Figures: the same staircase CDF with the non-overlapping intervals marked on the vertical axis; the size of the interval for outcome i equals p_i.]

Another important observation is that the size of each (non-overlapping) interval in the range of the CDF F(x) is defined by the probability-mass-function (PMF) value p_i = Pr(X = i) of the particular outcome. This is the same as the height of the jumps that we can observe in the staircase-like shape of the CDF of a discrete random source.

Overall, and by using the CDF of a random source, one can define a unique mapping between any possible outcome i and a particular (unique) interval in the range between zero and one. Furthermore, one can select any value within each (unique) interval of a corresponding random outcome i to represent that outcome.

This selected value serves as a codeword for that outcome. The SFE procedure, which is based on the above CDF-driven principles of unique mapping, can be defined as follows:

1. Map each outcome X = i to the interval [F(i−1), F(i)), which is inclusive at its lower end and exclusive at its upper end.

2. Select a particular value within the interval [F(i−1), F(i)) to represent the outcome X = i. This value is known as the modified CDF and is denoted by F̄(i).

In principle, any value within the interval [F(i−1), F(i)) can be used for the modified CDF. A natural choice is the middle of the corresponding interval. Hence, the modified CDF can be expressed as follows:

  F̄(i) = F(i−1) + p_i/2,

which, in turn, can be expressed as:

  F̄(i) = F(i) − p_i/2.

This is illustrated by the next figure.

[Figure: the modified CDF values F̄(i) placed at the midpoints of the staircase steps of F(x).]

So far, it should be clear that F̄(i) ∈ [0, 1), and that it provides a unique mapping for the possible random outcomes of X.

3. Generate a codeword to represent F̄(i), and hence to represent the outcome X = i. Below we consider simple examples of such codewords according to the SFE coding procedure.

Examples of Modified CDF Values and Codewords
The following table outlines a dyadic set of examples of values that could be used for a modified CDF and the corresponding codewords for such values.

  Value     Binary representation    Codeword
  1/2       0.1                      1
  1/4       0.01                     01
  1/8       0.001                    001

The above values of the modified CDF can be combined to represent higher-precision values, as shown in the next table:

  Value     Binary representation    Codeword
  0.75      0.11                     11
  0.625     0.101                    101

In general, the number of bits needed to code the modified CDF value F̄(i) could be infinite, since F̄(i) could be any real number. In practice, however, a finite number of bits is used to represent (approximate) F̄(i). The number of bits used must be sufficiently large to make sure that the codeword representing F̄(i) is unique (i.e., there should not be overlap in the intervals representing the random outcomes). By using a truncated value for the original value F̄(i), we anticipate a loss in precision.

Let ⌊F̄(i)⌋_{l_i} be the truncated value used to represent the original modified CDF F̄(i) based on l_i bits. Naturally, the larger the number of bits used, the higher the precision, and the smaller the difference between F̄(i) and ⌊F̄(i)⌋_{l_i}. It can be shown that the difference between the original modified CDF value and its approximation satisfies the following inequality:

  F̄(i) − ⌊F̄(i)⌋_{l_i} < 1/2^{l_i}.

Consequently, and based on the definition of the modified CDF value F̄(i) = F(i−1) + p_i/2, in order to maintain a unique mapping, the maximum error has to be smaller than p_i/2:

  1/2^{l_i} ≤ p_i/2.

This leads to the following constraint on the length l_i:

  2^{l_i} ≥ 2/p_i  ⇒  l_i ≥ log2(2/p_i) = log2(1/p_i) + 1.
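The whole SFE construction (CDF, midpoint, length, truncation) fits in a few lines; a sketch, using an illustrative dyadic distribution, where the choice l_i = ⌈log2(1/p_i)⌉ + 1 satisfies the constraint just derived:

```python
import math

# Shannon-Fano-Elias code: the codeword for outcome i is the first l_i bits
# of the modified CDF Fbar(i) = F(i-1) + p_i/2, with l_i = ceil(log2(1/p_i)) + 1.

def sfe_code(probs):
    """Return the SFE codeword (bit string) for each outcome, in order."""
    codewords, F = [], 0.0
    for p in probs:
        Fbar = F + p / 2                       # midpoint of [F(i-1), F(i))
        l = math.ceil(math.log2(1 / p)) + 1    # bits needed for uniqueness
        bits = int(Fbar * 2 ** l)              # truncate Fbar to l bits
        codewords.append(format(bits, "0{}b".format(l)))
        F += p
    return codewords

print(sfe_code([0.25, 0.5, 0.125, 0.125]))
# ['001', '10', '1101', '1111'] -- a prefix-free code
```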

Therefore:

  l_i = ⌈log2(1/p_i)⌉ + 1.

Example
The following table shows an example of a random source X with four possible outcomes, and the corresponding PMF, CDF, and modified CDF values and codewords used, based on SFE coding (a dyadic distribution is used for illustration):

  i    p_i      F(i)     F̄(i)      F̄(i) (binary)   l_i    SFE code
  1    0.25     0.25     0.125     0.001            3      001
  2    0.5      0.75     0.5       0.10             2      10
  3    0.125    0.875    0.8125    0.1101           4      1101
  4    0.125    1.0      0.9375    0.1111           4      1111

Arithmetic Coding
The advantages of the SFE coding procedure can be realized when it is used to code multiple outcomes of the random source under consideration. Arithmetic coding is basically SFE coding applied to multiple outcomes of the random source.

Under arithmetic coding (AC), we code a sequence of n outcomes x^(n) = (x_1, x_2, ..., x_n), where each outcome x_j ∈ {1, 2, ..., m}. Each possible vector x^(n) of the random source X is mapped to a unique value:

  F̄(x^(n)) ∈ [0, 1).

The best way to illustrate arithmetic coding is through a couple of examples, as shown below.

Example
Arithmetic coding begins with dividing the zero-to-one range based on the CDF of the random source. In this example, the source can take one of three possible outcomes.

[Figure: the unit interval divided into three subintervals according to the CDF values F(1), F(2), and F(3) = 1.]

If we assume that we are interested in coding n = 3 outcomes, the following figures show the particular interval, and the corresponding value F̄(x^(3)), that arithmetic coding focuses on to code a vector (x_1, x_2, x_3).

[Figures: the subinterval for the first outcome x_1 is selected; it is subdivided in proportion to the CDF and the subinterval for x_2 is selected; that interval is subdivided once more and the subinterval for x_3 is selected. The value F̄(x^(3)) inside this final interval is transmitted to represent the vector (x_1, x_2, x_3).]

Similarly, a corresponding set of figures shows the particular interval, and the corresponding value F̄(x^(3)), that arithmetic coding focuses on to code a different vector of three outcomes.

Based on the above examples, we can define:

  F̄(x^(n)) = l^(n) + (u^(n) − l^(n))/2 = (u^(n) + l^(n))/2,

where u^(n) and l^(n) are the upper and lower bounds of the unique interval [l^(n), u^(n)) that F̄(x^(n)) belongs to. Below, we use these expressions to illustrate the arithmetic coding procedure.

Example
The coding process starts with the initial-step values:

  l^(0) = 0,  u^(0) = 1,  F̄^(0) = (u^(0) + l^(0))/2 = 1/2.

[Figure: the initial interval [l^(0), u^(0)) = [0, 1) with its midpoint F̄^(0) = 1/2.]

After the initial step, the interval [l^(n), u^(n)) and the corresponding value F̄^(n) = (u^(n) + l^(n))/2 are updated according to the particular outcomes that the random source is generating. When the n-th outcome is x_n = i, the interval is updated as:

  l^(n) = l^(n−1) + (u^(n−1) − l^(n−1))·F(i−1),
  u^(n) = l^(n−1) + (u^(n−1) − l^(n−1))·F(i).

[Figures: three successive update steps of the interval [l^(n), u^(n)) and its midpoint F̄^(n) for an example sequence of outcomes; at each step the interval is scaled and shifted according to the CDF values of the observed outcome.]

The arithmetic coding procedure can be summarized by the steps that are outlined below.

1. Initialize l^(0) = 0 and u^(0) = 1.
2. For n = 1, 2, ..., upon observing the outcome x_n = i, update:

  l^(n) = l^(n−1) + (u^(n−1) − l^(n−1))·F(i−1),
  u^(n) = l^(n−1) + (u^(n−1) − l^(n−1))·F(i).

3. After the final outcome, compute the midpoint F̄(x^(n)) = (u^(n) + l^(n))/2.

Similar to SFE coding, after determining the value F̄(x^(n)), we use l(x^(n)) bits to represent F̄(x^(n)), according to the constraint:

  l(x^(n)) = ⌈log2(1/p(x^(n)))⌉ + 1,

where p(x^(n)) is the probability of the coded sequence x^(n).
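The interval-narrowing recursion and the final bit count can be sketched directly (a three-symbol source with illustrative probabilities):

```python
import math

# Arithmetic coding: narrow [l, u) by the source CDF for each outcome, then
# represent the midpoint with ceil(log2(1/p(sequence))) + 1 bits.

def arithmetic_interval(seq, probs):
    """probs: symbol probabilities; seq: 0-based symbol indices to code."""
    cdf = [0.0]
    for p in probs:
        cdf.append(cdf[-1] + p)                # cumulative distribution F
    low, high = 0.0, 1.0
    for s in seq:
        span = high - low
        low, high = low + span * cdf[s], low + span * cdf[s + 1]
    return low, high

probs = [0.5, 0.25, 0.25]                      # illustrative 3-symbol source
low, high = arithmetic_interval([0, 1, 2], probs)
p_seq = high - low                             # = 0.5 * 0.25 * 0.25 = 1/32
bits = math.ceil(math.log2(1 / p_seq)) + 1
print((low + high) / 2)                        # 0.359375, the midpoint to send
print(bits)                                    # 6
```

Note that the interval width equals the sequence probability, which is why the bit count tracks the information content of the whole sequence rather than one symbol at a time.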


More information

More metrics on cartesian products

More metrics on cartesian products More metrcs on cartesan products If (X, d ) are metrc spaces for 1 n, then n Secton II4 of the lecture notes we defned three metrcs on X whose underlyng topologes are the product topology The purpose of

More information

Linear Approximation with Regularization and Moving Least Squares

Linear Approximation with Regularization and Moving Least Squares Lnear Approxmaton wth Regularzaton and Movng Least Squares Igor Grešovn May 007 Revson 4.6 (Revson : March 004). 5 4 3 0.5 3 3.5 4 Contents: Lnear Fttng...4. Weghted Least Squares n Functon Approxmaton...

More information

Structure and Drive Paul A. Jensen Copyright July 20, 2003

Structure and Drive Paul A. Jensen Copyright July 20, 2003 Structure and Drve Paul A. Jensen Copyrght July 20, 2003 A system s made up of several operatons wth flow passng between them. The structure of the system descrbes the flow paths from nputs to outputs.

More information

Chapter 7 Channel Capacity and Coding

Chapter 7 Channel Capacity and Coding Wreless Informaton Transmsson System Lab. Chapter 7 Channel Capacty and Codng Insttute of Communcatons Engneerng atonal Sun Yat-sen Unversty Contents 7. Channel models and channel capacty 7.. Channel models

More information

NP-Completeness : Proofs

NP-Completeness : Proofs NP-Completeness : Proofs Proof Methods A method to show a decson problem Π NP-complete s as follows. (1) Show Π NP. (2) Choose an NP-complete problem Π. (3) Show Π Π. A method to show an optmzaton problem

More information

find (x): given element x, return the canonical element of the set containing x;

find (x): given element x, return the canonical element of the set containing x; COS 43 Sprng, 009 Dsjont Set Unon Problem: Mantan a collecton of dsjont sets. Two operatons: fnd the set contanng a gven element; unte two sets nto one (destructvely). Approach: Canoncal element method:

More information

= z 20 z n. (k 20) + 4 z k = 4

= z 20 z n. (k 20) + 4 z k = 4 Problem Set #7 solutons 7.2.. (a Fnd the coeffcent of z k n (z + z 5 + z 6 + z 7 + 5, k 20. We use the known seres expanson ( n+l ( z l l z n below: (z + z 5 + z 6 + z 7 + 5 (z 5 ( + z + z 2 + z + 5 5

More information

Errors for Linear Systems

Errors for Linear Systems Errors for Lnear Systems When we solve a lnear system Ax b we often do not know A and b exactly, but have only approxmatons  and ˆb avalable. Then the best thng we can do s to solve ˆx ˆb exactly whch

More information

Lecture 12: Discrete Laplacian

Lecture 12: Discrete Laplacian Lecture 12: Dscrete Laplacan Scrbe: Tanye Lu Our goal s to come up wth a dscrete verson of Laplacan operator for trangulated surfaces, so that we can use t n practce to solve related problems We are mostly

More information

Transform Coding. Transform Coding Principle

Transform Coding. Transform Coding Principle Transform Codng Prncple of block-wse transform codng Propertes of orthonormal transforms Dscrete cosne transform (DCT) Bt allocaton for transform coeffcents Entropy codng of transform coeffcents Typcal

More information

4 Analysis of Variance (ANOVA) 5 ANOVA. 5.1 Introduction. 5.2 Fixed Effects ANOVA

4 Analysis of Variance (ANOVA) 5 ANOVA. 5.1 Introduction. 5.2 Fixed Effects ANOVA 4 Analyss of Varance (ANOVA) 5 ANOVA 51 Introducton ANOVA ANOVA s a way to estmate and test the means of multple populatons We wll start wth one-way ANOVA If the populatons ncluded n the study are selected

More information

The Gaussian classifier. Nuno Vasconcelos ECE Department, UCSD

The Gaussian classifier. Nuno Vasconcelos ECE Department, UCSD he Gaussan classfer Nuno Vasconcelos ECE Department, UCSD Bayesan decson theory recall that we have state of the world X observatons g decson functon L[g,y] loss of predctng y wth g Bayes decson rule s

More information

The internal structure of natural numbers and one method for the definition of large prime numbers

The internal structure of natural numbers and one method for the definition of large prime numbers The nternal structure of natural numbers and one method for the defnton of large prme numbers Emmanul Manousos APM Insttute for the Advancement of Physcs and Mathematcs 3 Poulou str. 53 Athens Greece Abstract

More information

NUMERICAL DIFFERENTIATION

NUMERICAL DIFFERENTIATION NUMERICAL DIFFERENTIATION 1 Introducton Dfferentaton s a method to compute the rate at whch a dependent output y changes wth respect to the change n the ndependent nput x. Ths rate of change s called the

More information

princeton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg

princeton univ. F 17 cos 521: Advanced Algorithm Design Lecture 7: LP Duality Lecturer: Matt Weinberg prnceton unv. F 17 cos 521: Advanced Algorthm Desgn Lecture 7: LP Dualty Lecturer: Matt Wenberg Scrbe: LP Dualty s an extremely useful tool for analyzng structural propertes of lnear programs. Whle there

More information

Solutions to exam in SF1811 Optimization, Jan 14, 2015

Solutions to exam in SF1811 Optimization, Jan 14, 2015 Solutons to exam n SF8 Optmzaton, Jan 4, 25 3 3 O------O -4 \ / \ / The network: \/ where all lnks go from left to rght. /\ / \ / \ 6 O------O -5 2 4.(a) Let x = ( x 3, x 4, x 23, x 24 ) T, where the varable

More information

Lecture 10 Support Vector Machines II

Lecture 10 Support Vector Machines II Lecture 10 Support Vector Machnes II 22 February 2016 Taylor B. Arnold Yale Statstcs STAT 365/665 1/28 Notes: Problem 3 s posted and due ths upcomng Frday There was an early bug n the fake-test data; fxed

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 12 10/21/2013. Martingale Concentration Inequalities and Applications

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 12 10/21/2013. Martingale Concentration Inequalities and Applications MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.65/15.070J Fall 013 Lecture 1 10/1/013 Martngale Concentraton Inequaltes and Applcatons Content. 1. Exponental concentraton for martngales wth bounded ncrements.

More information

Module 9. Lecture 6. Duality in Assignment Problems

Module 9. Lecture 6. Duality in Assignment Problems Module 9 1 Lecture 6 Dualty n Assgnment Problems In ths lecture we attempt to answer few other mportant questons posed n earler lecture for (AP) and see how some of them can be explaned through the concept

More information

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests

Simulated Power of the Discrete Cramér-von Mises Goodness-of-Fit Tests Smulated of the Cramér-von Mses Goodness-of-Ft Tests Steele, M., Chaselng, J. and 3 Hurst, C. School of Mathematcal and Physcal Scences, James Cook Unversty, Australan School of Envronmental Studes, Grffth

More information

princeton univ. F 13 cos 521: Advanced Algorithm Design Lecture 3: Large deviations bounds and applications Lecturer: Sanjeev Arora

princeton univ. F 13 cos 521: Advanced Algorithm Design Lecture 3: Large deviations bounds and applications Lecturer: Sanjeev Arora prnceton unv. F 13 cos 521: Advanced Algorthm Desgn Lecture 3: Large devatons bounds and applcatons Lecturer: Sanjeev Arora Scrbe: Today s topc s devaton bounds: what s the probablty that a random varable

More information

Lossless Compression Performance of a Simple Counter- Based Entropy Coder

Lossless Compression Performance of a Simple Counter- Based Entropy Coder ITB J. ICT, Vol. 5, No. 3, 20, 73-84 73 Lossless Compresson Performance of a Smple Counter- Based Entropy Coder Armen Z. R. Lang,2 ITB Research Center on Informaton and Communcaton Technology 2 Informaton

More information

Limited Dependent Variables

Limited Dependent Variables Lmted Dependent Varables. What f the left-hand sde varable s not a contnuous thng spread from mnus nfnty to plus nfnty? That s, gven a model = f (, β, ε, where a. s bounded below at zero, such as wages

More information

Lecture 4. Instructor: Haipeng Luo

Lecture 4. Instructor: Haipeng Luo Lecture 4 Instructor: Hapeng Luo In the followng lectures, we focus on the expert problem and study more adaptve algorthms. Although Hedge s proven to be worst-case optmal, one may wonder how well t would

More information

Lecture 4: Universal Hash Functions/Streaming Cont d

Lecture 4: Universal Hash Functions/Streaming Cont d CSE 5: Desgn and Analyss of Algorthms I Sprng 06 Lecture 4: Unversal Hash Functons/Streamng Cont d Lecturer: Shayan Oves Gharan Aprl 6th Scrbe: Jacob Schreber Dsclamer: These notes have not been subjected

More information

Mathematical Models for Information Sources A Logarithmic i Measure of Information

Mathematical Models for Information Sources A Logarithmic i Measure of Information Introducton to Informaton Theory Wreless Informaton Transmsson System Lab. Insttute of Communcatons Engneerng g Natonal Sun Yat-sen Unversty Table of Contents Mathematcal Models for Informaton Sources

More information

Introductory Cardinality Theory Alan Kaylor Cline

Introductory Cardinality Theory Alan Kaylor Cline Introductory Cardnalty Theory lan Kaylor Clne lthough by name the theory of set cardnalty may seem to be an offshoot of combnatorcs, the central nterest s actually nfnte sets. Combnatorcs deals wth fnte

More information

Statistics II Final Exam 26/6/18

Statistics II Final Exam 26/6/18 Statstcs II Fnal Exam 26/6/18 Academc Year 2017/18 Solutons Exam duraton: 2 h 30 mn 1. (3 ponts) A town hall s conductng a study to determne the amount of leftover food produced by the restaurants n the

More information

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009

College of Computer & Information Science Fall 2009 Northeastern University 20 October 2009 College of Computer & Informaton Scence Fall 2009 Northeastern Unversty 20 October 2009 CS7880: Algorthmc Power Tools Scrbe: Jan Wen and Laura Poplawsk Lecture Outlne: Prmal-dual schema Network Desgn:

More information

Complete subgraphs in multipartite graphs

Complete subgraphs in multipartite graphs Complete subgraphs n multpartte graphs FLORIAN PFENDER Unverstät Rostock, Insttut für Mathematk D-18057 Rostock, Germany Floran.Pfender@un-rostock.de Abstract Turán s Theorem states that every graph G

More information

Flexible Quantization

Flexible Quantization wb 06/02/21 1 Flexble Quantzaton Bastaan Klejn KTH School of Electrcal Engneerng Stocholm wb 06/02/21 2 Overvew Motvaton for codng technologes Basc quantzaton and codng Hgh-rate quantzaton theory wb 06/02/21

More information

Economics 101. Lecture 4 - Equilibrium and Efficiency

Economics 101. Lecture 4 - Equilibrium and Efficiency Economcs 0 Lecture 4 - Equlbrum and Effcency Intro As dscussed n the prevous lecture, we wll now move from an envronment where we looed at consumers mang decsons n solaton to analyzng economes full of

More information

Workshop: Approximating energies and wave functions Quantum aspects of physical chemistry

Workshop: Approximating energies and wave functions Quantum aspects of physical chemistry Workshop: Approxmatng energes and wave functons Quantum aspects of physcal chemstry http://quantum.bu.edu/pltl/6/6.pdf Last updated Thursday, November 7, 25 7:9:5-5: Copyrght 25 Dan Dll (dan@bu.edu) Department

More information

Tornado and Luby Transform Codes. Ashish Khisti Presentation October 22, 2003

Tornado and Luby Transform Codes. Ashish Khisti Presentation October 22, 2003 Tornado and Luby Transform Codes Ashsh Khst 6.454 Presentaton October 22, 2003 Background: Erasure Channel Elas[956] studed the Erasure Channel β x x β β x 2 m x 2 k? Capacty of Noseless Erasure Channel

More information

1 Binary Response Models

1 Binary Response Models Bnary and Ordered Multnomal Response Models Dscrete qualtatve response models deal wth dscrete dependent varables. bnary: yes/no, partcpaton/non-partcpaton lnear probablty model LPM, probt or logt models

More information

BOUNDEDNESS OF THE RIESZ TRANSFORM WITH MATRIX A 2 WEIGHTS

BOUNDEDNESS OF THE RIESZ TRANSFORM WITH MATRIX A 2 WEIGHTS BOUNDEDNESS OF THE IESZ TANSFOM WITH MATIX A WEIGHTS Introducton Let L = L ( n, be the functon space wth norm (ˆ f L = f(x C dx d < For a d d matrx valued functon W : wth W (x postve sem-defnte for all

More information

Chapter 8 Indicator Variables

Chapter 8 Indicator Variables Chapter 8 Indcator Varables In general, e explanatory varables n any regresson analyss are assumed to be quanttatve n nature. For example, e varables lke temperature, dstance, age etc. are quanttatve n

More information

FREQUENCY DISTRIBUTIONS Page 1 of The idea of a frequency distribution for sets of observations will be introduced,

FREQUENCY DISTRIBUTIONS Page 1 of The idea of a frequency distribution for sets of observations will be introduced, FREQUENCY DISTRIBUTIONS Page 1 of 6 I. Introducton 1. The dea of a frequency dstrbuton for sets of observatons wll be ntroduced, together wth some of the mechancs for constructng dstrbutons of data. Then

More information

The optimal delay of the second test is therefore approximately 210 hours earlier than =2.

The optimal delay of the second test is therefore approximately 210 hours earlier than =2. THE IEC 61508 FORMULAS 223 The optmal delay of the second test s therefore approxmately 210 hours earler than =2. 8.4 The IEC 61508 Formulas IEC 61508-6 provdes approxmaton formulas for the PF for smple

More information

Convergence of random processes

Convergence of random processes DS-GA 12 Lecture notes 6 Fall 216 Convergence of random processes 1 Introducton In these notes we study convergence of dscrete random processes. Ths allows to characterze phenomena such as the law of large

More information

Lec 02 Entropy and Lossless Coding I

Lec 02 Entropy and Lossless Coding I Multmeda Communcaton, Fall 208 Lec 02 Entroy and Lossless Codng I Zhu L Z. L Multmeda Communcaton, Fall 208. Outlne Lecture 0 ReCa Info Theory on Entroy Lossless Entroy Codng Z. L Multmeda Communcaton,

More information

THE SUMMATION NOTATION Ʃ

THE SUMMATION NOTATION Ʃ Sngle Subscrpt otaton THE SUMMATIO OTATIO Ʃ Most of the calculatons we perform n statstcs are repettve operatons on lsts of numbers. For example, we compute the sum of a set of numbers, or the sum of the

More information

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers

Psychology 282 Lecture #24 Outline Regression Diagnostics: Outliers Psychology 282 Lecture #24 Outlne Regresson Dagnostcs: Outlers In an earler lecture we studed the statstcal assumptons underlyng the regresson model, ncludng the followng ponts: Formal statement of assumptons.

More information

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016

U.C. Berkeley CS294: Spectral Methods and Expanders Handout 8 Luca Trevisan February 17, 2016 U.C. Berkeley CS94: Spectral Methods and Expanders Handout 8 Luca Trevsan February 7, 06 Lecture 8: Spectral Algorthms Wrap-up In whch we talk about even more generalzatons of Cheeger s nequaltes, and

More information

ESCI 341 Atmospheric Thermodynamics Lesson 10 The Physical Meaning of Entropy

ESCI 341 Atmospheric Thermodynamics Lesson 10 The Physical Meaning of Entropy ESCI 341 Atmospherc Thermodynamcs Lesson 10 The Physcal Meanng of Entropy References: An Introducton to Statstcal Thermodynamcs, T.L. Hll An Introducton to Thermodynamcs and Thermostatstcs, H.B. Callen

More information

COMPLEX NUMBERS AND QUADRATIC EQUATIONS

COMPLEX NUMBERS AND QUADRATIC EQUATIONS COMPLEX NUMBERS AND QUADRATIC EQUATIONS INTRODUCTION We know that x 0 for all x R e the square of a real number (whether postve, negatve or ero) s non-negatve Hence the equatons x, x, x + 7 0 etc are not

More information

Simultaneous Optimization of Berth Allocation, Quay Crane Assignment and Quay Crane Scheduling Problems in Container Terminals

Simultaneous Optimization of Berth Allocation, Quay Crane Assignment and Quay Crane Scheduling Problems in Container Terminals Smultaneous Optmzaton of Berth Allocaton, Quay Crane Assgnment and Quay Crane Schedulng Problems n Contaner Termnals Necat Aras, Yavuz Türkoğulları, Z. Caner Taşkın, Kuban Altınel Abstract In ths work,

More information

Some modelling aspects for the Matlab implementation of MMA

Some modelling aspects for the Matlab implementation of MMA Some modellng aspects for the Matlab mplementaton of MMA Krster Svanberg krlle@math.kth.se Optmzaton and Systems Theory Department of Mathematcs KTH, SE 10044 Stockholm September 2004 1. Consdered optmzaton

More information

Gaussian Mixture Models

Gaussian Mixture Models Lab Gaussan Mxture Models Lab Objectve: Understand the formulaton of Gaussan Mxture Models (GMMs) and how to estmate GMM parameters. You ve already seen GMMs as the observaton dstrbuton n certan contnuous

More information

Linear Regression Analysis: Terminology and Notation

Linear Regression Analysis: Terminology and Notation ECON 35* -- Secton : Basc Concepts of Regresson Analyss (Page ) Lnear Regresson Analyss: Termnology and Notaton Consder the generc verson of the smple (two-varable) lnear regresson model. It s represented

More information

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction

The Multiple Classical Linear Regression Model (CLRM): Specification and Assumptions. 1. Introduction ECONOMICS 5* -- NOTE (Summary) ECON 5* -- NOTE The Multple Classcal Lnear Regresson Model (CLRM): Specfcaton and Assumptons. Introducton CLRM stands for the Classcal Lnear Regresson Model. The CLRM s also

More information

Which Separator? Spring 1

Which Separator? Spring 1 Whch Separator? 6.034 - Sprng 1 Whch Separator? Mamze the margn to closest ponts 6.034 - Sprng Whch Separator? Mamze the margn to closest ponts 6.034 - Sprng 3 Margn of a pont " # y (w $ + b) proportonal

More information

Problem Set 9 Solutions

Problem Set 9 Solutions Desgn and Analyss of Algorthms May 4, 2015 Massachusetts Insttute of Technology 6.046J/18.410J Profs. Erk Demane, Srn Devadas, and Nancy Lynch Problem Set 9 Solutons Problem Set 9 Solutons Ths problem

More information

EEE 241: Linear Systems

EEE 241: Linear Systems EEE : Lnear Systems Summary #: Backpropagaton BACKPROPAGATION The perceptron rule as well as the Wdrow Hoff learnng were desgned to tran sngle layer networks. They suffer from the same dsadvantage: they

More information

Welfare Properties of General Equilibrium. What can be said about optimality properties of resource allocation implied by general equilibrium?

Welfare Properties of General Equilibrium. What can be said about optimality properties of resource allocation implied by general equilibrium? APPLIED WELFARE ECONOMICS AND POLICY ANALYSIS Welfare Propertes of General Equlbrum What can be sad about optmalty propertes of resource allocaton mpled by general equlbrum? Any crteron used to compare

More information

Asymptotic Quantization: A Method for Determining Zador s Constant

Asymptotic Quantization: A Method for Determining Zador s Constant Asymptotc Quantzaton: A Method for Determnng Zador s Constant Joyce Shh Because of the fnte capacty of modern communcaton systems better methods of encodng data are requred. Quantzaton refers to the methods

More information

CONJUGACY IN THOMPSON S GROUP F. 1. Introduction

CONJUGACY IN THOMPSON S GROUP F. 1. Introduction CONJUGACY IN THOMPSON S GROUP F NICK GILL AND IAN SHORT Abstract. We complete the program begun by Brn and Squer of charactersng conjugacy n Thompson s group F usng the standard acton of F as a group of

More information

HMMT February 2016 February 20, 2016

HMMT February 2016 February 20, 2016 HMMT February 016 February 0, 016 Combnatorcs 1. For postve ntegers n, let S n be the set of ntegers x such that n dstnct lnes, no three concurrent, can dvde a plane nto x regons (for example, S = {3,

More information

Channel Encoder. Channel. Figure 7.1: Communication system

Channel Encoder. Channel. Figure 7.1: Communication system Chapter 7 Processes The model of a communcaton system that we have been developng s shown n Fgure 7.. Ths model s also useful for some computaton systems. The source s assumed to emt a stream of symbols.

More information

Feature Selection: Part 1

Feature Selection: Part 1 CSE 546: Machne Learnng Lecture 5 Feature Selecton: Part 1 Instructor: Sham Kakade 1 Regresson n the hgh dmensonal settng How do we learn when the number of features d s greater than the sample sze n?

More information

Bezier curves. Michael S. Floater. August 25, These notes provide an introduction to Bezier curves. i=0

Bezier curves. Michael S. Floater. August 25, These notes provide an introduction to Bezier curves. i=0 Bezer curves Mchael S. Floater August 25, 211 These notes provde an ntroducton to Bezer curves. 1 Bernsten polynomals Recall that a real polynomal of a real varable x R, wth degree n, s a functon of the

More information

Difference Equations

Difference Equations Dfference Equatons c Jan Vrbk 1 Bascs Suppose a sequence of numbers, say a 0,a 1,a,a 3,... s defned by a certan general relatonshp between, say, three consecutve values of the sequence, e.g. a + +3a +1

More information

ENTROPIC QUESTIONING

ENTROPIC QUESTIONING ENTROPIC QUESTIONING NACHUM. Introucton Goal. Pck the queston that contrbutes most to fnng a sutable prouct. Iea. Use an nformaton-theoretc measure. Bascs. Entropy (a non-negatve real number) measures

More information

Scalar and Vector Quantization

Scalar and Vector Quantization Scalar and Vector Quantzaton Máro A. T. Fgueredo, Departamento de Engenhara Electrotécnca e de Computadores, Insttuto Superor Técnco, Lsboa, Portugal maro.fgueredo@tecnco.ulsboa.pt November 207 Quantzaton

More information

On the set of natural numbers

On the set of natural numbers On the set of natural numbers by Jalton C. Ferrera Copyrght 2001 Jalton da Costa Ferrera Introducton The natural numbers have been understood as fnte numbers, ths wor tres to show that the natural numbers

More information

Games of Threats. Elon Kohlberg Abraham Neyman. Working Paper

Games of Threats. Elon Kohlberg Abraham Neyman. Working Paper Games of Threats Elon Kohlberg Abraham Neyman Workng Paper 18-023 Games of Threats Elon Kohlberg Harvard Busness School Abraham Neyman The Hebrew Unversty of Jerusalem Workng Paper 18-023 Copyrght 2017

More information