Lec 02 Entropy and Lossless Coding I


1 Multimedia Communication, Fall 2018. Lec 02: Entropy and Lossless Coding I. Zhu Li.

2 Outline: Lecture 01 Recap; Info Theory on Entropy; Lossless Entropy Coding.

3 Video Compression in Summary. (Figure: the compression pipeline.)

4 Video Coding Standards: Rate-Distortion Performance (Pre-HEVC).

5 PSS over Managed IP Networks. Managed mobile core IP networks.

6 MPEG DASH: OTT HTTP Adaptive Streaming of Video.

7 Outline: Lecture 01 Recap; Info Theory on Entropy: self-info of an event, entropy of the source, relative entropy, mutual info; Entropy Coding. Thanks to SFU's Prof. Jie Liang for his slides.

8 Entropy and its Application. Entropy coding is the last part of a compression system (encoder pipeline: Transform → Quantization → Entropy coding); it losslessly represents the symbols. Key idea: assign short codes to common symbols and long codes to rare symbols. Question: how do we evaluate a compression method? We need to know the lower bound we can achieve: the entropy.

9 Claude Shannon. A distant relative of Thomas Edison. 1932: went to the University of Michigan. 1937: his master's thesis at MIT became the foundation of digital circuit design: "the most important, and also the most famous, master's thesis of the century". 1940: PhD, MIT. 1941-1956: Bell Labs; back to MIT after that. 1948: the birth of Information Theory: "A Mathematical Theory of Communication", Bell System Technical Journal.

10 Axiomatic Definition of Information. Information is a measure of uncertainty or surprise. Axiom 1: the information of an event is a function of its probability, I(A) = f(P(A)). What is the expression of f? Axiom 2: rare events have high information content ("Water found on Mars!!!"), while common events have low information content ("It is raining in Vancouver"), so information should be a decreasing function of the probability. That still leaves numerous choices of f. Axiom 3: the information of two independent events is the sum of the individual information: if P(AB) = P(A)P(B), then I(AB) = I(A) + I(B). Only the logarithmic function satisfies all of these conditions.

11 Self-information. Shannon's definition [1948]: let X be a discrete random variable with alphabet {A_1, A_2, ..., A_N} and probability mass function p(i) = Pr{X = A_i}. The self-information of an event X = A_i is $I(i) = \log_b \frac{1}{p(i)} = -\log_b p(i)$. If b = 2, the unit of information is the bit. Self-information indicates the number of bits needed to represent an event. (Figure: $-\log_b p$ as a decreasing function of p over [0, 1].)

12 Entropy of a Random Variable. $H(X) = \sum_i p(i) \log \frac{1}{p(i)}$, also written H(p): it is a function of the distribution of X, not of the values of X. Recall the mean of a function g(X): $E[g(X)] = \sum_i g(i)\, p(i)$. Entropy is the expected self-information of the r.v. X: $H(X) = E\left[\log \frac{1}{p(X)}\right] = -E[\log p(X)]$. The entropy represents the minimal number of bits needed to losslessly represent one output of the source.

13 Example. P(X=0) = 1/2, P(X=1) = 1/4, P(X=2) = 1/8, P(X=3) = 1/8. Find the entropy of X. Solution: $H(X) = \sum_i p(i)\log\frac{1}{p(i)} = \frac{1}{2}\log 2 + \frac{1}{4}\log 4 + \frac{1}{8}\log 8 + \frac{1}{8}\log 8 = \frac{1}{2} + \frac{1}{2} + \frac{3}{8} + \frac{3}{8} = 1.75$ bits/sample.
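A quick numerical check of this example (a minimal sketch, not from the slides; the helper name is mine):

```python
import math

def entropy(probs):
    """H(X) = -sum_i p(i) log2 p(i), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1/2, 1/4, 1/8, 1/8]))  # -> 1.75 bits/sample
```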

14 Example. A binary source has only two possible outputs, 0 and 1: P(X=0) = p, P(X=1) = 1 - p. Entropy of X: $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$. H = 0 when p = 0 or p = 1 (fixed output, no information). H is largest when p = 1/2 (highest uncertainty); H = 1 bit in this case. Properties: H ≥ 0; H is concave (proved later). (Figure: the binary entropy function; equal probabilities maximize the entropy.)
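The binary entropy function is easy to probe directly (a minimal sketch; the function name is mine):

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):   # a fixed output carries no information
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # -> 1.0 bit, the maximum
print(binary_entropy(0.9))  # -> ~0.469 bits
```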

15 Joint Entropy. We can get a better understanding of the source S by looking at a block of outputs X_1 X_2 ... X_n. The joint probability of a block of outputs is $p(x_1, \ldots, x_n) = P(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n)$. Joint entropy: $H(X_1, \ldots, X_n) = -\sum_{x_1}\sum_{x_2}\cdots\sum_{x_n} p(x_1,\ldots,x_n)\log p(x_1,\ldots,x_n) = -E[\log p(X_1,\ldots,X_n)]$. The joint entropy is the number of bits required to represent the sequence X_1 X_2 ... X_n: this is the lower bound for entropy coding.

16 Conditional Entropy. The conditional self-information of an event X = x, given that the event Y = y has occurred, is $I(x\mid y) = \log\frac{1}{p(x\mid y)} = \log\frac{p(y)}{p(x,y)}$. Note that p(x, y), p(y), and p(x|y) are three different distributions. Conditional entropy H(Y|X): the average conditional self-information, i.e. the remaining uncertainty about Y given the knowledge of X: $H(Y\mid X) = \sum_x p(x)\, H(Y\mid X=x) = -\sum_x p(x)\sum_y p(y\mid x)\log p(y\mid x) = -\sum_x\sum_y p(x,y)\log p(y\mid x) = -E[\log p(Y\mid X)]$.

17 Chain Rule. H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y). Proof: $H(X,Y) = -\sum_x\sum_y p(x,y)\log p(x,y) = -\sum_x\sum_y p(x,y)\log\big(p(x)\,p(y\mid x)\big) = -\sum_x p(x)\log p(x) - \sum_x\sum_y p(x,y)\log p(y\mid x) = H(X) + H(Y\mid X)$. (Venn diagram: the total area is H(X, Y); the circles H(X) and H(Y) overlap, leaving H(X|Y) and H(Y|X) on the two sides.) Simpler notation: $H(X,Y) = -E[\log p(X,Y)] = -E[\log p(X) + \log p(Y\mid X)] = H(X) + H(Y\mid X)$.

18 Conditional Entropy. Example: for the following joint distribution p(x, y), find H(X|Y).

p(x, y)   X=1    X=2    X=3    X=4
Y=1:      1/8    1/16   1/32   1/32
Y=2:      1/16   1/8    1/32   1/32
Y=3:      1/16   1/16   1/16   1/16
Y=4:      1/4    0      0      0

Marginals: P(X) = [1/2, 1/4, 1/8, 1/8], so H(X) = 7/4 bits; P(Y) = [1/4, 1/4, 1/4, 1/4], so H(Y) = 2 bits. The joint entropy is H(X, Y) = 27/8 bits. Indeed, H(X|Y) = H(X, Y) - H(Y) = 27/8 - 2 = 11/8 bits.
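These numbers can be checked mechanically from the table (a minimal sketch, assuming the table as reconstructed above; helper names are mine):

```python
import math
from fractions import Fraction as F

# joint distribution p(x, y); rows are Y = 1..4, columns are X = 1..4
joint = [
    [F(1, 8),  F(1, 16), F(1, 32), F(1, 32)],
    [F(1, 16), F(1, 8),  F(1, 32), F(1, 32)],
    [F(1, 16), F(1, 16), F(1, 16), F(1, 16)],
    [F(1, 4),  F(0),     F(0),     F(0)    ],
]

def H(probs):
    return -sum(float(p) * math.log2(float(p)) for p in probs if p > 0)

px = [sum(row[j] for row in joint) for j in range(4)]  # marginal of X
py = [sum(row) for row in joint]                       # marginal of Y
hxy = H([p for row in joint for p in row])             # joint entropy H(X, Y)

print(H(px), H(py), hxy)  # 1.75, 2.0, 3.375  (= 7/4, 2, 27/8)
print(hxy - H(py))        # H(X|Y) = 1.375 = 11/8 bits
```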

19 General Chain Rule. Adding H(Z): H(X, Y | Z) + H(Z) = H(X, Y, Z) = H(Z) + H(X | Z) + H(Y | X, Z). General form of the chain rule: $H(X_1, X_2, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_{i-1}, \ldots, X_1)$. The joint encoding of a sequence can thus be broken into the sequential encoding of each sample, e.g. H(X_1, X_2, X_3) = H(X_1) + H(X_2 | X_1) + H(X_3 | X_2, X_1). Advantages: joint encoding needs the joint probability, which is difficult to estimate; sequential encoding only needs conditional probabilities, and we can use local neighbors to approximate them (context-adaptive arithmetic coding).

20 General Chain Rule. $p(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} p(x_i \mid x_{i-1}, \ldots, x_1)$. Proof: $H(X_1,\ldots,X_n) = -\sum p(x_1,\ldots,x_n)\log p(x_1,\ldots,x_n) = -\sum p(x_1,\ldots,x_n)\log\prod_{i=1}^{n} p(x_i\mid x_{i-1},\ldots,x_1) = -\sum_{i=1}^{n}\sum p(x_1,\ldots,x_n)\log p(x_i\mid x_{i-1},\ldots,x_1) = \sum_{i=1}^{n} H(X_i\mid X_{i-1},\ldots,X_1)$.

21 General Chain Rule. The complexity of the conditional probability $p(x_i \mid x_{i-1}, \ldots, x_1)$ grows as i increases. In many cases we can approximate the conditional probability with some nearest neighbors (contexts): $p(x_i\mid x_{i-1},\ldots,x_1) \approx p(x_i\mid x_{i-1},\ldots,x_{i-L})$. (Example context string: a b c b c a b c b a b c b a.) The low-dimensional conditional probability is more manageable. How do we measure the quality of the approximation? Relative entropy.

22 Relative Entropy: the Cost of Coding with the Wrong Distribution. $D(p\|q) = \sum_x p(x)\log\frac{p(x)}{q(x)} = E_p\left[\log\frac{p(X)}{q(X)}\right]$. Also known as the Kullback-Leibler (K-L) distance, information divergence, or information gain. A measure of the distance between two distributions. In many applications the true distribution p(x) is unknown and we only know an estimated distribution q(x). What is the inefficiency in representing X? The true entropy: $R_1 = -\sum_x p(x)\log p(x)$. The actual rate: $R_2 = -\sum_x p(x)\log q(x)$. The difference: $R_2 - R_1 = D(p\|q)$.
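The overhead interpretation can be verified directly (a minimal sketch; the particular p and q below are mine, chosen for round numbers):

```python
import math

def kl(p, q):
    """D(p||q) in bits; requires q(i) > 0 wherever p(i) > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [1/2, 1/4, 1/8, 1/8]  # true distribution
q = [1/4, 1/4, 1/4, 1/4]  # wrong (estimated) distribution
r1 = -sum(pi * math.log2(pi) for pi in p)              # true entropy
r2 = -sum(pi * math.log2(qi) for pi, qi in zip(p, q))  # actual rate

print(kl(p, q))  # 0.25 bits of overhead per symbol
print(r2 - r1)   # also 0.25: R2 - R1 = D(p||q)
```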

23 Relative Entropy. $D(p\|q) = \sum_x p(x)\log\frac{p(x)}{q(x)} = E_p\left[\log\frac{p(X)}{q(X)}\right]$. Properties: $D(p\|q) \ge 0$ (proved later); $D(p\|q) = 0$ if and only if q = p. What if p(x) > 0 but q(x) = 0 for some x? Then $D(p\|q) = \infty$. Caution: D(p||q) is not a true distance: it is not symmetric in general ($D(p\|q) \ne D(q\|p)$) and it does not satisfy the triangle inequality.

24 Relative Entropy. How to make it symmetric? There are many possibilities, for example the symmetrized divergence $\frac{1}{2}\big(D(p\|q) + D(q\|p)\big)$, or other combinations of $D(p\|q)$ and $D(q\|p)$. Such symmetrized divergences can be useful for pattern classification.

25 Mutual Information. Recall the conditional self-information: $I(x\mid y) = -\log p(x\mid y)$. Mutual information between two events: $I(x; y) = I(x) - I(x\mid y) = \log\frac{p(x\mid y)}{p(x)} = \log\frac{p(x,y)}{p(x)\,p(y)}$. It is a measure of the amount of information that one event contains about another one, or the reduction in the uncertainty of one event due to the knowledge of the other. Note: I(x; y) can be negative, if p(x|y) < p(x).

26 Mutual Information. I(X; Y): the mutual information between two random variables: $I(X;Y) = \sum_x\sum_y p(x,y)\, I(x;y) = \sum_x\sum_y p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)} = D\big(p(x,y)\,\|\,p(x)p(y)\big) = E_{p(x,y)}\left[\log\frac{p(X,Y)}{p(X)\,p(Y)}\right]$. Mutual information is a relative entropy, but it is symmetric: I(X; Y) = I(Y; X). Unlike I(x; y), I(X; Y) ≥ 0, due to the averaging. If X and Y are independent, p(x, y) = p(x)p(y) and I(X; Y) = 0: knowing X does not reduce the uncertainty of Y.

27 Entropy and Mutual Information. 1. I(X; Y) = H(X) - H(X|Y): $I(X;Y) = \sum_x\sum_y p(x,y)\log\frac{p(x,y)}{p(x)p(y)} = \sum_x\sum_y p(x,y)\log\frac{p(x\mid y)}{p(x)} = -\sum_x\sum_y p(x,y)\log p(x) + \sum_x\sum_y p(x,y)\log p(x\mid y) = H(X) - H(X\mid Y)$. 2. Similarly: I(X; Y) = H(Y) - H(Y|X). 3. I(X; Y) = H(X) + H(Y) - H(X, Y). Proof: expand the definition: $I(X;Y) = \sum_x\sum_y p(x,y)\big(\log p(x,y) - \log p(x) - \log p(y)\big) = H(X) + H(Y) - H(X,Y)$.
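A numerical sanity check of identity 3, reusing the slide-18 joint table (a minimal sketch; helper names are mine):

```python
import math

joint = [
    [1/8,  1/16, 1/32, 1/32],
    [1/16, 1/8,  1/32, 1/32],
    [1/16, 1/16, 1/16, 1/16],
    [1/4,  0.0,  0.0,  0.0 ],
]

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [sum(row[j] for row in joint) for j in range(4)]
py = [sum(row) for row in joint]
hxy = H([p for row in joint for p in row])

# I(X;Y) from the definition D(p(x,y) || p(x) p(y)):
mi = sum(joint[i][j] * math.log2(joint[i][j] / (px[j] * py[i]))
         for i in range(4) for j in range(4) if joint[i][j] > 0)

print(mi)                   # 0.375 bits
print(H(px) + H(py) - hxy)  # also 0.375: H(X) + H(Y) - H(X, Y)
```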

28 Entropy and Mutual Information. (Venn diagram: the total area is H(X, Y); the circles H(X) and H(Y) overlap in I(X; Y), leaving H(X|Y) and H(Y|X) on the two sides.) It can be seen from this figure that I(X; X) = H(X). Proof: let X = Y in I(X; Y) = H(X) + H(Y) - H(X, Y), or in I(X; Y) = H(X) - H(X|Y), and use H(X|X) = 0.

29 Application of Mutual Information. (Example context string: a b c b c a b c b a b c b a.) Mutual information can be used in the optimization of context quantization. Example: if each neighbor has 26 possible values (a to z), then 5 neighbors have 26^5 combinations: too many conditional probabilities to estimate. To reduce the number, we can group similar data patterns together (context quantization): $p(x_i\mid x_{i-1},\ldots) \approx p\big(x_i\mid f(x_{i-1},\ldots,x_{i-L})\big)$.

30 Application of Mutual Information. $H(X_1, X_2, \ldots, X_n) = \sum_i H(X_i\mid X_{i-1},\ldots,X_1)$, with $p(x_i\mid x_{i-1},\ldots) \approx p(x_i\mid f(x_{i-1},\ldots))$. We need to design the function f to minimize the conditional entropy $H\big(X_i \mid f(X_{i-1},\ldots,X_1)\big)$. But H(X|Y) = H(X) - I(X; Y), so the problem is equivalent to maximizing the mutual information between $X_i$ and $f(X_{i-1},\ldots)$. For further info: Liu and Karam, "Mutual Information-Based Analysis of JPEG2000 Contexts", IEEE Trans. Image Processing, vol. 14, no. 4, April 2005.

31 Outline. Lecture 01 Recap. Info Theory and Entropy. Entropy Coding: Prefix Coding, Kraft-McMillan Inequality, Shannon Codes.

32 Variable Length Coding. Design the mapping from source symbols to codewords: a lossless mapping in which different codewords may have different lengths. Goal: minimize the average codeword length. The entropy is the lower bound.

33 Classes of Codes. Non-singular code: different inputs are mapped to different codewords (invertible). Uniquely decodable code: any encoded string has only one possible source string, but it may need delay to decode. Prefix-free code (or simply prefix code, or instantaneous code): no codeword is a prefix of any other codeword; the focus of our studies. Questions: What characterizes such codes? How do we design them? Are they optimal? (Nesting: all codes ⊃ non-singular codes ⊃ uniquely decodable codes ⊃ prefix-free codes.)

34 Prefix Code Examples. (Table: for each symbol X, four example codes: a singular code; a non-singular but not uniquely decodable code, which needs punctuation to parse; a uniquely decodable but not prefix-free code, where one needs to look at the next bit to decode the previous codeword; and a prefix-free code.)

35 Carter-Gill's Conjecture [1974]. Every uniquely decodable code can be replaced by a prefix-free code with the same set of codeword compositions. So we only need to study prefix-free codes.

36 Prefix-free Code. Can be uniquely decoded. No codeword is a prefix of another one. Also called a prefix code. Goal: construct the prefix code with minimal expected length. We can put all codewords in a binary tree (root node, internal nodes, leaf nodes; branches labeled 0 and 1): a prefix-free code contains leaves only. How do we express this requirement mathematically?
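The prefix-free property itself is easy to test on a candidate codebook (a minimal sketch; the function name is mine):

```python
def is_prefix_free(codes):
    """True if no codeword is a prefix of another.

    After sorting, any codeword that is a prefix of another sorts
    immediately before one of its extensions, so checking adjacent
    pairs suffices.
    """
    codes = sorted(codes)
    return all(not b.startswith(a) for a, b in zip(codes, codes[1:]))

print(is_prefix_free(["0", "10", "110", "111"]))  # True
print(is_prefix_free(["0", "01", "11"]))          # False: "0" prefixes "01"
```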

37 Kraft-McMillan Inequality. The characteristic of prefix-free codes: the codeword lengths $l_i$, i = 1, ..., N, of a prefix code over an alphabet of size D = 2 satisfy the inequality $\sum_{i=1}^{N} 2^{-l_i} \le 1$. Conversely, if a set of lengths $\{l_i\}$ satisfies the inequality above, then there exists a prefix code with codeword lengths $l_i$, i = 1, ..., N.

38 Kraft-McMillan Inequality. Consider D = 2 and expand the binary code tree to the full depth $L = \max_i l_i$, where $l_i$ is the code length (L = 3 in the example below). Then $\sum_{i=1}^{N} 2^{-l_i} \le 1 \Leftrightarrow \sum_{i=1}^{N} 2^{L-l_i} \le 2^L$. Example: {0, 10, 110, 111}. The number of nodes in the last level is 2^3 = 8. Each codeword corresponds to a sub-tree, and the number of its offspring in the last level is $2^{L-l_i}$: here {4, 2, 1, 1}. The K-M inequality says that the number of L-th level offspring of all codewords is at most 2^L.

39 Kraft-McMillan Inequality. K-M inequality: $\sum_i 2^{-l_i} \le 1$. Invalid code: {0, 10, 11, 110, 111}, with l = [1, 2, 2, 3, 3]: it leads to more than 2^L offspring, 4 + 2 + 2 + 1 + 1 = 10 > 2^3 = 8.
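Checking the inequality for both length sets used on these slides (a minimal sketch):

```python
def kraft_sum(lengths, D=2):
    """Sum of D**(-l) over the codeword lengths."""
    return sum(D ** -l for l in lengths)

print(kraft_sum([1, 2, 3, 3]))     # 1.0  -> a prefix code exists, e.g. {0, 10, 110, 111}
print(kraft_sum([1, 2, 2, 3, 3]))  # 1.25 -> violates the inequality; no prefix code
```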

40 Extended Kraft Inequality. A countably infinite prefix code (one with an infinite number of codewords) also satisfies the Kraft inequality: $\sum_{i=1}^{\infty} 2^{-l_i} \le 1$. Example: {0, 10, 110, 1110, 11110, ...} (Golomb-Rice code, next lecture). Each codeword can be mapped to a subinterval of [0, 1] that is disjoint from the others (revisited in arithmetic coding).

41 Optimal Codes (Advanced Topic). How do we design the prefix code with the minimal expected length? Optimization problem: find $\{l_i\}$ to minimize $\sum_i p_i l_i$ subject to $\sum_i D^{-l_i} \le 1$. Lagrangian solution: ignore the integer codeword length constraint for now and assume equality holds in the Kraft inequality; minimize $J = \sum_i p_i l_i + \lambda \sum_i D^{-l_i}$.

42 Optimal Codes. $J = \sum_i p_i l_i + \lambda\sum_i D^{-l_i}$. Setting $\frac{\partial J}{\partial l_i} = p_i - \lambda(\ln D)\,D^{-l_i} = 0$ gives $D^{-l_i} = \frac{p_i}{\lambda \ln D}$. Substituting into $\sum_i D^{-l_i} = 1$ gives $\lambda = \frac{1}{\ln D}$, so $D^{-l_i} = p_i$, or $l_i^* = -\log_D p_i$. The optimal codeword length is the self-information of the event. Expected codeword length: $L^* = \sum_i p_i l_i^* = -\sum_i p_i \log_D p_i = H_D(X)$: the entropy of X!

43 Optimal Code. Theorem: the expected length L of any prefix code is greater than or equal to the entropy, $L \ge H_D(X)$, with equality iff $D^{-l_i} = p_i$ ($l_i^* = -\log_D p_i$ is not an integer in general; equality holds when p is D-adic, e.g. {1/2, 1/4, 1/8, 1/16, ...}). Proof: $L - H_D(X) = \sum_i p_i l_i + \sum_i p_i \log_D p_i = -\sum_i p_i \log_D D^{-l_i} + \sum_i p_i \log_D p_i = \sum_i p_i \log_D \frac{p_i}{D^{-l_i}}$. This reminds us of the definition of the relative entropy D(p||q), but we need to normalize the $D^{-l_i}$.

44 Optimal Code. Let $c = \sum_{j=1}^{N} D^{-l_j}$ and $q_i = D^{-l_i}/c$, so that q is a distribution. Then $L - H_D(X) = D(p\|q) + \log_D\frac{1}{c} \ge 0$, because $D(p\|q) \ge 0$ and $c = \sum_{i=1}^{N} D^{-l_i} \le 1$ for a prefix code. The equality holds iff both terms are 0: $D^{-l_i} = p_i$, i.e. $-\log_D p_i$ is an integer for every i.

45 Optimal Code. D-adic: a probability distribution is called D-adic with respect to D if each probability is equal to $D^{-n}$ for some integer n. Example: D = 2, P = {1/2, 1/4, 1/8, 1/8}. Therefore optimality can be achieved by a prefix code iff the distribution is D-adic. Previous example: $-\log_D p_i = \{1, 2, 3, 3\}$; possible codewords: {0, 10, 110, 111}.

46 Shannon Code: Bounds on the Optimal Code. The optimal code length $l_i^* = -\log_D p_i$ is not an integer in general, but practical codewords have to have integer lengths. Shannon code: $l_i = \left\lceil \log_D \frac{1}{p_i} \right\rceil$. Is this a valid prefix code? Check the Kraft inequality: $\sum_i D^{-\lceil \log_D(1/p_i)\rceil} \le \sum_i D^{-\log_D(1/p_i)} = \sum_i p_i = 1$. Yes! Since $\log_D\frac{1}{p_i} \le l_i < \log_D\frac{1}{p_i} + 1$, the expected length satisfies $H_D(X) \le L < H_D(X) + 1$. This is just one choice, and it may not be optimal (see the example later).
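Shannon code lengths for the running example distribution (a minimal sketch; the helper name is mine):

```python
import math

def shannon_lengths(probs):
    """l_i = ceil(log2(1/p_i)) for each symbol with p_i > 0."""
    return [math.ceil(-math.log2(p)) for p in probs if p > 0]

p = [1/2, 1/4, 1/8, 1/8]
ls = shannon_lengths(p)
print(ls)                                   # [1, 2, 3, 3]: dyadic p, so optimal here
print(sum(2 ** -l for l in ls))             # Kraft sum 1.0 <= 1: a valid prefix code
print(sum(pi * l for pi, l in zip(p, ls)))  # expected length 1.75 = H(X)
```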

47 Optimal Code. The optimal code with integer lengths should be at least as good as the Shannon code: $H_D(X) \le L^* < H_D(X) + 1$. To reduce the overhead per symbol, encode a block of symbols $\{x_1, x_2, \ldots, x_n\}$ together: $L_n = \frac{1}{n}\sum p(x_1,\ldots,x_n)\, l(x_1,\ldots,x_n) = \frac{1}{n} E[l(X_1,\ldots,X_n)]$, with $H(X_1,\ldots,X_n) \le E[l(X_1,\ldots,X_n)] < H(X_1,\ldots,X_n) + 1$. Assuming i.i.d. samples, $H(X_1,\ldots,X_n) = nH(X)$, so $H(X) \le L_n < H(X) + \frac{1}{n}$. Hence $L_n \to H(X)$ (the entropy rate, if the source is stationary).
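The shrinking 1/n overhead can be seen numerically (a minimal sketch with an assumed i.i.d. binary source; per-block lengths are Shannon lengths):

```python
import math
from itertools import product

p = {"0": 0.9, "1": 0.1}
H = -sum(v * math.log2(v) for v in p.values())

for n in (1, 2, 4, 8):
    rate = sum(
        math.prod(p[s] for s in block)
        * math.ceil(-math.log2(math.prod(p[s] for s in block)))
        for block in product(p, repeat=n)
    ) / n
    print(n, round(rate, 4))  # per-symbol rate decreases with block size n

print(round(H, 4))            # toward H ~ 0.469 bits/sample
```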

48 Optimal Code. Impact of the wrong pdf: what is the penalty if the pdf we use is different from the true pdf? True pdf: p(x); codeword lengths chosen from the estimated pdf q(x): $l(x) = \left\lceil \log\frac{1}{q(x)} \right\rceil$. Expected length: $H(p) + D(p\|q) \le E_p[l(X)] < H(p) + D(p\|q) + 1$. Proof (upper bound), assuming a Shannon code w.r.t. q: $E_p[l(X)] = \sum_x p(x)\left\lceil\log\frac{1}{q(x)}\right\rceil < \sum_x p(x)\left(\log\frac{1}{q(x)} + 1\right) = \sum_x p(x)\log\frac{p(x)}{q(x)} + \sum_x p(x)\log\frac{1}{p(x)} + 1 = D(p\|q) + H(p) + 1$. The lower bound is derived similarly.
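A quick check of the bound, reusing the slide-22 pair p, q (a minimal sketch):

```python
import math

p = [1/2, 1/4, 1/8, 1/8]  # true pdf
q = [1/4, 1/4, 1/4, 1/4]  # wrong pdf used to pick the lengths

H = -sum(pi * math.log2(pi) for pi in p)
D = sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q))
El = sum(pi * math.ceil(math.log2(1 / qi)) for pi, qi in zip(p, q))

print(H + D, El, H + D + 1)  # 2.0 <= 2.0 < 3.0 (q is dyadic, so the lower bound is tight)
```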

49 Shannon Code is not Optimal. Example: a binary r.v. X with p(0) = 0.9999, p(1) = 0.0001. Entropy: ≈ 0.00147 bits/sample. Assign binary codewords by the Shannon code: $l_1 = \lceil \log_2\frac{1}{0.9999}\rceil = 1$, $l_2 = \lceil\log_2\frac{1}{0.0001}\rceil = \lceil 13.29\rceil = 14$. Expected length: 0.9999 × 1 + 0.0001 × 14 = 1.0013, within the range [H(X), H(X) + 1). But we can easily beat this with the code {0, 1}: expected length 1.
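Verifying the numbers in this example (a minimal sketch):

```python
import math

p0, p1 = 0.9999, 0.0001
H = -(p0 * math.log2(p0) + p1 * math.log2(p1))
l1 = math.ceil(math.log2(1 / p0))  # 1
l2 = math.ceil(math.log2(1 / p1))  # 14

print(H)                  # ~0.00147 bits/sample
print(p0 * l1 + p1 * l2)  # 1.0013, vs. 1.0 for the trivial code {0, 1}
```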

50 Summary. Entropy: $H(X) = -\sum_i p(i)\log p(i)$. Conditional entropy: $H(X\mid Y) = -E[\log p(X\mid Y)]$. Mutual info: $I(X;Y) = H(X) + H(Y) - H(X,Y)$. Relative entropy: $D(p\|q) = \sum_x p(x)\log\frac{p(x)}{q(x)}$. K-M inequality: $\sum_i D^{-l_i} \le 1$. Bounds on prefix codes: $H_D(X) \le L < H_D(X) + 1$.
