EGR 544 Communication Theory


EGR 544 Communication Theory
Information Sources
Z. Aliyazicioglu
Electrical and Computer Engineering Department, Cal Poly Pomona

Introduction

An information source emits a sequence of letters $x_n$. Information sources are of two kinds: analog sources and discrete sources. Source data is usually time-varying and unpredictable. Digital communication systems transmit information in digital form, so the source output must first be converted to a format that can be transmitted digitally; this conversion is performed by the source encoder. We begin by developing mathematical models for information sources.

Information Sources

The set $X$ is the finite source alphabet, with $x_k \in X$: an alphabet of $L$ possible letters $\{x_1, x_2, \ldots, x_L\}$. If the source is binary, $X = \{0, 1\}$. Each letter in the alphabet has a given probability

$$p_k = P(X = x_k), \qquad \sum_k p_k = 1$$

Memoryless (DMS, discrete memoryless source): the output sequence from the source is statistically independent; the current output letter is statistically independent of all past and future outputs.

Stationary: the output letters from the source are statistically dependent, with joint probabilities that are invariant to a shift in the time origin.

Measure of Information

Consider two discrete random variables with possible outcomes $x_i$, $i = 1, 2, \ldots, n$, and $y_j$, $j = 1, 2, \ldots, m$. Let $P(X = x_i)$ denote the probability of the event $X = x_i$, and $P(X = x_i \mid Y = y_j)$ the conditional probability of the event $X = x_i$ given that $Y = y_j$ occurred. The mutual information between $x_i$ and $y_j$ is

$$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)}$$

If the base of the log is 2, the unit of $I(x_i; y_j)$ is bits; if the base is $e$, the unit is nats.
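As a numerical illustration of this definition (a minimal sketch of my own, not from the slides; the function name and the probabilities are assumptions), the following Python snippet evaluates $I(x_i; y_j)$ in bits and in nats:

```python
import math

def mutual_information(p_x, p_x_given_y, base=2.0):
    """I(x_i; y_j) = log( P(x_i|y_j) / P(x_i) ): base 2 gives bits, base e gives nats."""
    return math.log(p_x_given_y / p_x, base)

# Hypothetical numbers: observing y_j raises the probability of x_i from 1/4 to 1/2.
print(mutual_information(0.25, 0.5))           # 1.0 bit
print(mutual_information(0.25, 0.5, math.e))   # ~0.693 nat
```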

Measure of Information

Since

$$\frac{P(x_i \mid y_j)}{P(x_i)} = \frac{P(x_i, y_j)}{P(x_i)P(y_j)} = \frac{P(y_j \mid x_i)}{P(y_j)}$$

it follows that $I(x_i; y_j) = I(y_j; x_i)$: the information provided by the occurrence of the event $Y = y_j$ about the event $X = x_i$ is identical to the information provided by the occurrence of the event $X = x_i$ about the event $Y = y_j$.

If the random variables $X$ and $Y$ are statistically independent, then $P(x_i \mid y_j) = P(x_i)$ and

$$I(x_i; y_j) = \log 1 = 0$$

If the occurrence of the event $Y = y_j$ uniquely determines the event $X = x_i$, so that $P(x_i \mid y_j) = 1$, then

$$I(x_i; y_j) = \log \frac{1}{P(x_i)} = -\log P(x_i)$$

This quantity is called the self-information of the event $X = x_i$ and is written

$$I(x_i) = -\log P(x_i)$$

A high-probability event carries less self-information than a low-probability event. The conditional self-information is

$$I(x_i \mid y_j) = \log \frac{1}{P(x_i \mid y_j)} = -\log P(x_i \mid y_j)$$

so that

$$I(x_i; y_j) = I(x_i) - I(x_i \mid y_j)$$

Example: Two bits of information $\{00, 01, 10, 11\}$, where each event has probability 1/4 of occurrence:

$$I(x_i) = -\log_2 P(x_i) = -\log_2 (1/4) = 2 \text{ bits}$$
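The two-bit example can be checked the same way; a small sketch of mine computing the self-information of each equally likely pattern:

```python
import math

def self_information(p, base=2):
    """I(x_i) = -log P(x_i)."""
    return -math.log(p, base)

# Four equally likely two-bit patterns, each with probability 1/4:
for pattern in ("00", "01", "10", "11"):
    print(pattern, self_information(1 / 4))   # 2.0 bits each
```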

Measure of Information

Example: $X$ and $Y$ are binary-valued $\{0, 1\}$ random variables that represent the input and output of a binary-input, binary-output channel. The input symbols are equally likely, and the output conditional probabilities are

$$P(Y = 0 \mid X = 0) = 1 - p_0, \qquad P(Y = 1 \mid X = 0) = p_0$$
$$P(Y = 1 \mid X = 1) = 1 - p_1, \qquad P(Y = 0 \mid X = 1) = p_1$$

The mutual information about the occurrence of the event $X = 0$, given that $Y = 0$ is observed: since

$$P(Y = 0) = P(Y = 0 \mid X = 0)P(X = 0) + P(Y = 0 \mid X = 1)P(X = 1) = \tfrac{1}{2}(1 - p_0 + p_1)$$

we have

$$I(x_1; y_1) = I(0; 0) = \log_2 \frac{P(Y = 0 \mid X = 0)}{P(Y = 0)} = \log_2 \frac{2(1 - p_0)}{1 - p_0 + p_1}$$

The mutual information about the occurrence of the event $X = 1$, given that $Y = 0$ is observed: with

$$P(Y = 1) = P(Y = 1 \mid X = 0)P(X = 0) + P(Y = 1 \mid X = 1)P(X = 1) = \tfrac{1}{2}(1 + p_0 - p_1)$$

$$I(x_2; y_1) = I(1; 0) = \log_2 \frac{P(Y = 0 \mid X = 1)}{P(Y = 0)} = \log_2 \frac{2 p_1}{1 - p_0 + p_1}$$

For a symmetric channel ($p_0 = p_1 = p$):

$$p = 0: \quad I(0; 0) = \log_2 2 = 1 \text{ bit}$$
$$p = \tfrac{1}{2}: \quad I(0; 0) = \log_2 1 = 0$$
$$p = \tfrac{1}{4}: \quad I(0; 0) = \log_2 \tfrac{3}{2} \approx 0.585 \text{ bit}$$
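A short numerical check of these expressions (my own sketch; the function name and the symmetric-channel assumption $p_0 = p_1 = p$ are mine) reproduces the three values above:

```python
import math

def I_00(p0, p1):
    """I(0;0) = log2( P(Y=0|X=0) / P(Y=0) ) for equally likely binary inputs."""
    p_y0 = 0.5 * ((1 - p0) + p1)        # P(Y=0) by total probability
    return math.log2((1 - p0) / p_y0)

for p in (0.0, 0.25, 0.5):              # symmetric channel: p0 = p1 = p
    print(p, I_00(p, p))                # 1.0, ~0.585, 0.0 bits
```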

Average Mutual Information and Entropy

For the pairs of events $(x_i, y_j)$ of two random variables $X$ and $Y$, the average mutual information is given by

$$I(X; Y) = \sum_{i=1}^{n} \sum_{j=1}^{m} P(x_i, y_j)\, I(x_i; y_j) = \sum_{i=1}^{n} \sum_{j=1}^{m} P(x_i, y_j) \log \frac{P(x_i, y_j)}{P(x_i)P(y_j)}$$

$I(X; Y) \geq 0$ always, and if $X$ and $Y$ are statistically independent, $I(X; Y) = 0$. The average self-information is given by

$$H(X) = \sum_{i=1}^{n} P(x_i)\, I(x_i) = -\sum_{i=1}^{n} P(x_i) \log P(x_i)$$

This is called the entropy of the random variable $X$.

If a source has $n$ different letters and each letter has the same probability $P(x_i) = 1/n$ for all $i$, then

$$H(X) = \sum_{i=1}^{n} \frac{1}{n} \log n = \log n$$

Let us define the entropy $H(X)$ of the binary variable $X = \{0, 1\}$ for given $P(X = 0) = q$ and $P(X = 1) = 1 - q$:

$$H(X) = -\{P(X = 0)\log P(X = 0) + P(X = 1)\log P(X = 1)\} = -q \log q - (1 - q)\log(1 - q) \equiv H(q)$$
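The entropy definition translates directly into code; a minimal sketch of mine (with made-up distributions) that also confirms the uniform case $H(X) = \log_2 n$:

```python
import math

def entropy(probs):
    """H(X) = -sum_i P(x_i) log2 P(x_i); zero-probability letters contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits
print(entropy([1 / 8] * 8))         # uniform over n = 8 letters: log2(8) = 3 bits
```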

When $q = 1/2$:

$$H(\tfrac{1}{2}) = -\tfrac{1}{2}\log_2 \tfrac{1}{2} - \tfrac{1}{2}\log_2 \tfrac{1}{2} = 1 \text{ [bit/letter]}$$

[Figure: plot of the binary entropy function $H(q)$ in bits/letter versus $q$, with its maximum of 1 at $q = 1/2$.]

Conditional Entropy

The average conditional self-information, or conditional entropy, is defined as

$$H(X \mid Y) = -\sum_{i=1}^{n} \sum_{j=1}^{m} P(x_i, y_j) \log P(x_i \mid y_j)$$

It is interpreted as the average amount of uncertainty in $X$ after $Y$ is observed. Therefore, the average mutual information can be written

$$I(X; Y) = H(X) - H(X \mid Y)$$

Since $I(X; Y) \geq 0$, we have $H(X) \geq H(X \mid Y)$.
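The binary entropy function plotted above is easy to reproduce pointwise; a minimal sketch of mine:

```python
import math

def H_b(q):
    """Binary entropy H(q) = -q log2 q - (1-q) log2(1-q), in bits/letter."""
    if q in (0.0, 1.0):
        return 0.0                  # by convention, p log p -> 0 as p -> 0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(q, round(H_b(q), 4))      # symmetric about q = 1/2; maximum H(1/2) = 1
```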

Conditional Entropy

Example: $X$ and $Y$ are binary-valued $\{0, 1\}$ random variables that represent the input and output of a binary-input, binary-output channel. The input probabilities are $P(X = 0) = q$ and $P(X = 1) = 1 - q$, and the output conditional probabilities are

$$P(Y = 0 \mid X = 0) = 1 - p_0, \qquad P(Y = 1 \mid X = 0) = p_0$$
$$P(Y = 1 \mid X = 1) = 1 - p_1, \qquad P(Y = 0 \mid X = 1) = p_1$$

Let us find the entropy $H(X)$:

$$H(X) = -\{P(x_1)\log P(x_1) + P(x_2)\log P(x_2)\} = -q\log q - (1 - q)\log(1 - q) = H(q)$$

Let us find the conditional entropy $H(X \mid Y)$:

$$H(X \mid Y) = -P(x_1, y_1)\log P(x_1 \mid y_1) - P(x_1, y_2)\log P(x_1 \mid y_2) - P(x_2, y_1)\log P(x_2 \mid y_1) - P(x_2, y_2)\log P(x_2 \mid y_2) \quad (*)$$

The joint and conditional probabilities needed in (*) follow from

$$P(x_i, y_j) = P(y_j \mid x_i) P(x_i), \qquad P(x_i \mid y_j) = \frac{P(y_j \mid x_i) P(x_i)}{P(y_j)}$$

with

$$P(y_1) = P(y_1 \mid x_1)P(x_1) + P(y_1 \mid x_2)P(x_2) = (1 - p_0)q + p_1(1 - q)$$
$$P(y_2) = P(y_2 \mid x_1)P(x_1) + P(y_2 \mid x_2)P(x_2) = p_0 q + (1 - p_1)(1 - q)$$
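Expression (*) can be evaluated mechanically from the joint probabilities. Below is a sketch of mine (the helper names and the test values $q = 1/2$, $p_0 = p_1 = 0.1$ are assumptions) that also confirms $I(X;Y) = H(X) - H(X \mid Y)$:

```python
import math

def channel_joint(q, p0, p1):
    """Joint PMF P(x, y) of the binary channel: P(X=0) = q, crossovers p0, p1."""
    return {(0, 0): q * (1 - p0), (0, 1): q * p0,
            (1, 0): (1 - q) * p1, (1, 1): (1 - q) * (1 - p1)}

def H_X_given_Y(joint):
    """H(X|Y) = -sum_{x,y} P(x,y) log2 P(x|y), with P(x|y) = P(x,y)/P(y)."""
    p_y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}
    return -sum(p * math.log2(p / p_y[y]) for (x, y), p in joint.items() if p > 0)

joint = channel_joint(q=0.5, p0=0.1, p1=0.1)
H_X = 1.0                               # H(q) at q = 1/2
print(H_X - H_X_given_Y(joint))         # I(X;Y) = 1 - H(0.1) ~ 0.531 bits
```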

The Entropy for Multiple Variables

If we have $k$ random variables $X_1, X_2, \ldots, X_k$ with joint probability $P(x_1 x_2 \cdots x_k) \equiv P(X_1 = x_1, X_2 = x_2, \ldots, X_k = x_k)$, the entropy of the block is defined as

$$H(X_1 X_2 \cdots X_k) = -\sum_{j_1=1}^{n_1} \sum_{j_2=1}^{n_2} \cdots \sum_{j_k=1}^{n_k} P(x_{j_1} x_{j_2} \cdots x_{j_k}) \log P(x_{j_1} x_{j_2} \cdots x_{j_k})$$

Since

$$P(x_1 x_2 \cdots x_k) = P(x_1)\, P(x_2 \mid x_1)\, P(x_3 \mid x_1 x_2) \cdots P(x_k \mid x_1 x_2 \cdots x_{k-1})$$

the block entropy obeys the chain rule

$$H(X_1 X_2 \cdots X_k) = H(X_1) + H(X_2 \mid X_1) + H(X_3 \mid X_1 X_2) + \cdots + H(X_k \mid X_1 X_2 \cdots X_{k-1}) = \sum_{i=1}^{k} H(X_i \mid X_1 X_2 \cdots X_{i-1})$$

Summary

Consider a pair $X$ and $Y$ of discrete random variables:

$H(X)$: average information in observing $X$
$H(Y)$: average information in observing $Y$
$H(X, Y)$: average information in observing $(X, Y)$
$H(X \mid Y)$: average information in observing $X$ when $Y$ is known
$H(Y \mid X)$: average information in observing $Y$ when $X$ is known
$I(X; Y)$: average mutual information between $X$ and $Y$

$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)$$
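Before continuing the summary, here is a numerical check (my own sketch, with an arbitrary joint PMF) of the chain rule above for $k = 2$:

```python
import math

# Arbitrary joint PMF P(x1, x2), used to check H(X1 X2) = H(X1) + H(X2|X1).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

H_joint = -sum(p * math.log2(p) for p in joint.values())
p1 = {a: sum(p for (x1, _), p in joint.items() if x1 == a) for a in (0, 1)}
H1 = -sum(p * math.log2(p) for p in p1.values())
H2_given_1 = -sum(p * math.log2(p / p1[x1]) for (x1, _), p in joint.items())

print(round(H_joint, 4), round(H1 + H2_given_1, 4))   # both ~ 1.8464 bits
```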

Summary

$$H(X, Y) = H(X) + H(Y \mid X) = H(Y) + H(X \mid Y)$$

[Figure: Venn diagram showing $H(X)$ and $H(Y)$ as overlapping sets inside $H(X, Y)$, with $H(X \mid Y)$ and $H(Y \mid X)$ as the non-overlapping parts and $I(X; Y)$ as the intersection.]

$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)$$
$$I(X; Y) = I(Y; X)$$
$$I(X; Y) = H(X) + H(Y) - H(X, Y)$$
$$I(X; X) = H(X)$$

Measure of Information for Continuous Random Variables

If $X$ and $Y$ are random variables with joint PDF $p(x, y)$ and marginal PDFs $p(x)$ and $p(y)$, the average mutual information between $X$ and $Y$ is defined as

$$I(X; Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} p(x, y) \log \frac{p(x, y)}{p(x)\,p(y)} \, dx \, dy$$

or, equivalently,

$$I(X; Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} p(x)\, p(y \mid x) \log \frac{p(y \mid x)\, p(x)}{p(x)\, p(y)} \, dx \, dy$$

The self-information, or differential entropy, of the random variable $X$ is

$$H(X) = -\int_{-\infty}^{\infty} p(x) \log p(x) \, dx$$

and the average conditional entropy of the random variable $X$ given $Y$ is

$$H(X \mid Y) = -\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} p(x, y) \log p(x \mid y) \, dx \, dy$$
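Differential entropy can be checked by direct numerical integration. A sketch of mine comparing a Riemann sum against the Gaussian closed form $\frac{1}{2}\log_2(2\pi e \sigma^2)$ (a standard result, not stated on the slide; the value of sigma is arbitrary):

```python
import math

sigma = 1.5

def p(x):
    """Gaussian PDF with zero mean and standard deviation sigma."""
    return math.exp(-x * x / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# Riemann-sum approximation of H(X) = -integral of p(x) log2 p(x) dx over +/- 10.
dx = 1e-3
H_num = -sum(p(i * dx) * math.log2(p(i * dx)) * dx for i in range(-10000, 10001))
H_closed = 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)
print(round(H_num, 3), round(H_closed, 3))   # both ~ 2.632 bits
```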

Measure of Information for Continuous Random Variables

As in the discrete case, the average mutual information between $X$ and $Y$ can also be written

$$I(X; Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)$$

Measure of Information for Discrete $X$ and Continuous $Y$

Suppose $X$ and $Y$ are statistically dependent, $X$ has possible outcomes $x_i$, $i = 1, \ldots, n$, and the marginal PDF of $Y$ is

$$p(y) = \sum_{i=1}^{n} p(y \mid x_i)\, P(x_i)$$

The mutual information about the event $X = x_i$ provided by the occurrence of the event $Y = y$ is

$$I(x_i; y) = \log \frac{p(y \mid x_i)\, P(x_i)}{P(x_i)\, p(y)} = \log \frac{p(y \mid x_i)}{p(y)}$$

and the average mutual information between $X$ and $Y$ is

$$I(X; Y) = \sum_{i=1}^{n} \int_{-\infty}^{\infty} p(y \mid x_i)\, P(x_i) \log \frac{p(y \mid x_i)}{p(y)} \, dy$$

Example: $X$ is a discrete random variable with two equally probable outcomes $x_1 = A$ and $x_2 = -A$. The conditional PDFs $p(y \mid x_i)$, $i = 1, 2$, are Gaussian:

$$p(y \mid A) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(y - A)^2 / 2\sigma^2}, \qquad p(y \mid -A) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(y + A)^2 / 2\sigma^2}$$

The average mutual information between $X$ and $Y$ is

$$I(X; Y) = \frac{1}{2} \int_{-\infty}^{\infty} \left[ p(y \mid A) \log \frac{p(y \mid A)}{p(y)} + p(y \mid -A) \log \frac{p(y \mid -A)}{p(y)} \right] dy$$

where

$$p(y) = \frac{1}{2}\left[ p(y \mid A) + p(y \mid -A) \right]$$

Coding for a Discrete Source

The goal is to represent source data efficiently in digital form for transmission or storage. A measure of the efficiency of a source-encoding method can be obtained by comparing the average number of binary digits per output letter from the source to the entropy $H(X)$. There are two types of source coding:

Lossless (Huffman coding algorithm, Lempel-Ziv algorithm, ...)
Lossy (rate-distortion, quantization, waveform coding, ...)

X --> source encoding --> bits --> channel transmission --> bits --> source decoding --> X
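Returning to the $\pm A$ example above: the mutual information integral has no simple closed form, but it is straightforward to evaluate numerically. A sketch of mine, with the arbitrary choice $A = \sigma = 1$:

```python
import math

A, sigma = 1.0, 1.0   # assumed values; I(X;Y) depends only on the ratio A/sigma

def gauss(y, mean):
    """Conditional PDF p(y | X = mean): Gaussian with variance sigma^2."""
    return math.exp(-(y - mean) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

dy, I = 1e-3, 0.0
for i in range(-10000, 10001):
    y = i * dy
    p_pos, p_neg = gauss(y, A), gauss(y, -A)
    p_y = 0.5 * (p_pos + p_neg)                       # mixture density p(y)
    I += 0.5 * p_pos * math.log2(p_pos / p_y) * dy    # x1 = +A term
    I += 0.5 * p_neg * math.log2(p_neg / p_y) * dy    # x2 = -A term

print(round(I, 3))   # ~ 0.486 bits at A/sigma = 1
```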

Coding for a Discrete Memoryless Source

A DMS produces an output letter every $T_s$ seconds. The source has a finite alphabet of $L$ symbols $x_i$, $i = 1, \ldots, L$, with probabilities $P(x_i)$. The entropy of the DMS in bits per source symbol is

$$H(X) = -\sum_{i=1}^{L} P(x_i) \log_2 P(x_i) \leq \log_2 L$$

If the symbols are equally probable,

$$H(X) = -\sum_{i=1}^{L} \frac{1}{L} \log_2 \frac{1}{L} = \log_2 L$$

Fixed-Length Code Words

Let us assign a unique set of $R$ binary digits to each symbol. Since there are $L$ possible symbols, the code rate in bits per symbol is

$$R = \log_2 L$$

when $L$ is a power of 2. When $L$ is not a power of 2,

$$R = \lfloor \log_2 L \rfloor + 1$$

where $\lfloor \log_2 L \rfloor$ denotes the largest integer less than $\log_2 L$. Since $H(X) \leq \log_2 L \leq R$, we have $R \geq H(X)$, and the ratio $H(X)/R$ measures the efficiency of the code. If the source letters are equally probable, $H(X) = \log_2 L$.
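A sketch of mine computing $R$ and the efficiency $H(X)/R$ for a few alphabet sizes, assuming equally likely letters:

```python
import math

def fixed_length_rate(L):
    """R = log2 L if L is a power of 2, else floor(log2 L) + 1 bits per symbol."""
    r = math.log2(L)
    return int(r) if r.is_integer() else math.floor(r) + 1

for L in (4, 5, 26, 32):
    R = fixed_length_rate(L)
    H = math.log2(L)                    # entropy of equally likely letters
    print(L, R, round(H / R, 3))        # efficiency H(X)/R, = 1 when L is a power of 2
```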

When $L$ is a power of 2 and the source letters are equally probable, $R = H(X)$: a fixed-length code of $R$ bits per symbol attains 100 percent efficiency.
