SOOCHOW JOURNAL OF MATHEMATICS
Volume 23, No. 1, pp. 53-62, January 1997

A GENERALIZED `USEFUL' INFORMATION MEASURE AND CODING THEOREMS

BY D. S. HOODA AND U. S. BHAKER

Abstract. In the present communication a generalized cost function of the utilities and lengths of the output code words of a memoryless source is defined, and its lower and upper bounds in terms of a generalized `useful' information measure of order $\alpha$ and type $\beta$ are obtained. Its asymptotic behaviour with reference to the problem of encoding source blocks of increasing lengths is also studied.

1. Introduction

Let a finite set of $N$ source symbols $X = (x_1, x_2, \ldots, x_N)$ be encoded using an alphabet of $D$ symbols. Then it has been shown (Feinstein [4]) that there is a uniquely decipherable/instantaneous code with lengths $n_1, n_2, \ldots, n_N$ if and only if the following Kraft inequality [8] is satisfied:

$$\sum_{i=1}^{N} D^{-n_i} \le 1. \qquad (1.1)$$

If $L = \sum_{i=1}^{N} p_i n_i$ is the average codeword length, then for a code which satisfies (1.1) it has been shown ([4]) that

$$L \ge H(P), \qquad (1.2)$$

with equality if and only if $n_i = -\log_D p_i$ for $i = 1, 2, \ldots, N$. This is Shannon's coding theorem for a noiseless channel.

Received March 21.
AMS Subject Classification. 94A15, 94A24.
Key words. Kraft inequality, instantaneous code, mean codeword length, Holder's inequality and code sequence.
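As a quick sanity check on the Kraft inequality (1.1) and the Shannon bound (1.2), the following sketch (an illustrative dyadic distribution, not taken from the paper) builds the Shannon lengths $n_i = -\log_D p_i$ and verifies both:

```python
import math

D = 2                                  # code alphabet size
p = [0.5, 0.25, 0.125, 0.125]          # dyadic source distribution (illustrative)

# Shannon lengths n_i = -log_D p_i; integers here because p is dyadic
n = [round(-math.log(pi, D)) for pi in p]

# Kraft inequality (1.1)
assert sum(D ** (-ni) for ni in n) <= 1

L = sum(pi * ni for pi, ni in zip(p, n))           # average codeword length
H = -sum(pi * math.log(pi, D) for pi in p)         # entropy, base D
assert abs(L - H) < 1e-9                           # equality case of (1.2)
```

For a non-dyadic distribution the rounded lengths would instead give $H(P) \le L < H(P) + 1$.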
Belis and Guiasu [2] observed that a source is not completely specified by the probability distribution $P$ over the source alphabet $X$ in the absence of qualitative character. So it can also be assumed that the source alphabet letters are weighted according to their importance or utilities. Let $U = (u_1, u_2, \ldots, u_N)$ be a set of positive real numbers, where $u_i$ is the utility of the outcome $x_i$. The utility $u_i$ is, in general, independent of $p_i$, the probability of occurrence of the source symbol $x_i$. The information source is thus given by

$$S = \begin{pmatrix} x_1 & x_2 & \cdots & x_N \\ p_1 & p_2 & \cdots & p_N \\ u_1 & u_2 & \cdots & u_N \end{pmatrix}, \qquad u_i > 0,\ 0 < p_i \le 1,\ \sum_{i=1}^{N} p_i = 1. \qquad (1.3)$$

Belis and Guiasu [2] introduced the "quantitative-qualitative" measure of information

$$H(P;U) = -\sum_{i=1}^{N} u_i p_i \log p_i, \qquad (1.4)$$

which can be taken as a measure of the average quantity of `valuable' or `useful' information provided by the information source (1.3).

Guiasu and Picard [5] considered the problem of encoding the letters output by the source (1.3) by means of a single-letter prefix code whose codewords $w_1, w_2, \ldots, w_N$ are of lengths $n_1, n_2, \ldots, n_N$ respectively and satisfy Kraft's inequality (1.1). They introduced the following `useful' mean length of the code:

$$L_u = \frac{\sum_{i=1}^{N} u_i p_i n_i}{\sum_{j=1}^{N} u_j p_j}. \qquad (1.5)$$

Further, they derived a lower bound for (1.5). However, Longo [9] interpreted (1.5) as the average transmission cost of the letters $x_i$ and derived bounds for this cost function. Taneja, Hooda and Tuteja [12] defined the `useful' average code length of order $t$ as given below:

$$L_u(t) = \frac{1}{t} \log_D \left[ \frac{\sum_{i=1}^{N} u_i p_i D^{t n_i}}{\sum_{j=1}^{N} u_j p_j} \right], \qquad 0 < t < \infty. \qquad (1.6)$$
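The limiting behaviour of the order-$t$ length (1.6) — it tends to the `useful' mean length (1.5) as $t \to 0$ — can be checked numerically. The distribution, utilities and lengths below are illustrative assumptions, not values from the paper:

```python
import math

D = 2
p = [0.5, 0.3, 0.2]          # illustrative probabilities
u = [3.0, 1.0, 2.0]          # illustrative utilities
n = [1, 2, 3]                # lengths satisfying Kraft: 1/2 + 1/4 + 1/8 <= 1

ubar = sum(ui * pi for ui, pi in zip(u, p))
H_pu = -sum(ui * pi * math.log(pi, D) for ui, pi in zip(u, p))   # (1.4), base D
L_u = sum(ui * pi * ni for ui, pi, ni in zip(u, p, n)) / ubar    # (1.5)

def L_u_t(t):                # `useful' length of order t, (1.6)
    return (1.0 / t) * math.log(
        sum(ui * pi * D ** (t * ni) for ui, pi, ni in zip(u, p, n)) / ubar, D)

# as t -> 0, (1.6) tends to the `useful' mean length (1.5)
assert abs(L_u_t(1e-6) - L_u) < 1e-4
```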
Evidently, when $t \to 0$, (1.6) reduces to (1.5). They derived bounds for the cost function (1.6) in terms of a generalized `useful' information measure of order $\alpha$, given by

$$H_\alpha(P;U) = \frac{1}{1-\alpha} \log_D \left[ \frac{\sum_{i=1}^{N} u_i p_i^\alpha}{\sum_{j=1}^{N} u_j p_j} \right], \qquad \alpha \ne 1,\ \alpha > 0, \qquad (1.7)$$

under the condition

$$\sum_{i=1}^{N} u_i D^{-n_i} \le \sum_{i=1}^{N} u_i p_i. \qquad (1.8)$$

Actually, inequality (1.8) is a generalization of Kraft's inequality (1.1): when $u_i = 1$ for each $i$, (1.8) reduces to (1.1). A code satisfying the generalized Kraft inequality (1.8) is termed a `useful' code.

It may be seen that (1.6) satisfies the following additive property:

$$L(P * Q;\, U * V;\, N + M;\, t) = L(P;\, U;\, N;\, t) + L(Q;\, V;\, M;\, t). \qquad (1.9)$$

They also obtained the following result regarding the upper bound of the `useful' average code length of order $t$:

$$H_\alpha(P;U) \le L_u(t) < H_\alpha(P;U) + 1, \qquad \text{where } \alpha = \frac{1}{t+1},\ 0 < t < \infty. \qquad (1.10)$$

2. A Measure of the Generalized Cost

It is usually assumed that the cost is a linear function of the code length. However, there are some instances when the cost does not vary linearly with the code lengths but is more nearly an exponential function of the $n_i$'s. Such types of functions occur frequently in market equilibrium and growth models in economics. Since linear dependence is the limiting case of the exponential function, it is therefore interesting to minimize the more generalized quantity

$$C = \sum_{i=1}^{N} (u_i p_i)^\beta D^{t n_i}, \qquad (2.1)$$

where $t$ and $\beta$ are some parameters related to the cost. In order to make the results of the present paper more comparable with the usual noiseless coding theorem, instead of minimizing (2.1) we minimize
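A small sketch of the $\beta = 1$ case: integer lengths chosen in the style of the coding theorems below satisfy the generalized Kraft inequality (1.8), and the order-$t$ length then obeys the bounds (1.10). All numeric values are assumptions for illustration:

```python
import math

D = 2
p = [0.6, 0.3, 0.1]          # illustrative probabilities
u = [3.0, 1.0, 2.0]          # illustrative utilities
t = 1.0
alpha = 1.0 / (1.0 + t)

ubar = sum(ui * pi for ui, pi in zip(u, p))
S = sum(ui * pi ** alpha for ui, pi in zip(u, p))
H_a = (1.0 / (1.0 - alpha)) * math.log(S / ubar, D)        # (1.7)

# integer lengths in the style of the coding theorem (beta = 1 case)
c = math.log(S / ubar, D)
n = [math.ceil(-alpha * math.log(pi, D) + c) for pi in p]

# generalized Kraft inequality (1.8)
assert sum(ui * D ** (-ni) for ui, ni in zip(u, n)) <= ubar + 1e-12

L_t = (1.0 / t) * math.log(
    sum(ui * pi * D ** (t * ni) for ui, pi, ni in zip(u, p, n)) / ubar, D)
assert H_a - 1e-9 <= L_t < H_a + 1                          # (1.10)
```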
$$L_u^\beta(t) = \frac{1}{t} \log_D \left[ \frac{\sum_{i=1}^{N} (u_i p_i)^\beta D^{t n_i}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right], \qquad 0 < t < \infty, \qquad (2.2)$$

which is a monotonic function of $C$. We define (2.2) as the `useful' average code length of order $t$ and type $\beta$. Clearly, if $\beta = 1$, (2.2) reduces to (1.6), which further reduces to (1.5) when $t \to 0$. We may also note that (2.2) is a monotonic non-decreasing function of $t$, and that if all the $n_i$'s are the same, say $n_i = n$ for each $i$, then $L_u^\beta(t) = n$. This is an important property for any measure of length to possess. It is additive in a manner analogous to (1.9).

Now we derive the lower and upper bounds of the cost function (2.2) in terms of the following `useful' information measure of order $\alpha$ and type $\beta$, which was defined and studied by Hooda and Singh [6]:

$$H_\alpha^\beta(P;U) = \frac{1}{1-\alpha} \log_D \left[ \frac{\sum_{i=1}^{N} u_i^\beta p_i^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right], \qquad \alpha \ne 1,\ \alpha > 0, \qquad (2.3)$$

under the condition

$$\sum_{i=1}^{N} u_i^\beta p_i^{\beta-1} D^{-n_i} \le \sum_{i=1}^{N} (u_i p_i)^\beta. \qquad (2.4)$$

It may be seen that in case $\beta = 1$, (2.4) reduces to (1.8), and further, when $u_i = 1$ for each $i$, it reduces to Kraft's inequality (1.1). It may also be noted that (2.3) reduces to (1.7) if $\beta = 1$ and further reduces to Renyi's entropy [10] of order $\alpha$ when utilities are ignored. If we consider $u_i = 1$ for each $i$, then (2.3) reduces to

$$H_\alpha^\beta(P) = \frac{1}{1-\alpha} \log_D \left[ \frac{\sum_{i=1}^{N} p_i^{\alpha+\beta-1}}{\sum_{j=1}^{N} p_j^\beta} \right], \qquad \alpha \ne 1,\ \alpha > 0,$$

which is the entropy of order $\alpha$ and type $\beta$ characterized by Aczel and Daroczy [1] and Kapur [7].

3. Coding Theorem for Useful Codes

We first prove the following lemma.

Lemma 1. Let $\{u_i\}_{i=1}^{N}$, $\{p_i\}_{i=1}^{N}$ and $\{n_i\}_{i=1}^{N}$ satisfy the inequality (2.4); then

$$L_u^\beta(t) \ge H_\alpha^\beta(P;U), \qquad \text{where } \alpha = \frac{1}{t+1}. \qquad (3.1)$$
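Two stated properties of (2.2) can be checked directly: the $\beta = 1$ reduction to the order-$t$ length (1.6), and $L_u^\beta(t) = n$ when all lengths are equal. A sketch with illustrative values (not from the paper):

```python
import math

D = 2
p = [0.5, 0.3, 0.2]          # illustrative probabilities
u = [3.0, 1.0, 2.0]          # illustrative utilities
t = 0.5

def L_len(n, beta):          # (2.2): `useful' length of order t and type beta
    num = sum((ui * pi) ** beta * D ** (t * ni) for ui, pi, ni in zip(u, p, n))
    den = sum((ui * pi) ** beta for ui, pi in zip(u, p))
    return (1.0 / t) * math.log(num / den, D)

# beta = 1 recovers the order-t length (1.6)
n = [1, 2, 3]
L16 = (1.0 / t) * math.log(
    sum(ui * pi * D ** (t * ni) for ui, pi, ni in zip(u, p, n))
    / sum(ui * pi for ui, pi in zip(u, p)), D)
assert abs(L_len(n, 1.0) - L16) < 1e-9

# equal lengths give L = n for any beta
assert abs(L_len([2, 2, 2], 1.5) - 2.0) < 1e-9
```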
Proof. By Holder's inequality, we have

$$\sum_{i=1}^{N} x_i y_i \ge \Big( \sum_{i=1}^{N} x_i^p \Big)^{1/p} \Big( \sum_{i=1}^{N} y_i^q \Big)^{1/q}, \qquad (3.2)$$

where $\frac{1}{p} + \frac{1}{q} = 1$, $p < 1$ ($p \ne 0$), and $x_i, y_i > 0$. Let

$$p = -t, \qquad x_i = \left[ \frac{(u_i p_i)^\beta}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right]^{-1/t} D^{-n_i}, \qquad 0 < t < \infty,$$

and

$$q = 1 - \alpha, \qquad y_i = \left[ \frac{u_i^\beta p_i^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right]^{1/(1-\alpha)}.$$

Putting these values in (3.2), we have

$$\frac{\sum_{i=1}^{N} u_i^\beta p_i^{\beta-1} D^{-n_i}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \ge \left[ \frac{\sum_{i=1}^{N} (u_i p_i)^\beta D^{t n_i}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right]^{-1/t} \left[ \frac{\sum_{i=1}^{N} u_i^\beta p_i^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right]^{1/(1-\alpha)}. \qquad (3.3)$$

Using (2.4) and taking logarithms on both sides of (3.3), we get (3.1).

It may be shown that there is equality in (3.1) if

$$D^{-n_i} = \frac{p_i^\alpha \sum_{j=1}^{N} (u_j p_j)^\beta}{\sum_{j=1}^{N} u_j^\beta p_j^{\alpha+\beta-1}}, \qquad \text{or} \qquad n_i = -\log_D p_i^\alpha + \log_D \left[ \frac{\sum_{j=1}^{N} u_j^\beta p_j^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right].$$

By putting $\beta = 1$ in (3.1), we have

$$\frac{1}{t} \log_D \left[ \frac{\sum_{i=1}^{N} u_i p_i D^{t n_i}}{\sum_{j=1}^{N} u_j p_j} \right] \ge \frac{1}{1-\alpha} \log_D \left[ \frac{\sum_{i=1}^{N} u_i p_i^\alpha}{\sum_{j=1}^{N} u_j p_j} \right], \qquad (3.4)$$

which is a result obtained by Taneja, Hooda and Tuteja [12]. Further, if we ignore the utilities, i.e. $u_i = 1$ for each $i$, then (3.4) reduces to

$$\frac{1}{t} \log_D \Big( \sum_{i=1}^{N} p_i D^{n_i t} \Big) \ge \frac{1}{1-\alpha} \log_D \Big( \sum_{i=1}^{N} p_i^\alpha \Big),$$

a result obtained by Campbell [3].

Next, we prove a theorem giving the upper bound to the `useful' average code length of order $t$ and type $\beta$.
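The equality condition of the lemma can be verified numerically: the (generally non-integer) lengths $n_i = -\alpha \log_D p_i + \log_D[\,\cdot\,]$ satisfy (2.4) with equality and attain the lower bound (3.1). The distribution, utilities and parameters below are illustrative assumptions:

```python
import math

D = 2
p = [0.5, 0.3, 0.2]          # illustrative probabilities
u = [3.0, 1.0, 2.0]          # illustrative utilities
t = 0.5
beta = 1.5
alpha = 1.0 / (1.0 + t)

B = sum((ui * pi) ** beta for ui, pi in zip(u, p))       # sum (u_i p_i)^beta
A = sum(ui ** beta * pi ** (alpha + beta - 1) for ui, pi in zip(u, p))
H = (1.0 / (1.0 - alpha)) * math.log(A / B, D)           # (2.3)

def L_len(n):                                            # (2.2)
    return (1.0 / t) * math.log(
        sum((ui * pi) ** beta * D ** (t * ni)
            for ui, pi, ni in zip(u, p, n)) / B, D)

# optimal (generally non-integer) lengths from the equality condition
n_opt = [-alpha * math.log(pi, D) + math.log(A / B, D) for pi in p]

# they satisfy (2.4) with equality ...
lhs = sum(ui ** beta * pi ** (beta - 1) * D ** (-ni)
          for ui, pi, ni in zip(u, p, n_opt))
assert abs(lhs - B) < 1e-9

# ... and attain the lower bound (3.1)
assert abs(L_len(n_opt) - H) < 1e-9
```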
Theorem 1. By properly choosing the lengths $n_1, n_2, \ldots, n_N$ in the code of Lemma 1, $L_u^\beta(t)$ can be made to satisfy the following inequality:

$$H_\alpha^\beta(P;U) \le L_u^\beta(t) < H_\alpha^\beta(P;U) + 1, \qquad \text{where } \alpha = \frac{1}{t+1}. \qquad (3.5)$$

Proof. Let us choose the codeword lengths $n_i$ as the integers satisfying

$$-\log_D p_i^\alpha + \log_D \left[ \frac{\sum_{j=1}^{N} u_j^\beta p_j^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right] \le n_i < -\log_D p_i^\alpha + \log_D \left[ \frac{\sum_{j=1}^{N} u_j^\beta p_j^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right] + 1, \qquad i = 1, 2, \ldots, N. \qquad (3.6)$$

From the left inequality of (3.6), we have

$$D^{-n_i} \le \frac{p_i^\alpha \sum_{j=1}^{N} (u_j p_j)^\beta}{\sum_{j=1}^{N} u_j^\beta p_j^{\alpha+\beta-1}},$$

and multiplying by $u_i^\beta p_i^{\beta-1}$ and summing over $i$ shows that the $n_i$ satisfy (2.4), which gives (1.1) when $\beta = 1$ and $u_i = 1$. It proves that there exists a `useful' code with lengths $n_i$ for each $i$. From (3.6), we have

$$p_i^{-\alpha t} \left[ \frac{\sum_{j=1}^{N} u_j^\beta p_j^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right]^t \le D^{n_i t} < D^t\, p_i^{-\alpha t} \left[ \frac{\sum_{j=1}^{N} u_j^\beta p_j^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right]^t. \qquad (3.7)$$

Multiplying (3.7) by $(u_i p_i)^\beta / \sum_{j=1}^{N} (u_j p_j)^\beta$, summing over $i$, raising to the power $1/t$, taking logarithms and using the relation $\alpha = \frac{1}{1+t}$, we get (3.5). Hence Theorem 1 is proved.

Remark. When $t \to 0$ (or $\alpha \to 1$) and $\beta = 1$, the inequality (3.5) reduces to

$$\frac{H(P;U)}{\bar u \log D} \le L_u < \frac{H(P;U)}{\bar u \log D} + 1, \qquad (3.8)$$

where $L_u$ is the average cost function (1.5) defined by Guiasu and Picard [5] and $\bar u = \sum_{j=1}^{N} u_j p_j$. Longo [9] obtained the lower and upper bounds on $L_u$ as given below:

$$\frac{H(P;U) - \bar u \log \bar u + \overline{u \log u}}{\bar u \log D} \le L_u < \frac{H(P;U) - \bar u \log \bar u + \overline{u \log u}}{\bar u \log D} + 1, \qquad (3.9)$$

where the bar means the mean value with respect to the probability distribution $P = \{(p_1, p_2, \ldots, p_N):\ p_i \ge 0,\ \sum_{i=1}^{N} p_i = 1\}$.
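A numerical sketch of Theorem 1: integer lengths chosen by rounding the left-hand side of (3.6) up satisfy the generalized Kraft condition (2.4), and the resulting length (2.2) lands within one unit of the lower bound, as (3.5) asserts. All numeric values are illustrative assumptions:

```python
import math

D = 2
p = [0.6, 0.3, 0.1]          # illustrative probabilities
u = [3.0, 1.0, 2.0]          # illustrative utilities
t = 0.5
beta = 1.5
alpha = 1.0 / (1.0 + t)

B = sum((ui * pi) ** beta for ui, pi in zip(u, p))
A = sum(ui ** beta * pi ** (alpha + beta - 1) for ui, pi in zip(u, p))
H = (1.0 / (1.0 - alpha)) * math.log(A / B, D)           # (2.3)

# integer lengths: round the left-hand side of (3.6) up
c = math.log(A / B, D)
n = [math.ceil(-alpha * math.log(pi, D) + c) for pi in p]

# the lengths satisfy the generalized Kraft condition (2.4) ...
assert sum(ui ** beta * pi ** (beta - 1) * D ** (-ni)
           for ui, pi, ni in zip(u, p, n)) <= B + 1e-12

# ... and the length (2.2) obeys both bounds of (3.5)
L = (1.0 / t) * math.log(
    sum((ui * pi) ** beta * D ** (t * ni)
        for ui, pi, ni in zip(u, p, n)) / B, D)
assert H - 1e-9 <= L < H + 1
```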
Since $x \log x$ is a convex $\cup$ function, $\overline{u \log u} \ge \bar u \log \bar u$ holds; therefore $H(P;U)$ does not seem to be less basic in (3.9) than it is in (3.8).

4. Encoding for Sequences

Now we consider a typical sequence of length $M$, each symbol $x_i$ being generated by the probability distribution $P = (p_1, p_2, \ldots, p_N)$ and the utility distribution $U = (u_1, u_2, \ldots, u_N)$, $u_i > 0$. Let $s = [x_{i_1}, x_{i_2}, \ldots, x_{i_M}]$ denote a block sequence of length $M$. Next, we define the probability $p(s)$ and the utility $u(s)$ of the block sequence $s$. Since the $x_{i_j}$, $j = 1, 2, \ldots, M$, are outputs of the memoryless source (1.3), they are independent. Thus we define $p(s)$, the probability of $s$, as

$$p(s) = \prod_{j=1}^{M} p_{i_j}. \qquad (4.1)$$

In general, the utility of a source sequence is defined either as the sum of the utilities of its letters (refer to Longo [9]) or as the sum of the utilities of its letters divided by its block length (Sgarro [11]). However, in each case the utility will be a monotonic function of the number of joint experiments. We define $u(s)$, the utility of the source sequence $s$, as

$$u(s) = \prod_{j=1}^{M} u_{i_j}, \qquad (4.2)$$

where $u_{i_j}$ is the utility of the letter $x_{i_j}$.

Let $n(s)$ be the length of the code sequence for $s$ in a `useful' code, and let the code length of order $t$ and type $\beta$ for the $M$-sequences be

$$L_u^\beta(t)(s) = \frac{1}{t} \log_D \left[ \frac{\sum_s [u(s) p(s)]^\beta D^{t n(s)}}{\sum_s [u(s) p(s)]^\beta} \right], \qquad 0 < t < \infty, \qquad (4.3)$$

where $\sum_s$ extends over all $N^M$ $M$-sequences $s$. The generalized measure of `useful' information of order $\alpha$ and type $\beta$ for this product sequence is

$$H_\alpha^\beta[P(s); U(s)] = \frac{1}{1-\alpha} \log_D \left[ \frac{\sum_s u^\beta(s)\, p^{\alpha+\beta-1}(s)}{\sum_s [u(s) p(s)]^\beta} \right], \qquad \alpha \ne 1. \qquad (4.4)$$
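One consequence of the convexity observation above can be checked directly: Jensen's inequality for $x \log x$ gives $\overline{u \log u} \ge \bar u \log \bar u$, and hence the lower bound in (3.9) is at least as large as the one in (3.8). The values below are illustrative assumptions:

```python
import math

D = 2
p = [0.5, 0.3, 0.2]          # illustrative probabilities
u = [3.0, 1.0, 2.0]          # illustrative utilities

ubar = sum(pi * ui for pi, ui in zip(p, u))                        # mean utility
ulogu_bar = sum(pi * ui * math.log(ui) for pi, ui in zip(p, u))    # mean of u log u
H_pu = -sum(ui * pi * math.log(pi) for ui, pi in zip(u, p))        # (1.4), nats

# Jensen's inequality for the convex function x log x
assert ulogu_bar >= ubar * math.log(ubar) - 1e-12

# hence Longo's lower bound in (3.9) dominates the one in (3.8)
lower_38 = H_pu / (ubar * math.log(D))
lower_39 = (H_pu - ubar * math.log(ubar) + ulogu_bar) / (ubar * math.log(D))
assert lower_39 >= lower_38 - 1e-12
```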
Using (4.1) and (4.2), we have

$$\frac{\sum_s u^\beta(s)\, p^{\alpha+\beta-1}(s)}{\sum_s [u(s) p(s)]^\beta} = \left[ \frac{\sum_{i=1}^{N} u_i^\beta p_i^{\alpha+\beta-1}}{\sum_{j=1}^{N} (u_j p_j)^\beta} \right] \left[ \frac{\sum_{i=1}^{N} p_i^{\alpha+\beta-1}}{\sum_{j=1}^{N} p_j^\beta} \right]^{M-1},$$

which implies that

$$H_\alpha^\beta[P(s); U(s)] = H_\alpha^\beta(P;U) + (M-1) H_\alpha^\beta(P), \qquad (4.5)$$

where

$$H_\alpha^\beta(P) = \frac{1}{1-\alpha} \log_D \left[ \frac{\sum_{i=1}^{N} p_i^{\alpha+\beta-1}}{\sum_{j=1}^{N} p_j^\beta} \right], \qquad \alpha \ne 1,\ \alpha > 0,$$

is the generalized measure of entropy studied by Kapur [7].

Let $n(s)$ be the integer satisfying the inequality

$$-\log_D p^\alpha(s) + \log_D \left[ \frac{\sum_s u^\beta(s)\, p^{\alpha+\beta-1}(s)}{\sum_s [u(s) p(s)]^\beta} \right] \le n(s) < -\log_D p^\alpha(s) + \log_D \left[ \frac{\sum_s u^\beta(s)\, p^{\alpha+\beta-1}(s)}{\sum_s [u(s) p(s)]^\beta} \right] + 1. \qquad (4.6)$$

From the left inequality of (4.6) we have

$$D^{-n(s)} \le \frac{p^\alpha(s) \sum_s [u(s) p(s)]^\beta}{\sum_s u^\beta(s)\, p^{\alpha+\beta-1}(s)}.$$

It implies that there exists a `useful' code with lengths $n(s)$ satisfying (4.6). We can rewrite (4.6) as follows:

$$p^{-\alpha t}(s) \left[ \frac{\sum_s u^\beta(s)\, p^{\alpha+\beta-1}(s)}{\sum_s [u(s) p(s)]^\beta} \right]^t \le D^{t n(s)} < D^t\, p^{-\alpha t}(s) \left[ \frac{\sum_s u^\beta(s)\, p^{\alpha+\beta-1}(s)}{\sum_s [u(s) p(s)]^\beta} \right]^t. \qquad (4.7)$$

Multiplying (4.7) by $[u(s) p(s)]^\beta / \sum_s [u(s) p(s)]^\beta$, summing over all $s$, raising to the power $1/t$, taking logarithms and using the relation $\alpha = \frac{1}{t+1}$, we get

$$H_\alpha^\beta[P(s); U(s)] \le L_u^\beta(t)(s) < H_\alpha^\beta[P(s); U(s)] + 1. \qquad (4.8)$$

Now (4.8) together with (4.5) gives

$$H_\alpha^\beta(P) + \frac{1}{M}\big[H_\alpha^\beta(P;U) - H_\alpha^\beta(P)\big] \le \frac{L_u^\beta(t)(s)}{M} < H_\alpha^\beta(P) + \frac{1}{M}\big[H_\alpha^\beta(P;U) - H_\alpha^\beta(P) + 1\big],$$
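Since (4.8) is the single-letter coding theorem applied to the block alphabet, it can be checked by brute force over all $N^M$ blocks, with block probabilities and utilities formed as products per (4.1) and (4.2). A sketch with an illustrative binary source (all numbers are assumptions):

```python
import math
from itertools import product

D = 2
p = [0.6, 0.4]               # illustrative binary source
u = [3.0, 1.0]               # illustrative utilities
M = 3                        # block length
t, beta = 0.5, 1.5
alpha = 1.0 / (1.0 + t)

# product probabilities and utilities over all N^M blocks, as in (4.1)-(4.2)
blocks = list(product(range(len(p)), repeat=M))
ps = [math.prod(p[i] for i in s) for s in blocks]
us = [math.prod(u[i] for i in s) for s in blocks]
assert abs(sum(ps) - 1.0) < 1e-12

B = sum((usi * psi) ** beta for usi, psi in zip(us, ps))
A = sum(usi ** beta * psi ** (alpha + beta - 1) for usi, psi in zip(us, ps))
H_block = (1.0 / (1.0 - alpha)) * math.log(A / B, D)               # (4.4)

# integer block lengths n(s) as in (4.6)
c = math.log(A / B, D)
n = [math.ceil(-alpha * math.log(psi, D) + c) for psi in ps]

L_block = (1.0 / t) * math.log(
    sum((usi * psi) ** beta * D ** (t * ni)
        for usi, psi, ni in zip(us, ps, n)) / B, D)
assert H_block - 1e-9 <= L_block < H_block + 1                     # (4.8)
```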
which can be written as

$$H_\alpha^\beta(P) + \frac{\Delta_1}{M} \le \frac{L_u^\beta(t)(s)}{M} < H_\alpha^\beta(P) + \frac{\Delta_2}{M}, \qquad (4.9)$$

where $\Delta_1 = H_\alpha^\beta(P;U) - H_\alpha^\beta(P)$ and $\Delta_2 = H_\alpha^\beta(P;U) - H_\alpha^\beta(P) + 1$.

The quantity $L_u^\beta(t)(s)/M$ may be called the `useful' mean length of order $t$ and type $\beta$ per source letter. If for the given system $\Delta_1$ is finite, then $\Delta_2$ is also finite, and the average code length per source letter tends to $H_\alpha^\beta(P)$ as $M \to \infty$. In case $\beta = 1$, (4.9) reduces to a result obtained by Campbell [3]. In this way we have proved the following theorem:

Theorem 2. Given a discrete memoryless source with the additional parameters $u_i$, there exists a sequence of `useful' codes for the $M$-length source sequences whose weighted mean length of order $t$ and type $\beta$ per source letter tends to $H_\alpha^\beta(P)$, where $\alpha = \frac{1}{t+1}$.

Note 1. It may be seen that if $M$ is very large, the flexibility provided by the utilities disappears, since their influence dies off. On the other hand, if $M$ is very small, one can suspect that there are rather more losses due to the inefficiency of single-letter coding. Thus there must be an intermediate optimum length for the blocks at which the function $L_u^\beta(t)(s)/M$ has a minimum value.

Note 2. Since each sequence has $M$ symbols, it follows that a better measure is $\bar L = L_u^\beta(t)(s)/M$. Thus, for a sufficiently large $M$-sequence code, we can bring the average codeword length $\bar L$ as close as we please to the generalized weighted entropy of order $\alpha$ and type $\beta$.

References

[1] J. Aczel and Z. Daroczy, Über verallgemeinerte Mittelwerte, die mit Gewichtsfunktionen gebildet sind, Publ. Math. Debrecen, 10 (1963).
[2] M. Belis and S. Guiasu, A quantitative-qualitative measure of information in cybernetic systems, IEEE Trans. Information Theory, 14 (1968).
[3] L. L. Campbell, A coding theorem and Renyi's entropy, Information and Control, 8 (1965).
[4] A. Feinstein, Foundations of Information Theory, McGraw-Hill, New York.
[5] S. Guiasu and C. F. Picard, Borne inférieure de la longueur de certains codes, C. R. Acad. Sci. Paris, 273 (1971).
[6] D. S. Hooda and U. Singh, On `useful' information generating functions, Statistica, XLVI:4 (1986).
[7] J. N. Kapur, Generalized entropy of order $\alpha$ and type $\beta$, Maths. Seminar, Delhi, 4 (1967).
[8] L. G. Kraft, A device for quantizing, grouping and coding amplitude modulated pulses, M.S. Thesis, Electrical Engineering Department, MIT.
[9] G. Longo, A noiseless coding theorem for sources having utilities, SIAM J. Appl. Math., 30 (1976).
[10] A. Renyi, On measures of entropy and information, Proceedings 4th Berkeley Symp. on Math. Stat. and Probability, University of California Press, 1961.
[11] A. Sgarro, Noiseless block-coding of `useful' information, Elektronische Informationsverarbeitung und Kybernetik (EIK), 15 (1979).
[12] H. C. Taneja, D. S. Hooda and R. K. Tuteja, Coding theorem on a generalized `useful' information, Soochow Journal of Mathematics, 11 (1985).

Department of Mathematics and Statistics, CCS Haryana Agricultural University, Hisar, India.
Department of Mathematics, Govt. (P.G.) College, Bhiwani, India.
Information Theory in Intelligent Decision Making Adaptive Systems and Algorithms Research Groups School of Computer Science University of Hertfordshire, United Kingdom June 7, 2015 Information Theory
More informationShannon Entropy: Axiomatic Characterization and Application
Shannon Entropy: Axiomatic Characterization and Application C. G. Chakrabarti,Indranil Chakrabarty arxiv:quant-ph/0511171v1 17 Nov 2005 We have presented a new axiomatic derivation of Shannon Entropy for
More informationCOMM901 Source Coding and Compression. Quiz 1
German University in Cairo - GUC Faculty of Information Engineering & Technology - IET Department of Communication Engineering Winter Semester 2013/2014 Students Name: Students ID: COMM901 Source Coding
More informationOn Multiple User Channels with State Information at the Transmitters
On Multiple User Channels with State Information at the Transmitters Styrmir Sigurjónsson and Young-Han Kim* Information Systems Laboratory Stanford University Stanford, CA 94305, USA Email: {styrmir,yhk}@stanford.edu
More informationSome Notes On Rissanen's Stochastic Complexity Guoqi Qian zx and Hans R. Kunsch Seminar fur Statistik ETH Zentrum CH-8092 Zurich, Switzerland November
Some Notes On Rissanen's Stochastic Complexity by Guoqi Qian 1 2 and Hans R. Kunsch Research Report No. 79 November 1996 Seminar fur Statistik Eidgenossische Technische Hochschule (ETH) CH-8092 Zurich
More informationREFERENCES. Aczel, J. and Daroczy, Z. (1963). Characterisierung der entropien positiver ordnung
REFERENCES Aczel, J. and Daroczy, Z. (1963). Characterisierung der entropien positiver ordnung under Shannonschen entropie. Acta Mathematica Hungarica 14: 95-121. Aczel, J. and Daroczy, Z. (1975). On Measures
More informationMultimedia Communications. Mathematical Preliminaries for Lossless Compression
Multimedia Communications Mathematical Preliminaries for Lossless Compression What we will see in this chapter Definition of information and entropy Modeling a data source Definition of coding and when
More informationA Comparison of Two Achievable Rate Regions for the Interference Channel
A Comparison of Two Achievable Rate Regions for the Interference Channel Hon-Fah Chong, Mehul Motani, and Hari Krishna Garg Electrical & Computer Engineering National University of Singapore Email: {g030596,motani,eleghk}@nus.edu.sg
More informationIntroduction to Information Theory. By Prof. S.J. Soni Asst. Professor, CE Department, SPCE, Visnagar
Introduction to Information Theory By Prof. S.J. Soni Asst. Professor, CE Department, SPCE, Visnagar Introduction [B.P. Lathi] Almost in all the means of communication, none produces error-free communication.
More informationEntropy and Ergodic Theory Lecture 4: Conditional entropy and mutual information
Entropy and Ergodic Theory Lecture 4: Conditional entropy and mutual information 1 Conditional entropy Let (Ω, F, P) be a probability space, let X be a RV taking values in some finite set A. In this lecture
More informationGeneralized Kraft Inequality and Arithmetic Coding
J. J. Rissanen Generalized Kraft Inequality and Arithmetic Coding Abstract: Algorithms for encoding and decoding finite strings over a finite alphabet are described. The coding operations are arithmetic
More informationModule 1. Introduction to Digital Communications and Information Theory. Version 2 ECE IIT, Kharagpur
Module ntroduction to Digital Communications and nformation Theory Lesson 3 nformation Theoretic Approach to Digital Communications After reading this lesson, you will learn about Scope of nformation Theory
More informationTraining-Based Schemes are Suboptimal for High Rate Asynchronous Communication
Training-Based Schemes are Suboptimal for High Rate Asynchronous Communication The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation
More informationConstellation Shaping for Communication Channels with Quantized Outputs
Constellation Shaping for Communication Channels with Quantized Outputs Chandana Nannapaneni, Matthew C. Valenti, and Xingyu Xiang Lane Department of Computer Science and Electrical Engineering West Virginia
More informationRemote Source Coding with Two-Sided Information
Remote Source Coding with Two-Sided Information Basak Guler Ebrahim MolavianJazi Aylin Yener Wireless Communications and Networking Laboratory Department of Electrical Engineering The Pennsylvania State
More information4. Quantization and Data Compression. ECE 302 Spring 2012 Purdue University, School of ECE Prof. Ilya Pollak
4. Quantization and Data Compression ECE 32 Spring 22 Purdue University, School of ECE Prof. What is data compression? Reducing the file size without compromising the quality of the data stored in the
More informationApproaching Blokh-Zyablov Error Exponent with Linear-Time Encodable/Decodable Codes
Approaching Blokh-Zyablov Error Exponent with Linear-Time Encodable/Decodable Codes 1 Zheng Wang, Student Member, IEEE, Jie Luo, Member, IEEE arxiv:0808.3756v1 [cs.it] 27 Aug 2008 Abstract We show that
More informationCut-Set Bound and Dependence Balance Bound
Cut-Set Bound and Dependence Balance Bound Lei Xiao lxiao@nd.edu 1 Date: 4 October, 2006 Reading: Elements of information theory by Cover and Thomas [1, Section 14.10], and the paper by Hekstra and Willems
More informationTwo Applications of the Gaussian Poincaré Inequality in the Shannon Theory
Two Applications of the Gaussian Poincaré Inequality in the Shannon Theory Vincent Y. F. Tan (Joint work with Silas L. Fong) National University of Singapore (NUS) 2016 International Zurich Seminar on
More informationarxiv: v4 [cs.it] 17 Oct 2015
Upper Bounds on the Relative Entropy and Rényi Divergence as a Function of Total Variation Distance for Finite Alphabets Igal Sason Department of Electrical Engineering Technion Israel Institute of Technology
More informationIEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 53, NO. 12, DECEMBER
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 53, NO 12, DECEMBER 2007 4457 Joint Source Channel Coding Error Exponent for Discrete Communication Systems With Markovian Memory Yangfan Zhong, Student Member,
More informationSource Coding: Part I of Fundamentals of Source and Video Coding
Foundations and Trends R in sample Vol. 1, No 1 (2011) 1 217 c 2011 Thomas Wiegand and Heiko Schwarz DOI: xxxxxx Source Coding: Part I of Fundamentals of Source and Video Coding Thomas Wiegand 1 and Heiko
More informationOn Unique Decodability, McMillan s Theorem and the Expected Length of Codes
1 On Unique Decodability, McMillan s Theorem and the Expected Length of Codes Technical Report R.T. 200801-58, Department of Electronics for Automation, University of Brescia, Via Branze 38-25123, Brescia,
More informationA One-to-One Code and Its Anti-Redundancy
A One-to-One Code and Its Anti-Redundancy W. Szpankowski Department of Computer Science, Purdue University July 4, 2005 This research is supported by NSF, NSA and NIH. Outline of the Talk. Prefix Codes
More informationP. Fenwick [5] has recently implemented a compression algorithm for English text of this type, with good results. EXERCISE. For those of you that know
Six Lectures on Information Theory John Kieer 2 Prediction & Information Theory Prediction is important in communications, control, forecasting, investment, and other areas. When the data model is known,
More informationVariable-to-Variable Codes with Small Redundancy Rates
Variable-to-Variable Codes with Small Redundancy Rates M. Drmota W. Szpankowski September 25, 2004 This research is supported by NSF, NSA and NIH. Institut f. Diskrete Mathematik und Geometrie, TU Wien,
More informationLecture 1: Shannon s Theorem
Lecture 1: Shannon s Theorem Lecturer: Travis Gagie January 13th, 2015 Welcome to Data Compression! I m Travis and I ll be your instructor this week. If you haven t registered yet, don t worry, we ll work
More informationDigital Communications III (ECE 154C) Introduction to Coding and Information Theory
Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 8 I
More informationDigital Communications III (ECE 154C) Introduction to Coding and Information Theory
Digital Communications III (ECE 154C) Introduction to Coding and Information Theory Tara Javidi These lecture notes were originally developed by late Prof. J. K. Wolf. UC San Diego Spring 2014 1 / 14 Statement
More informationCONVERGENCE THEOREMS FOR STRICTLY ASYMPTOTICALLY PSEUDOCONTRACTIVE MAPPINGS IN HILBERT SPACES. Gurucharan Singh Saluja
Opuscula Mathematica Vol 30 No 4 2010 http://dxdoiorg/107494/opmath2010304485 CONVERGENCE THEOREMS FOR STRICTLY ASYMPTOTICALLY PSEUDOCONTRACTIVE MAPPINGS IN HILBERT SPACES Gurucharan Singh Saluja Abstract
More informationThese outputs can be written in a more convenient form: with y(i) = Hc m (i) n(i) y(i) = (y(i); ; y K (i)) T ; c m (i) = (c m (i); ; c m K(i)) T and n
Binary Codes for synchronous DS-CDMA Stefan Bruck, Ulrich Sorger Institute for Network- and Signal Theory Darmstadt University of Technology Merckstr. 25, 6428 Darmstadt, Germany Tel.: 49 65 629, Fax:
More informationLow Density Parity Check (LDPC) Codes and the Need for Stronger ECC. August 2011 Ravi Motwani, Zion Kwok, Scott Nelson
Low Density Parity Check (LDPC) Codes and the Need for Stronger ECC August 2011 Ravi Motwani, Zion Kwok, Scott Nelson Agenda NAND ECC History Soft Information What is soft information How do we obtain
More informationELEMENTS O F INFORMATION THEORY
ELEMENTS O F INFORMATION THEORY THOMAS M. COVER JOY A. THOMAS Preface to the Second Edition Preface to the First Edition Acknowledgments for the Second Edition Acknowledgments for the First Edition x
More information