An Interpretation of Identification Entropy
4198 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 9, SEPTEMBER 2006

An Interpretation of Identification Entropy

Rudolf Ahlswede and Ning Cai, Senior Member, IEEE

Abstract—After Ahlswede introduced identification source coding, he discovered identification entropy and demonstrated that it plays a role analogous to classical entropy in Shannon's noiseless source coding. We now give even more insight into this functional by interpreting its two factors.

Index Terms—Identification entropy, operational justification, source coding for identification.

I. INTRODUCTION

A. Terminology

Identification in source coding started in [3]. Then identification entropy was discovered and its operational significance in noiseless source coding was demonstrated in [4]. Familiarity with that paper is helpful, but not necessary here. As far as possible we use its notation. Differences come from the fact that we now use a -ary coding alphabet, whereas earlier only the case was considered and it was remarked only that all results generalize to arbitrary . In particular the identification entropy for the source , abbreviated as ID-entropy, has the form (1.1). Shannon (in 1948) has shown that a source with output satisfying can be encoded in a prefix code such that , for the -ary entropy, where is the length of . We use a prefix code, abbreviated as PC, for another purpose, namely, noiseless identification: every user who wants to know whether a of his interest is the actual source output or not can consider the random variable (RV) with , check whether coincides with in the first, second, etc., letter, and stop when the first different letter occurs or when the codeword ends. Let be the expected number of checkings, if code is used. Further quantities are the expected number of checkings in the worst case for a best code (1.4); the average number of expected checkings, if code is used, and also the average number of expected checkings for a best code; a natural special case is the mean number of expected checkings (1.5). Finally, if the s are chosen by an RV independent of , defined by (1.6), we consider . Related quantities are (1.2), which
equals (1.7). Another special case of some intuitive appeal is the case . Here we write (1.8). It is known that Huffman codes minimize the expected code length of a PC. This is not always the case for the other quantities in identification. In this correspondence an important incentive comes from Theorem 4 of [1] for , that is, with -powers as probabilities. Here the assumption means that there is a complete prefix code (i.e., equality holds in Kraft's inequality).

B. A Terminology Involving Proper Common Prefixes

The quantity is defined below also for the case of not necessarily independent . It is conveniently described in a terminology involving proper common prefixes. For an encoding , we define for two words the number of proper common prefixes including the empty word, which equals the length of the maximal proper common prefix plus one. For example, (since the proper common prefixes are ). Now with encoding for the PC and RVs , measures the time steps it takes to decide whether are equal, that is, the checking time or waiting time, which we denote by (1.9); that is, the expected number of checkings for a person in the worst case, if code is used. Clearly, we can write the expected waiting time (1.3) as (1.10). It is readily verified that, for independent , (1.11).

Manuscript received April 13, 2005; revised April 4, 2006. Both authors are with the Fakultät für Mathematik, University of Bielefeld, D- Bielefeld, Germany (hollmann@math.uni-bielefeld.de; cai@mathematik.uni-bielefeld.de). Communicated by S. A. Savari, Associate Editor for Source Coding. Digital Object Identifier /TIT

We now give another description. For a word and a code , define as the subset of whose codewords have proper prefix (1.12),
and its indicator function . Now, by (1.11), (1.13). Furthermore, for sources with the block code , the uniform distribution on achieves¹. Another interesting observation on (1.20) is that, as the th component of (resp. ) is (resp. ), application of the Cauchy–Schwarz inequality to (1.20) yields (1.22), where equality holds iff for all .

C. Matrix Notation

Next we look at the double infinite matrix (1.14) and its minor labeled by sequences in . Henceforth we assume that are independent and have distributions . We can then use (1.11). For a prefix code , the distribution induces the distribution , when . We state this in equivalent form as follows.

Lemma 1: (1.15), where equality holds iff for all , which implies . This suggests to introduce (1.16) and (1.23).

Viewing both as row vectors and the corresponding column vector, (1.11) can be written in the form (1.17). It is clear from (1.10) that a noncomplete prefix code, that is, one for which the Kraft sum is smaller than one, can be improved for identification by shortening a suitable codeword. Hence, an optimal ID source code is necessarily complete. In such a code one can replace by its submatrix (1.19). This implies (1.20), where are row vectors obtained by deleting the components . Sometimes, the expressions (1.17) or (1.19) are more convenient for the investigation of . For example, it is easy to see that , and therefore also , are positive semidefinite. Indeed, let (resp. ) be a matrix whose rows are labeled by sequences in (resp. ) and whose columns are labeled by sequences in (resp. the empty sequence ) such that its -entry is one if y is a proper prefix of and zero otherwise (1.21); hence are positive semidefinite. Therefore, by (1.19), is -convex in . We can view as a measure of similarity of the sources with respect to the code. Intuitively, we feel that for a good code the source and user distributions should be very dissimilar, because then the user waits less time until he knows that the output of is not what he wants. This idea will be used later for code construction. Actually, it is clear even in the general case where are
not necessarily independent. To simplify the discussion, we assume here that the alphabet is binary, i.e., . Then the first bit of a codeword partitions the source into two parts , where . By (1.13), to minimize one has to choose a partition such that the s are small simultaneously. To construct a good code one can continue this line: partition into s such that are as small as possible, and so on. When are independent, the requirement for a good code is that the difference between is large. We call this the LOCAL UNBALANCE PRINCIPLE, in contrast to the GLOBAL BALANCE PRINCIPLE below. Another extremal case is that are equal with probability one; in this case one may never use the unbalance principle. However, in this case the identification of the source makes no sense: the user knows that his output definitely comes! But still we can investigate the problem by assuming that with high probability. More specifically, we consider the limit of a sequence of random variables such that converges to

¹A proof is given in the forthcoming PhD dissertation on L-identification for sources written by C. Heup at the Department of Mathematics of the University of Bielefeld, Bielefeld, Germany.
in probability. Then it follows from Proposition 1 that converges to the average length of codewords, the classical object in source coding! In this sense, identification for sources is a generalization of source coding (data compression). One of the discoveries of [4] is that the ID-entropy is a lower bound to . In Section II we repeat the original proof, and we give in Section III another proof of this fact via two basic tools, Lemmas 3 and 4, where is the distribution of a memoryless source. It provides a clear information-theoretical meaning of the two factors of ID-entropy. Next we consider in Section IV sufficient and necessary conditions for a prefix code to achieve the ID-entropy lower bound. Quite surprisingly, it turns out that the ID-entropy bound for the ID-time is achieved by a variable-length code iff the Shannon entropy bound for the average length of codewords is achieved by the same code (Theorem 2). Finally, we end the correspondence in Section V with a global balance principle to find good codes (Theorem 3).

II. AN OPERATIONAL JUSTIFICATION OF ID-ENTROPY AS LOWER BOUND FOR

Recall from the Introduction that . We repeat the first main result from [4]. Central in our derivation is a proof by induction based on a decomposition formula for trees. Starting from the root, a binary tree goes via to the subtree and via to the subtree , with sets of leaves , respectively. A code can be viewed as a tree, where corresponds to the set of codewords. The leaves are labeled so that . Using probabilities , for the induction step, for any code we use the decomposition formula in Lemma 2 and, of course, the desired inequality as induction hypothesis, where the grouping identity is used for the equality. This holds for every and therefore also for . The approach readily extends to the -ary case.

III. AN ALTERNATIVE PROOF OF THE ID-ENTROPY LOWER BOUND FOR

First, we establish Lemma 3 below, which holds for the more general case. Let be a discrete memoryless correlated source with a generic pair of variables .
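As a concrete aside (our own illustration, not part of the correspondence), the checking-time quantities of Section I can be sketched in a few lines of code; the alphabet, code, and distributions below are hypothetical examples.

```python
# Illustrative sketch: the checking/waiting time between two codewords is the
# number of proper common prefixes including the empty word, i.e. the length
# of the longest common proper prefix plus one (and the full codeword length
# when the two words are equal).

def waiting_time(cu: str, cv: str) -> int:
    """Number of letter checks until the user can decide whether cu == cv."""
    k = 0
    for a, b in zip(cu, cv):
        k += 1
        if a != b:      # first differing letter: decision reached
            return k
    return k            # equal codewords: all letters were checked

def expected_waiting_time(code, P, Q):
    """Expected number of checkings: sum over u, v of P(u) Q(v) w(c(u), c(v))."""
    return sum(P[u] * Q[v] * waiting_time(code[u], code[v])
               for u in code for v in code)

# A complete binary prefix code on three source letters (hypothetical example).
code = {"a": "0", "b": "10", "c": "11"}
P = Q = {"a": 1/3, "b": 1/3, "c": 1/3}
print(expected_waiting_time(code, P, Q))   # 13/9 ~ 1.444
```

For this complete binary prefix code under uniform source and user distributions, the expected number of checkings works out to 13/9.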
Again, serves as the (random) source and as the random user. For a given code , let be the code obtained by encoding the components of a sequence iteratively; that is, for all .

Lemma 3: (3.1), (3.2) and, therefore, (3.3).

Proof: Since we can give the decomposition in the following Lemma 2.

Lemma 2: [4] For a code , .

Then (3.3) follows from (3.2) immediately by the summation formula for geometric series. To show (3.2), we first define for all the random variables (3.4), and otherwise. This readily yields the following result.

Theorem 1: [4] For every source , .

For convenience of notation we let be a constant. Further, we let be the waiting time of the random user in the th block. Conditional on , it is defined like in (1.9); conditional on , obviously , because the random user has made his decision before the th step. Moreover, by the definition of , (3.5).

Proof: We proceed by induction on . The base case can be established as follows. For , but , for any ; consequently, (3.6)
where (3.6) holds in case , because the random user has to wait for the first outcome. Therefore it follows that . Then we define and let be the common th prefix of the s and the s in . That is, we consider all sharing the same th prefix in as a single element. Obviously (3.13), as we wanted to show. Next we consider the case where are independent and identically distributed with distribution , so that (3.7). Finally, let be random variables for the new source and new random user with distribution , and let be a random variable such that . Then both are larger than , and otherwise. More specifically, we are looking for a lower bound on for all prefix codes over .

Lemma 4: For all there exists an such that for sufficiently large , all positive integers , and all prefix codes over .

Proof: For given we choose such that, for a sufficiently large and the familiar sets of typical sequences , for all , (3.8), (3.9) and (3.14), where is the random waiting time and is the common conditional distribution of given , given ; i.e., is the restriction of to . Notice that is a block code of length . In order to bound , we extend to a set of cardinality and, in the case of necessity, assign zero probabilities and a codeword of length not in . This little modification obviously does not change the value of . Thus, we denote the uniform distribution over the extended set by . Since is a prefix code, (3.10) and (3.11); we have , where is a bijective block code. It is clear that iff the length of is smaller than , (3.15). However, (3.8) implies that . Then it follows from (1.13) that (3.16). This together with (3.11) yields (3.12). Next, for the distribution and the code over , we construct a related source and a code over as follows. The new set contains its elements and the new -coding is . Now we define the additional elements in with its . We partition into subsets according to the th prefix and use letter to represent , putting the set into , so that . Finally, we combine (3.12), (3.13), (3.14), (3.15) and (3.16); Lemma 4 follows.

An immediate consequence is the following corollary.

Corollary 1: (3.17). Furthermore, for independent and identically
distributed random variables with distribution , we have from (3.3) and (3.17) the ID-entropy bound.
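The bound just obtained can be checked numerically. The sketch below (our illustration; all names are hypothetical) uses the q-ary ID-entropy of [4], H_ID(P) = q/(q-1) * (1 - sum_u P(u)^2), and compares it with the expected waiting time of a small complete binary prefix code.

```python
# Numerical sanity check of the ID-entropy lower bound: for a prefix code c
# the expected waiting time under P (source and user identically distributed)
# is bounded below by H_ID(P) = q/(q-1) * (1 - sum_u P(u)^2).

def waiting_time(cu, cv):
    k = 0
    for a, b in zip(cu, cv):
        k += 1
        if a != b:
            return k
    return k

def expected_waiting_time(code, P):
    return sum(P[u] * P[v] * waiting_time(code[u], code[v])
               for u in code for v in code)

def id_entropy(P, q=2):
    return q / (q - 1) * (1 - sum(p * p for p in P.values()))

code = {"a": "0", "b": "10", "c": "11"}    # complete binary prefix code

uniform = {"a": 1/3, "b": 1/3, "c": 1/3}
print(expected_waiting_time(code, uniform), id_entropy(uniform))
# roughly 1.444 >= 1.333: the bound holds with slack.

dyadic = {"a": 1/2, "b": 1/4, "c": 1/4}
print(expected_waiting_time(code, dyadic), id_entropy(dyadic))
# 1.25 == 1.25: the bound is met with equality.
```

Note how the bound is met with equality by the dyadic distribution, for which the same code also meets the Shannon bound on expected codeword length; this is the phenomenon made precise in Section IV.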
Corollary 2: (See [4, Theorem 2]) (3.18).

This derivation provides a clear information-theoretical meaning to the two factors in ID-entropy: is a universal lower bound on the ID-waiting time for a discrete memoryless source with an independent user having the same distribution, and is the cost paid for coding the source componentwise, leaving time for the random user in the following sense. Let us imagine the following procedure. At a unit of time, the random source outputs a symbol and the random user, who wants to know whether , checks whether coincides with his own symbol. He will end if not. Then the waiting time for him is with probability , which equals (3.19) in the case of independent and identically distributed random variables. (Actually (3.2) holds for all stationary sources; we choose a memoryless source for simplicity.) In general (3.3) has the form (3.20).

IV. SUFFICIENT AND NECESSARY CONDITIONS FOR A PREFIX CODE TO ACHIEVE THE ID-ENTROPY LOWER BOUND OF

Quite surprisingly, the ID-entropy bound for the ID-waiting time is achieved by a variable-length code iff the Shannon entropy bound for the average length of codewords is achieved by the same code. For the proof, we use a simple consequence of the Cauchy–Schwarz inequality, which states for two sequences of real numbers that , with equality iff for some constant , say, for all or for all . Choosing for all one has (4.1). Letting , we obtain a geometric distribution. The expected waiting time is , with equality iff (4.2).

Theorem 2: Let be a prefix code. Then the following statements are equivalent: i) ; ii) for all with , (4.3); iii) for all such that share the same prefix of length , (4.4) implies (4.5).

Proof: It is well known that i) is equivalent to . By monotonicity, the limit at the right-hand side, and therefore also at the left-hand side, exists and equals a positive finite or infinite value. When it is finite, one may replace in the first lines of (3.19) by , respectively, and obtain

i') For all , (4.6).

Notice that for i) the code is necessarily complete. We shall show that i')
⇒ ii) ⇒ iii) ⇒ i'), where is the expectation of the random leaving time. Thus (3.20) is rewritten for a stationary source as (3.21).

Ad i') ⇒ ii): For all with , the code , obtained by deleting the common prefix from all the codewords , is a complete code on , because is a complete code. That is, (3.22). Now the information-theoretical meaning of (3.22) is quite clear. One encodes a source with alphabet component by component by a variable-length code. The first term at the right-hand side of (3.22) is the expected waiting time in a block and the second term is the expected waiting time for different ; consequently, by (4.6)
Ad ii) ⇒ iii): Suppose (4.3) holds for all ; we prove iii) by induction on . In case , both sides of (4.5) are one. Assume iii) holds for all codes with , and let . Let be as in the proof of (1.11) and let be the prefix code for the source with alphabet and distribution such that for all . Then (4.3) and (4.4) imply that ii) holds for all and all . Next, we apply (4.3) to all with and obtain (4.7), which with (4.7) yields (4.8) for all . Moreover, by the induction hypothesis, (4.9) holds for all , as by (4.3), (4.10). Finally, like in the proof of (1.11), we have (4.11), where the second equality holds by (4.10), the third equality holds because is a partition of , and the fourth equality follows from (4.9) and the definition of ; that is, (4.5) holds.

Ad iii) ⇒ i'): Again we proceed by induction on the maximum length of codewords. Suppose first that for a code . Then, applying (4.2) to the ID-entropy, we get , with equality iff is the uniform distribution. On the other hand, since , the equality holds iff . Then (4.5) holds iff is uniform, i.e., (4.6). Assume now that the implication iii) ⇒ i') holds for all codes with maximum lengths and that is a prefix code of maximum length . Without loss of generality, we can assume that is complete, because otherwise we can add dummy symbols with probability zero and assign to them suitable codewords so that the Kraft sum equals one, but this does not change equality (4.5). Having completeness, we can assume that there are symbols in with such that share a prefix of length . Let be new symbols not in the original and consider the probability distribution defined by (4.12) for some . Next we define a prefix code for the source by using as follows: . Then . Therefore, by the induction hypothesis, (4.13) and (4.14) for some and all (say). Furthermore, it follows from (4.2) and the definition of that , with equality iff ; that is, (4.15). By (4.13), the first inequality holds iff ; it follows from (4.2) that the last inequality holds with equality iff . In order to have
the two inequalities in (4.15) must be equalities. However, this is equivalent with (4.6), i.e., i').

V. A GLOBAL BALANCE PRINCIPLE TO FIND GOOD CODES

In case are independent and identically distributed, there is no gain in using the local unbalance principle (LUP). But in this case, Corollary 1 and (4.2) provide a way to find a good code. We first rewrite Corollary 1 as (5.1), by the assumptions on with their distribution . Notice that in case , (5.1) is a constant and is minimized by choosing the s uniformly. This gives us a global balance principle (GBP) for finding good codes. We shall see the roles of both the LUP and the GBP in the proof of the following coding theorem for discrete memoryless sources (DMSs).

Theorem 3: For a DMS with generic distribution , i.e., the generic random variables are independent, (5.2).

Proof: Trivially, by Corollary 2, is a lower bound to . Hence, we only have to construct codes to achieve asymptotically the bounds in (5.2).

Case : We choose a so that for sufficiently large , (5.3) and (5.4). Partition into two parts such that . To simplify matters, we assume . This does not lose generality since enlarging the alphabet cannot make things worse. Let . Then we define a code by and show that is arbitrarily close to one if is sufficiently large. Actually, it immediately follows from Proposition 1 that and, therefore, (5.5), where the second inequality holds because as and the last inequality follows from (5.4).

Case : Now we let . For , let be the set of -types ( -empirical distributions) on with . Then there is a positive such that the empirical distribution of the output (resp. ) is in with probability larger than . Next we choose an integer such that (5.6). Label sequences in by and let be a mapping from to , where , as follows. If has type in and has an index with -ary representation, i.e., , then let (5.7). If the type of is not in , we arbitrarily choose a sequence in as . For any fixed , let be the set of sequences in such that is a prefix of . Then it is not hard to see that for all with . More
specifically, for all with . Let (here it does not matter whether or not). Thus, we partition into parts as . By the asymptotic equipartition property (AEP), the difference of the conditional probability of the event that the output of is in or
given that the type of is in is not larger than . With the notation in Corollary 1, with , for all we denote . Recalling that with probability , has type in , and the assumption that has the same distribution as , we obtain that for all . Then we have, for all , (5.8). Now we apply Corollary 1 to estimate , which implies that for all , (5.8) when . Recall that is a function from to and that, by the definition of , is actually the inverse image of under ; i.e., . Let and furthermore let be a function on such that its restriction on is an injection into , for all . Then our decoding function is defined as (5.9). To estimate , we introduce an auxiliary source with alphabet and probability distribution such that for all , (5.12). Moreover, by the definition of , . We divide the waiting time for identification with code into two parts according to the two components in (5.9), and we let be the random waiting times of the two parts, respectively. Now let be a binary random variable such that and otherwise. Then, in (5.12) we have shown that ; consequently, (5.13). Let (5.10) be the code for the auxiliary source with encoding function . Then we have (5.11), the desired inequality. Finally, by combining (5.10), (5.11), (5.12), (5.13) with the choice of in (5.6), we have that .

It is interesting that the limits of the waiting time of ID-codes on the left-hand side of (5.2) are independent of the generic distributions and only depend on whether they are equal. In the case that they are not equal, the limit is even independent of the alphabet size. In particular, in the case , we have seen in the proof that the key step is how to distribute the first symbol, and the LUP is applied in the second step. Moreover, for a good code the random user with exponentially vanishing probability needs to wait for the second symbol. So the remaining parts of codewords are not so important.
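The role of dissimilarity between source and user distributions can be illustrated numerically (a hypothetical example of ours, not taken from the proof): with the same prefix code, a user distribution very different from the source distribution yields a shorter expected waiting time, since the user can usually stop after the first letter.

```python
# Hypothetical illustration of the unbalance intuition: keep the code fixed
# and compare the expected waiting time for Q = P against a very different Q.

def waiting_time(cu, cv):
    k = 0
    for a, b in zip(cu, cv):
        k += 1
        if a != b:
            return k
    return k

def L(code, P, Q):
    """Expected waiting time sum_{u,v} P(u) Q(v) w(c(u), c(v))."""
    return sum(P[u] * Q[v] * waiting_time(code[u], code[v])
               for u in code for v in code)

code = {"a": "0", "b": "10", "c": "11"}
P = {"a": 1/3, "b": 1/3, "c": 1/3}

print(L(code, P, P))                                  # ~1.444 (Q equal to P)
print(L(code, P, {"a": 0.98, "b": 0.01, "c": 0.01}))  # ~1.013 (dissimilar Q)
```

With the dissimilar user distribution the user almost always holds the short codeword "0", so with high probability the very first letter already settles the identification question.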
Similarly, in the case , where we use the GBP instead of the LUP, the key part of the codewords is a relatively small prefix (in the proof it is the th prefix); after that the user with exponentially small probability has to wait. Thus, again the remaining part of the codewords is less important.

APPENDIX I
COMMENTS ON GENERALIZED ENTROPIES

After the discovery of ID-entropies in [4], the work of Tsallis [13] and also [14] was brought to our attention. The equalities (1) and (2) in [14] are here (A1) and (A2). The letter used there corresponds to our letter , because for us gives the alphabet size. This is a generalization of Boltzmann's entropy. We have been told by several experts in physics that the operational significance of the quantities ( ) in statistical physics is not undisputed. In contrast, the significance of identification entropy was demonstrated in [4] (see Section II), which is formally close to, but essentially different from , for two reasons: always is uniquely determined and depends on the alphabet size!
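The formal closeness just mentioned can be verified numerically. The sketch below (our illustration; names are hypothetical) uses the standard Tsallis entropy S_q(P) = (1 - sum_u p_u^q)/(q - 1) and the Havrda-Charvát entropy of type 2, which for a binary coding alphabet coincides with the ID-entropy 2(1 - sum_u p_u^2).

```python
# Illustrative comparison: for a binary coding alphabet the ID-entropy
# 2 * (1 - sum p^2) coincides with the Havrda-Charvat entropy of type 2
# and equals 2 * S_2, where S_q is the Tsallis entropy.

def tsallis(P, q):
    return (1 - sum(p ** q for p in P)) / (q - 1)

def havrda_charvat_type2(P):
    return 2 * (1 - sum(p * p for p in P))

def id_entropy_binary(P):
    return 2 * (1 - sum(p * p for p in P))

P = [0.5, 0.25, 0.25]
print(id_entropy_binary(P))        # 1.25
print(havrda_charvat_type2(P))     # 1.25
print(2 * tsallis(P, q=2))         # 1.25
```

For other alphabet sizes the ID-entropy carries the factor q/(q-1) and therefore depends on the alphabet size, which is one of the two differences stressed above.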
We also have discussed the coding-theoretical meanings of the factors. More recently, we learned from referees that already in 1967 Havrda and Charvát [7] introduced the entropies of type (A5), where is the Boltzmann/Shannon entropy. So, it is reasonable to define (A1) for any real . Notice that , which can be named . One readily verifies that for product distributions of independent random variables. This is a generalization of the BGS-entropy different from the Rényi entropies of order (which according to [2] were introduced by Schützenberger [9]) given by (A2). Since in all cases , respectively, correspond to superadditivity, additivity, and subadditivity (also called, for the purposes in statistical physics, superextensivity, extensivity, and subextensivity). We recall the grouping identity of [4]: for a partition of , . Comparison shows that , where . This implies (A3). So, while the entropies of order and the entropies of type are different, we see that the bijection connects them, since . Therefore, we may ask what the advantage is in dealing with entropies of type . We meanwhile also learned that the book [1] gives a comprehensive discussion. Also Daróczy's contribution [6], where type is named degree, gives an enlightening analysis. Note that Rényi entropies are additive, but not subadditive (except for ) and not recursive; they have neither the branching property nor the sum property, that is, there exists a measurable function on such that we get . Entropies of type , on the other hand, are not additive but do have the subadditivity property and the sum property, and furthermore are additive of degree : (A4), which is (A2),
strong additive of degree : , and recursive of degree : , with . (In consequence, entropies of type also have the branching property.) It is clear now that for a binary alphabet the ID-entropy is exactly the entropy of type . However, prior to [13] there were hardly any applications or operational justifications of the entropy of type . Moreover, the -ary case did not exist at all, and therefore the name ID-entropy is well justified. We feel that it must be said that in many papers (with several coauthors) Tsallis at least developed ideas to promote non-standard-equilibrium theory in Statistical Physics using generalized entropies and generalized concepts of inner energy. Our attention has been drawn also to the papers [5], [11], [12] with possibilities of connections to our work. Recently, clear-cut progress was made by C. Heup in his forthcoming thesis with a generalization of ID-entropy motivated by L-identification.

REFERENCES

[1] S. Abe, "Axioms and uniqueness theorem for Tsallis entropy," Phys. Lett. A, vol. 271, no. 1-2, pp. 74-79, 2000.
[2] J. Aczél and Z. Daróczy, On Measures of Information and their Characterizations, Mathematics in Science and Engineering, vol. 115. New York/London: Academic, 1975.
[3] R. Ahlswede, "General theory of information transfer," Discr. Appl. Math., Special Issue "General Theory of Information Transfer and Combinatorics," to be published.
[4] R. Ahlswede, "Identification entropy," in General Theory of Information Transfer and Combinatorics, Report on a Research Project at the ZIF (Center of Interdisciplinary Studies) in Bielefeld, Oct. 1, 2002-Aug. 31, 2004, R. Ahlswede, Ed., with the assistance of L. Bäumer and N. Cai, Lecture Notes in Computer Science, no. 4123, to be published.
[5] L. L. Campbell, "A coding theorem and Rényi's entropy," Inf. Contr., vol. 8, 1965.
[6] Z. Daróczy, "Generalized information functions," Inf. Contr., vol. 16, pp. 36-51, 1970.
[7] J. Havrda and F. Charvát, "Quantification method of classification processes, concept of structural -entropy," Kybernetika (Prague), vol. 3, pp. 30-35, 1967.
[8] A. Rényi, "On measures of entropy and information," in Proc. 4th Berkeley Symp. Mathematical Statistics and Probability, vol. I. Berkeley, CA: Univ. California Press, 1961.
[9] M. P. Schützenberger, "Contribution aux applications statistiques de la théorie de l'information," Publ. Inst. Statist. Univ. Paris, vol. 3, no. 1-2, 1954.
[10] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, 1948.
[11] B. D. Sharma and H. C. Gupta, "Entropy as an optimal measure," in Information Theory (Proc. Int. CNRS Colloq., Cachan, 1977), Colloq. Internat. CNRS 276. Paris, France: CNRS, 1978.
[12] F. Topsøe, "Game-theoretical equilibrium, maximum entropy and minimum information discrimination," in Maximum Entropy and Bayesian Methods, Fund. Theor. Phys., vol. 53, pp. 15-23. Kluwer Acad., 1993.
[13] C. Tsallis, "Possible generalization of Boltzmann-Gibbs statistics," J. Statist. Phys., vol. 52, no. 1-2, 1988.
[14] C. Tsallis, R. S. Mendes, and A. R. Plastino, "The role of constraints within generalized nonextensive statistics," Phys. A, vol. 261, 1998.
Upper Bounds on the Capacity of Binary Intermittent Communication Mostafa Khoshnevisan and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame, Indiana 46556 Email:{mhoshne,
More informationA Generalized Fuzzy Inaccuracy Measure of Order ɑ and Type β and Coding Theorems
International Journal of Fuzzy Mathematics and Systems. ISSN 2248-9940 Volume 4, Number (204), pp. 27-37 Research India Publications http://www.ripublication.com A Generalized Fuzzy Inaccuracy Measure
More informationThe Duality Between Information Embedding and Source Coding With Side Information and Some Applications
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 5, MAY 2003 1159 The Duality Between Information Embedding and Source Coding With Side Information and Some Applications Richard J. Barron, Member,
More information1 Ex. 1 Verify that the function H(p 1,..., p n ) = k p k log 2 p k satisfies all 8 axioms on H.
Problem sheet Ex. Verify that the function H(p,..., p n ) = k p k log p k satisfies all 8 axioms on H. Ex. (Not to be handed in). looking at the notes). List as many of the 8 axioms as you can, (without
More informationOn Competitive Prediction and Its Relation to Rate-Distortion Theory
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 12, DECEMBER 2003 3185 On Competitive Prediction and Its Relation to Rate-Distortion Theory Tsachy Weissman, Member, IEEE, and Neri Merhav, Fellow,
More informationEE5139R: Problem Set 4 Assigned: 31/08/16, Due: 07/09/16
EE539R: Problem Set 4 Assigned: 3/08/6, Due: 07/09/6. Cover and Thomas: Problem 3.5 Sets defined by probabilities: Define the set C n (t = {x n : P X n(x n 2 nt } (a We have = P X n(x n P X n(x n 2 nt
More informationUC Riverside UC Riverside Previously Published Works
UC Riverside UC Riverside Previously Published Works Title Soft-decision decoding of Reed-Muller codes: A simplied algorithm Permalink https://escholarship.org/uc/item/5v71z6zr Journal IEEE Transactions
More informationLecture 4 : Adaptive source coding algorithms
Lecture 4 : Adaptive source coding algorithms February 2, 28 Information Theory Outline 1. Motivation ; 2. adaptive Huffman encoding ; 3. Gallager and Knuth s method ; 4. Dictionary methods : Lempel-Ziv
More informationDecomposing Bent Functions
2004 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 8, AUGUST 2003 Decomposing Bent Functions Anne Canteaut and Pascale Charpin Abstract In a recent paper [1], it is shown that the restrictions
More informationDuality Between Channel Capacity and Rate Distortion With Two-Sided State Information
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 48, NO. 6, JUNE 2002 1629 Duality Between Channel Capacity Rate Distortion With Two-Sided State Information Thomas M. Cover, Fellow, IEEE, Mung Chiang, Student
More informationOptimal Block-Type-Decodable Encoders for Constrained Systems
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 5, MAY 2003 1231 Optimal Block-Type-Decodable Encoders for Constrained Systems Panu Chaichanavong, Student Member, IEEE, Brian H. Marcus, Fellow, IEEE
More informationConvexity/Concavity of Renyi Entropy and α-mutual Information
Convexity/Concavity of Renyi Entropy and -Mutual Information Siu-Wai Ho Institute for Telecommunications Research University of South Australia Adelaide, SA 5095, Australia Email: siuwai.ho@unisa.edu.au
More informationLecture 16. Error-free variable length schemes (contd.): Shannon-Fano-Elias code, Huffman code
Lecture 16 Agenda for the lecture Error-free variable length schemes (contd.): Shannon-Fano-Elias code, Huffman code Variable-length source codes with error 16.1 Error-free coding schemes 16.1.1 The Shannon-Fano-Elias
More informationOn Lossless Coding With Coded Side Information Daniel Marco, Member, IEEE, and Michelle Effros, Fellow, IEEE
3284 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 7, JULY 2009 On Lossless Coding With Coded Side Information Daniel Marco, Member, IEEE, Michelle Effros, Fellow, IEEE Abstract This paper considers
More informationAsymptotic redundancy and prolixity
Asymptotic redundancy and prolixity Yuval Dagan, Yuval Filmus, and Shay Moran April 6, 2017 Abstract Gallager (1978) considered the worst-case redundancy of Huffman codes as the maximum probability tends
More informationThe Poisson Channel with Side Information
The Poisson Channel with Side Information Shraga Bross School of Enginerring Bar-Ilan University, Israel brosss@macs.biu.ac.il Amos Lapidoth Ligong Wang Signal and Information Processing Laboratory ETH
More informationInformation Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay
Information Theory and Coding Prof. S. N. Merchant Department of Electrical Engineering Indian Institute of Technology, Bombay Lecture - 13 Competitive Optimality of the Shannon Code So, far we have studied
More informationIN this paper, we study the problem of universal lossless compression
4008 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 52, NO. 9, SEPTEMBER 2006 An Algorithm for Universal Lossless Compression With Side Information Haixiao Cai, Member, IEEE, Sanjeev R. Kulkarni, Fellow,
More informationWITH advances in communications media and technologies
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 45, NO. 6, SEPTEMBER 1999 1887 Distortion-Rate Bounds for Fixed- Variable-Rate Multiresolution Source Codes Michelle Effros, Member, IEEE Abstract The source
More informationL-Identification for Sources
L-Identification for Sources Dissertation zur Erlangung des Doktorgrades der Fakultät für Mathematik der Universität Bielefeld vorgelegt von hristian Heup November 2006 Acknowledgement I would like to
More informationEntropy measures of physics via complexity
Entropy measures of physics via complexity Giorgio Kaniadakis and Flemming Topsøe Politecnico of Torino, Department of Physics and University of Copenhagen, Department of Mathematics 1 Introduction, Background
More informationInformation Theory with Applications, Math6397 Lecture Notes from September 30, 2014 taken by Ilknur Telkes
Information Theory with Applications, Math6397 Lecture Notes from September 3, 24 taken by Ilknur Telkes Last Time Kraft inequality (sep.or) prefix code Shannon Fano code Bound for average code-word length
More informationA Generalized Uncertainty Principle and Sparse Representation in Pairs of Bases
2558 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 48, NO 9, SEPTEMBER 2002 A Generalized Uncertainty Principle Sparse Representation in Pairs of Bases Michael Elad Alfred M Bruckstein Abstract An elementary
More informationLecture 3 : Algorithms for source coding. September 30, 2016
Lecture 3 : Algorithms for source coding September 30, 2016 Outline 1. Huffman code ; proof of optimality ; 2. Coding with intervals : Shannon-Fano-Elias code and Shannon code ; 3. Arithmetic coding. 1/39
More informationOn Scalable Coding in the Presence of Decoder Side Information
On Scalable Coding in the Presence of Decoder Side Information Emrah Akyol, Urbashi Mitra Dep. of Electrical Eng. USC, CA, US Email: {eakyol, ubli}@usc.edu Ertem Tuncel Dep. of Electrical Eng. UC Riverside,
More informationYAMAMOTO [1] considered the cascade source coding
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 58, NO 6, JUNE 2012 3339 Cascade Triangular Source Coding With Side Information at the First Two Nodes Haim H Permuter, Member, IEEE, Tsachy Weissman, Senior
More informationCSCI 2570 Introduction to Nanocomputing
CSCI 2570 Introduction to Nanocomputing Information Theory John E Savage What is Information Theory Introduced by Claude Shannon. See Wikipedia Two foci: a) data compression and b) reliable communication
More informationMultiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets
Multiaccess Channels with State Known to One Encoder: A Case of Degraded Message Sets Shivaprasad Kotagiri and J. Nicholas Laneman Department of Electrical Engineering University of Notre Dame Notre Dame,
More informationTight Upper Bounds on the Redundancy of Optimal Binary AIFV Codes
Tight Upper Bounds on the Redundancy of Optimal Binary AIFV Codes Weihua Hu Dept. of Mathematical Eng. Email: weihua96@gmail.com Hirosuke Yamamoto Dept. of Complexity Sci. and Eng. Email: Hirosuke@ieee.org
More information6196 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 9, SEPTEMBER 2011
6196 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 9, SEPTEMBER 2011 On the Structure of Real-Time Encoding and Decoding Functions in a Multiterminal Communication System Ashutosh Nayyar, Student
More information3F1 Information Theory, Lecture 3
3F1 Information Theory, Lecture 3 Jossy Sayir Department of Engineering Michaelmas 2011, 28 November 2011 Memoryless Sources Arithmetic Coding Sources with Memory 2 / 19 Summary of last lecture Prefix-free
More informationChapter 2 Date Compression: Source Coding. 2.1 An Introduction to Source Coding 2.2 Optimal Source Codes 2.3 Huffman Code
Chapter 2 Date Compression: Source Coding 2.1 An Introduction to Source Coding 2.2 Optimal Source Codes 2.3 Huffman Code 2.1 An Introduction to Source Coding Source coding can be seen as an efficient way
More informationIEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 11, NOVEMBER On the Performance of Sparse Recovery
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 57, NO. 11, NOVEMBER 2011 7255 On the Performance of Sparse Recovery Via `p-minimization (0 p 1) Meng Wang, Student Member, IEEE, Weiyu Xu, and Ao Tang, Senior
More informationAn Approximation Algorithm for Constructing Error Detecting Prefix Codes
An Approximation Algorithm for Constructing Error Detecting Prefix Codes Artur Alves Pessoa artur@producao.uff.br Production Engineering Department Universidade Federal Fluminense, Brazil September 2,
More informationELEMENTS O F INFORMATION THEORY
ELEMENTS O F INFORMATION THEORY THOMAS M. COVER JOY A. THOMAS Preface to the Second Edition Preface to the First Edition Acknowledgments for the Second Edition Acknowledgments for the First Edition x
More informationInformation Theory, Statistics, and Decision Trees
Information Theory, Statistics, and Decision Trees Léon Bottou COS 424 4/6/2010 Summary 1. Basic information theory. 2. Decision trees. 3. Information theory and statistics. Léon Bottou 2/31 COS 424 4/6/2010
More information1 Background on Information Theory
Review of the book Information Theory: Coding Theorems for Discrete Memoryless Systems by Imre Csiszár and János Körner Second Edition Cambridge University Press, 2011 ISBN:978-0-521-19681-9 Review by
More informationSource Coding. Master Universitario en Ingeniería de Telecomunicación. I. Santamaría Universidad de Cantabria
Source Coding Master Universitario en Ingeniería de Telecomunicación I. Santamaría Universidad de Cantabria Contents Introduction Asymptotic Equipartition Property Optimal Codes (Huffman Coding) Universal
More informationA NOTE ON STRATEGY ELIMINATION IN BIMATRIX GAMES
A NOTE ON STRATEGY ELIMINATION IN BIMATRIX GAMES Donald E. KNUTH Department of Computer Science. Standford University. Stanford2 CA 94305. USA Christos H. PAPADIMITRIOU Department of Computer Scrence and
More informationComplexity Theory VU , SS The Polynomial Hierarchy. Reinhard Pichler
Complexity Theory Complexity Theory VU 181.142, SS 2018 6. The Polynomial Hierarchy Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität Wien 15 May, 2018 Reinhard
More informationarxiv: v1 [cs.it] 5 Sep 2008
1 arxiv:0809.1043v1 [cs.it] 5 Sep 2008 On Unique Decodability Marco Dalai, Riccardo Leonardi Abstract In this paper we propose a revisitation of the topic of unique decodability and of some fundamental
More informationOutline. Complexity Theory EXACT TSP. The Class DP. Definition. Problem EXACT TSP. Complexity of EXACT TSP. Proposition VU 181.
Complexity Theory Complexity Theory Outline Complexity Theory VU 181.142, SS 2018 6. The Polynomial Hierarchy Reinhard Pichler Institut für Informationssysteme Arbeitsbereich DBAI Technische Universität
More informationPerfect Two-Fault Tolerant Search with Minimum Adaptiveness 1
Advances in Applied Mathematics 25, 65 101 (2000) doi:10.1006/aama.2000.0688, available online at http://www.idealibrary.com on Perfect Two-Fault Tolerant Search with Minimum Adaptiveness 1 Ferdinando
More informationRényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 8, AUGUST 2010 3721 Rényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression Yihong Wu, Student Member, IEEE, Sergio Verdú,
More informationSHANNON S information measures refer to entropies, conditional
1924 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 43, NO. 6, NOVEMBER 1997 A Framework for Linear Information Inequalities Raymond W. Yeung, Senior Member, IEEE Abstract We present a framework for information
More informationOn the Cross-Correlation of a p-ary m-sequence of Period p 2m 1 and Its Decimated
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 58, NO 3, MARCH 01 1873 On the Cross-Correlation of a p-ary m-sequence of Period p m 1 Its Decimated Sequences by (p m +1) =(p +1) Sung-Tai Choi, Taehyung Lim,
More informationLecture 9 Polar Coding
Lecture 9 Polar Coding I-Hsiang ang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw December 29, 2015 1 / 25 I-Hsiang ang IT Lecture 9 In Pursuit of Shannon s Limit Since
More informationLOW-density parity-check (LDPC) codes were invented
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 54, NO 1, JANUARY 2008 51 Extremal Problems of Information Combining Yibo Jiang, Alexei Ashikhmin, Member, IEEE, Ralf Koetter, Senior Member, IEEE, and Andrew
More informationThe Optimal Fix-Free Code for Anti-Uniform Sources
Entropy 2015, 17, 1379-1386; doi:10.3390/e17031379 OPEN ACCESS entropy ISSN 1099-4300 www.mdpi.com/journal/entropy Article The Optimal Fix-Free Code for Anti-Uniform Sources Ali Zaghian 1, Adel Aghajan
More informationCoding for Discrete Source
EGR 544 Communication Theory 3. Coding for Discrete Sources Z. Aliyazicioglu Electrical and Computer Engineering Department Cal Poly Pomona Coding for Discrete Source Coding Represent source data effectively
More informationarxiv: v4 [cs.it] 17 Oct 2015
Upper Bounds on the Relative Entropy and Rényi Divergence as a Function of Total Variation Distance for Finite Alphabets Igal Sason Department of Electrical Engineering Technion Israel Institute of Technology
More informationECE Advanced Communication Theory, Spring 2009 Homework #1 (INCOMPLETE)
ECE 74 - Advanced Communication Theory, Spring 2009 Homework #1 (INCOMPLETE) 1. A Huffman code finds the optimal codeword to assign to a given block of source symbols. (a) Show that cannot be a Huffman
More informationThe Capacity of Finite Abelian Group Codes Over Symmetric Memoryless Channels Giacomo Como and Fabio Fagnani
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 5, MAY 2009 2037 The Capacity of Finite Abelian Group Codes Over Symmetric Memoryless Channels Giacomo Como and Fabio Fagnani Abstract The capacity
More informationWE study the capacity of peak-power limited, single-antenna,
1158 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 3, MARCH 2010 Gaussian Fading Is the Worst Fading Tobias Koch, Member, IEEE, and Amos Lapidoth, Fellow, IEEE Abstract The capacity of peak-power
More informationChain Independence and Common Information
1 Chain Independence and Common Information Konstantin Makarychev and Yury Makarychev Abstract We present a new proof of a celebrated result of Gács and Körner that the common information is far less than
More informationInformation Theory CHAPTER. 5.1 Introduction. 5.2 Entropy
Haykin_ch05_pp3.fm Page 207 Monday, November 26, 202 2:44 PM CHAPTER 5 Information Theory 5. Introduction As mentioned in Chapter and reiterated along the way, the purpose of a communication system is
More informationLecture 4 Noisy Channel Coding
Lecture 4 Noisy Channel Coding I-Hsiang Wang Department of Electrical Engineering National Taiwan University ihwang@ntu.edu.tw October 9, 2015 1 / 56 I-Hsiang Wang IT Lecture 4 The Channel Coding Problem
More informationASIGNIFICANT research effort has been devoted to the. Optimal State Estimation for Stochastic Systems: An Information Theoretic Approach
IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL 42, NO 6, JUNE 1997 771 Optimal State Estimation for Stochastic Systems: An Information Theoretic Approach Xiangbo Feng, Kenneth A Loparo, Senior Member, IEEE,
More informationJoint Universal Lossy Coding and Identification of Stationary Mixing Sources With General Alphabets Maxim Raginsky, Member, IEEE
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 5, MAY 2009 1945 Joint Universal Lossy Coding and Identification of Stationary Mixing Sources With General Alphabets Maxim Raginsky, Member, IEEE Abstract
More informationLectures 6, 7 and part of 8
Lectures 6, 7 and part of 8 Uriel Feige April 26, May 3, May 10, 2015 1 Linear programming duality 1.1 The diet problem revisited Recall the diet problem from Lecture 1. There are n foods, m nutrients,
More informationTight Bounds on Minimum Maximum Pointwise Redundancy
Tight Bounds on Minimum Maximum Pointwise Redundancy Michael B. Baer vlnks Mountain View, CA 94041-2803, USA Email:.calbear@ 1eee.org Abstract This paper presents new lower and upper bounds for the optimal
More informationSeminaar Abstrakte Wiskunde Seminar in Abstract Mathematics Lecture notes in progress (27 March 2010)
http://math.sun.ac.za/amsc/sam Seminaar Abstrakte Wiskunde Seminar in Abstract Mathematics 2009-2010 Lecture notes in progress (27 March 2010) Contents 2009 Semester I: Elements 5 1. Cartesian product
More informationLattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function
Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function Dinesh Krithivasan and S. Sandeep Pradhan Department of Electrical Engineering and Computer Science,
More informationChannel Polarization and Blackwell Measures
Channel Polarization Blackwell Measures Maxim Raginsky Abstract The Blackwell measure of a binary-input channel (BIC is the distribution of the posterior probability of 0 under the uniform input distribution
More informationUNIT I INFORMATION THEORY. I k log 2
UNIT I INFORMATION THEORY Claude Shannon 1916-2001 Creator of Information Theory, lays the foundation for implementing logic in digital circuits as part of his Masters Thesis! (1939) and published a paper
More informationIntro to Information Theory
Intro to Information Theory Math Circle February 11, 2018 1. Random variables Let us review discrete random variables and some notation. A random variable X takes value a A with probability P (a) 0. Here
More informationAn Alternative Proof for the Capacity Region of the Degraded Gaussian MIMO Broadcast Channel
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 58, NO. 4, APRIL 2012 2427 An Alternative Proof for the Capacity Region of the Degraded Gaussian MIMO Broadcast Channel Ersen Ekrem, Student Member, IEEE,
More informationCOMPLEMENTARY graph entropy was introduced
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 55, NO. 6, JUNE 2009 2537 On Complementary Graph Entropy Ertem Tuncel, Member, IEEE, Jayanth Nayak, Prashant Koulgi, Student Member, IEEE, Kenneth Rose, Fellow,
More informationarxiv:cs/ v1 [cs.it] 15 Sep 2005
On Hats and other Covers (Extended Summary) arxiv:cs/0509045v1 [cs.it] 15 Sep 2005 Hendrik W. Lenstra Gadiel Seroussi Abstract. We study a game puzzle that has enjoyed recent popularity among mathematicians,
More informationAn introduction to basic information theory. Hampus Wessman
An introduction to basic information theory Hampus Wessman Abstract We give a short and simple introduction to basic information theory, by stripping away all the non-essentials. Theoretical bounds on
More information(Classical) Information Theory III: Noisy channel coding
(Classical) Information Theory III: Noisy channel coding Sibasish Ghosh The Institute of Mathematical Sciences CIT Campus, Taramani, Chennai 600 113, India. p. 1 Abstract What is the best possible way
More informationOptimal matching in wireless sensor networks
Optimal matching in wireless sensor networks A. Roumy, D. Gesbert INRIA-IRISA, Rennes, France. Institute Eurecom, Sophia Antipolis, France. Abstract We investigate the design of a wireless sensor network
More informationInformation Theory and Statistics Lecture 2: Source coding
Information Theory and Statistics Lecture 2: Source coding Łukasz Dębowski ldebowsk@ipipan.waw.pl Ph. D. Programme 2013/2014 Injections and codes Definition (injection) Function f is called an injection
More informationOn the Cost of Worst-Case Coding Length Constraints
On the Cost of Worst-Case Coding Length Constraints Dror Baron and Andrew C. Singer Abstract We investigate the redundancy that arises from adding a worst-case length-constraint to uniquely decodable fixed
More informationInformation measures in simple coding problems
Part I Information measures in simple coding problems in this web service in this web service Source coding and hypothesis testing; information measures A(discrete)source is a sequence {X i } i= of random
More informationChapter 2: Source coding
Chapter 2: meghdadi@ensil.unilim.fr University of Limoges Chapter 2: Entropy of Markov Source Chapter 2: Entropy of Markov Source Markov model for information sources Given the present, the future is independent
More informationSecurity in Locally Repairable Storage
1 Security in Locally Repairable Storage Abhishek Agarwal and Arya Mazumdar Abstract In this paper we extend the notion of locally repairable codes to secret sharing schemes. The main problem we consider
More informationNew Generalized Entropy Measure and its Corresponding Code-word Length and Their Characterizations
New Generalized Entropy Measure and its Corresponding Code-word Length and Their Characterizations Ashiq Hussain Bhat 1, Dr. M. A. K. Baig 2, Dr. Muzafar Hussain Dar 3 1,2 Post Graduate Department of Statistics
More information4. Quantization and Data Compression. ECE 302 Spring 2012 Purdue University, School of ECE Prof. Ilya Pollak
4. Quantization and Data Compression ECE 32 Spring 22 Purdue University, School of ECE Prof. What is data compression? Reducing the file size without compromising the quality of the data stored in the
More informationAlgebraic Soft-Decision Decoding of Reed Solomon Codes
IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 49, NO. 11, NOVEMBER 2003 2809 Algebraic Soft-Decision Decoding of Reed Solomon Codes Ralf Koetter, Member, IEEE, Alexer Vardy, Fellow, IEEE Abstract A polynomial-time
More information