A Coding Theorem Connected on R-Norm Entropy
Int. J. Contemp. Math. Sciences, Vol. 6, 2011, no. 17, 825-831

Satish Kumar and Arun Choudhary
Department of Mathematics
Geeta Institute of Management & Technology
Kanipla, Kurukshetra, Haryana, India
arunchoudhary07@gmail.com

Abstract

A relation between Shannon entropy and Kerridge inaccuracy, known as the Shannon inequality, is well established in information theory. In this communication, we first generalize the Shannon inequality and then give its application in coding theory.

Mathematics Subject Classification: 94A15, 94A17, 94A24, 26D15

Keywords: Shannon inequality, Codeword length, Hölder's inequality, Kraft inequality, Optimal code length

1 Introduction

We consider the following set of positive real numbers: $\mathcal{R} = \{R : R > 0,\ R \neq 1\}$. Let
\[
\Delta_n = \Big\{P = (p_1, p_2, \ldots, p_n) :\ p_i \geq 0,\ \sum_{i=1}^{n} p_i = 1\Big\}, \qquad n \geq 2 .
\]
Boekee and Van der Lubbe [4] studied the R-norm entropy of the distribution $P$, given by
\[
H_R(P) = \frac{R}{R-1}\left[1-\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R}\right], \qquad R \in \mathcal{R}. \tag{1.1}
\]
The R-norm entropy (1.1) is a real function on $\Delta_n$, where $n \geq 2$. This measure is different from the entropies of Shannon [14], Rényi [13], Havrda and Charvát [8] and Daróczy [6]. The most interesting property of this measure is that as $R \to 1$ it approaches Shannon's [14] entropy, and as $R \to \infty$, $H_R(P) \to 1 - \max_i p_i$, $i = 1, 2, \ldots, n$. Setting $R = 1/r$ in (1.1), we get
\[
H_{1/r}(P) = \frac{1}{1-r}\left[1-\left(\sum_{i=1}^{n} p_i^{1/r}\right)^{r}\right], \qquad r > 0\ (\neq 1), \tag{1.2}
\]
which is a measure mentioned by Arimoto [1] as an example of a generalized class of information measures. It may be remarked that (1.2) also approaches Shannon's [14] entropy as $r \to 1$.
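As a quick numerical illustration (a sketch of ours, not part of the paper; the function name is our own), the following Python snippet evaluates (1.1) and checks the two limiting behaviours just mentioned. Note that the $R \to 1$ limit is Shannon's entropy measured in nats.

```python
import numpy as np

def r_norm_entropy(p, R):
    """R-norm entropy (1.1): H_R(P) = R/(R-1) * [1 - (sum_i p_i^R)^(1/R)]."""
    p = np.asarray(p, dtype=float)
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

p = [0.5, 0.3, 0.2]
shannon_nats = -np.sum(np.asarray(p) * np.log(p))   # Shannon entropy in nats
print(r_norm_entropy(p, 1.001), shannon_nats)       # nearly equal as R -> 1
print(r_norm_entropy(p, 1000.0), 1.0 - max(p))      # nearly equal as R -> infinity
```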
For $P \in \Delta_n$, Shannon's measure of information [14] is defined as
\[
H(P) = -\sum_{i=1}^{n} p_i \log_D p_i . \tag{1.3}
\]
The measure (1.3) has been generalized by various authors and has found applications in various disciplines such as economics, accounting, crime and physics. For $P, Q \in \Delta_n$, Kerridge [10] introduced a quantity known as inaccuracy, defined as
\[
H(P,Q) = -\sum_{i=1}^{n} p_i \log_D q_i . \tag{1.4}
\]
There is a well-known relation between $H(P)$ and $H(P,Q)$, given by
\[
H(P) \leq H(P,Q) . \tag{1.5}
\]
The relation (1.5) is known as the Shannon inequality, and its importance in coding theory is well known. In the literature of information theory there are many approaches to extending the relation (1.5) to other measures. Nath and Mittal [12] extended the relation (1.5) to the entropy of type $\beta$. Using the method of Nath and Mittal [12], Van der Lubbe [18] generalized (1.5) for Rényi's entropy. On the other hand, using the method of Campbell [5], Van der Lubbe [18] generalized (1.5) for the entropy of type $\beta$. Using these generalizations, coding theorems were proved by these authors for these measures. The objective of this communication is to generalize (1.5) for (1.1) and give its application in coding theory.

2 Generalization of Shannon Inequality

For $P, Q \in \Delta_n$, we define a measure of inaccuracy, denoted by $H(P,Q;R)$, as
\[
H(P,Q;R) = \frac{R}{R-1}\left[1-\sum_{i=1}^{n} p_i\, q_i^{(R-1)/R}\right], \qquad R > 0\ (\neq 1). \tag{2.1}
\]
Since $H(P,P;R) \neq H(P;R)$, we will not interpret (2.1) as a measure of inaccuracy. But $H(P,Q;R)$ is a generalization of the measure (1.1). In spite of the fact that $H(P,Q;R)$ is not a measure of inaccuracy in its usual sense, its study is justified because it leads to meaningful new measures of length.
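To make these objects concrete, here is a small sketch (ours, with illustrative distributions) that computes the Kerridge inaccuracy (1.4), checks the Shannon inequality (1.5), and evaluates the generalized measure (2.1) as reconstructed above.

```python
import numpy as np

def kerridge_inaccuracy(p, q, D=2):
    """Kerridge inaccuracy (1.4): H(P,Q) = -sum_i p_i log_D q_i."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return -np.sum(p * np.log(q)) / np.log(D)

def generalized_inaccuracy(p, q, R):
    """The measure (2.1): H(P,Q;R) = R/(R-1) * [1 - sum_i p_i q_i^((R-1)/R)]."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return R / (R - 1.0) * (1.0 - np.sum(p * q ** ((R - 1.0) / R)))

p, q = [0.5, 0.3, 0.2], [0.4, 0.4, 0.2]
# Shannon inequality (1.5): H(P) = H(P,P) <= H(P,Q), with equality iff Q = P.
print(kerridge_inaccuracy(p, p), "<=", kerridge_inaccuracy(p, q))
print(generalized_inaccuracy(p, q, R=2.0))
```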
In the following theorem we determine a relation between (1.1) and (2.1) of the type (1.5). Since (2.1) is not a measure of inaccuracy in its usual sense, we will call the generalized relation a pseudo-generalization of the Shannon inequality.

Theorem 1. If $P, Q \in \Delta_n$, then
\[
H(P;R) \leq H(P,Q;R) \tag{2.2}
\]
under the condition
\[
\sum_{i=1}^{n} q_i \leq \sum_{i=1}^{n} p_i , \tag{2.3}
\]
and equality holds if and only if $q_i = p_i^{R}\big/\sum_{j=1}^{n} p_j^{R}$, $i = 1, 2, \ldots, n$.

Proof: (a) If $0 < R < 1$: By Hölder's inequality [15],
\[
\left(\sum_{i=1}^{n} x_i^{p}\right)^{1/p}\left(\sum_{i=1}^{n} y_i^{q}\right)^{1/q} \leq \sum_{i=1}^{n} x_i y_i \tag{2.4}
\]
for all $x_i, y_i > 0$, $i = 1, 2, \ldots, n$, and $\tfrac{1}{p} + \tfrac{1}{q} = 1$, with $p < 1\ (\neq 0)$, $q < 0$, or $q < 1\ (\neq 0)$, $p < 0$. We see that equality holds if and only if there exists a positive constant $c$ such that $x_i^{p} = c\, y_i^{q}$. Making the substitutions
\[
p = \frac{R-1}{R}, \qquad q = 1-R, \qquad x_i = p_i^{R/(R-1)} q_i, \qquad y_i = p_i^{-R/(R-1)}
\]
in (2.4), we get
\[
\left(\sum_{i=1}^{n} p_i\, q_i^{(R-1)/R}\right)^{R/(R-1)}\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/(1-R)} \leq \sum_{i=1}^{n} q_i , \qquad R > 0\ (\neq 1).
\]
Using the condition (2.3) and $\sum_{i=1}^{n} p_i = 1$, we get
\[
\left(\sum_{i=1}^{n} p_i\, q_i^{(R-1)/R}\right)^{R/(R-1)} \leq \left(\sum_{i=1}^{n} p_i^{R}\right)^{1/(R-1)}, \qquad R > 0\ (\neq 1). \tag{2.5}
\]
Since $0 < R < 1$, raising both sides of (2.5) to the negative power $(R-1)/R$ reverses the inequality, and (2.5) becomes
\[
\sum_{i=1}^{n} p_i\, q_i^{(R-1)/R} \geq \left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R} . \tag{2.6}
\]
Using (2.6) and the fact that $0 < R < 1$, so that $\tfrac{R}{R-1} < 0$, we get (2.2).

(b) If $R > 1$, the proof follows along similar lines.
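The following sketch (ours) spot-checks Theorem 1 as reconstructed above: a random $P$, a sub-normalized $Q$ satisfying (2.3), and the equality case $q_i = p_i^R/\sum_j p_j^R$.

```python
import numpy as np

def h_R(p, R):
    """R-norm entropy (1.1)."""
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

def h_R_inacc(p, q, R):
    """Generalized measure (2.1)."""
    return R / (R - 1.0) * (1.0 - np.sum(p * q ** ((R - 1.0) / R)))

rng = np.random.default_rng(0)
for R in (0.5, 2.0, 5.0):
    p = rng.dirichlet(np.ones(4))
    q = rng.dirichlet(np.ones(4)) * rng.uniform(0.5, 1.0)  # sum(q) <= sum(p) = 1: condition (2.3)
    q_star = p ** R / np.sum(p ** R)                       # equality case of Theorem 1
    assert h_R(p, R) <= h_R_inacc(p, q, R) + 1e-12         # inequality (2.2)
    assert abs(h_R(p, R) - h_R_inacc(p, q_star, R)) < 1e-9
```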
3 Application in Coding Theory

We will now give an application of Theorem 1 in coding theory. Let a finite set of $n$ input symbols with probabilities $p_1, p_2, \ldots, p_n$ be encoded in terms of symbols taken from the alphabet $\{a_1, a_2, \ldots, a_D\}$. Then it is known (Feinstein [7]) that there always exists a uniquely decipherable code with lengths $N_1, N_2, \ldots, N_n$ iff
\[
\sum_{i=1}^{n} D^{-N_i} \leq 1 . \tag{2.7}
\]
If $L = \sum_{i=1}^{n} p_i N_i$ is the average codeword length, then for a code satisfying (2.7) it has been shown (Feinstein [7]) that
\[
L \geq H(P), \tag{2.8}
\]
with equality iff $N_i = -\log_D p_i$, $i = 1, 2, \ldots, n$, and that by suitably encoding long sequences of symbols the average length can be made arbitrarily close to $H(P)$. This is Shannon's noiseless coding theorem.

By considering Rényi's [13] entropy, a coding theorem analogous to the above noiseless coding theorem has been established by Campbell [5], who obtained bounds in terms of
\[
H_\alpha(P) = \frac{1}{1-\alpha}\log_D\left(\sum_{i=1}^{n} p_i^{\alpha}\right), \qquad \alpha \neq 1,\ \alpha > 0 .
\]
Kieffer [11] defined a class of decision rules and showed that $H_\alpha(P)$ gives the best decision rule for deciding which of two sources can be coded with smaller expected cost for sequences of length $n$ as $n \to \infty$, where the cost of encoding a sequence is assumed to be a function of its length only. Further, Jelinek [9] showed that coding with respect to Campbell's [5] mean length is useful in minimizing the problem of buffer overflow, which occurs when the source symbols are produced at a fixed rate and the codewords are stored temporarily in a finite buffer. Further, Boekee and Van der Lubbe [4] and Van der Lubbe [17] defined the mean codeword length
\[
L_R(P) = \frac{R}{R-1}\left[1-\left(\sum_{i=1}^{n} p_i\, D^{N_i(1-R)}\right)^{1/R}\right], \qquad R > 0\ (\neq 1), \tag{2.9}
\]
together with a second mean codeword length (2.10) of a related normalized form, and proved that
\[
H_R(P) \leq L_R(P) < H_R(P) + 1
\]
under the condition (2.7).
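For concreteness, here is a short sketch (ours) of the classical objects of this section: Shannon-type lengths $N_i = \lceil -\log_D p_i \rceil$ satisfy the Kraft inequality (2.7) and the bound (2.8), together with $L < H(P) + 1$.

```python
import math

p, D = [0.5, 0.3, 0.2], 2
N = [math.ceil(-math.log(pi, D)) for pi in p]   # Shannon-type codeword lengths
kraft = sum(D ** -n for n in N)                 # (2.7): uniquely decipherable iff <= 1
L = sum(pi * n for pi, n in zip(p, N))          # average codeword length L = sum p_i N_i
H = -sum(pi * math.log(pi, D) for pi in p)      # Shannon entropy (1.3), base D
print(kraft <= 1.0, H <= L < H + 1.0)           # True True
```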
It may be seen that the mean codeword length $L = \sum_{i=1}^{n} p_i N_i$ has been generalized parametrically and its bounds studied in terms of generalized measures of entropy. We define the measure of length $L(R)$ by
\[
L(R) = \frac{R}{R-1}\left[1-\sum_{i=1}^{n} p_i\, D^{-N_i (R-1)/R}\right], \qquad R > 0\ (\neq 1). \tag{2.11}
\]
Also, we have used the condition
\[
\sum_{i=1}^{n} D^{-N_i} \leq \sum_{i=1}^{n} p_i \tag{2.12}
\]
to find the bounds. Since $\sum_{i=1}^{n} p_i = 1$, (2.12) reduces to the Kraft inequality (2.7).

Theorem 2. If $N_i$, $i = 1, 2, \ldots, n$, are the lengths of codewords satisfying (2.12), then
\[
H(P;R) \leq L(R) < D^{(1-R)/R}\, H(P;R) + \frac{R}{R-1}\left(1 - D^{(1-R)/R}\right). \tag{2.13}
\]

Proof: In (2.2) choose $Q = (q_1, q_2, \ldots, q_n)$, where
\[
q_i = D^{-N_i} . \tag{2.14}
\]
With this choice of $Q$, (2.2) becomes
\[
H(P;R) \leq \frac{R}{R-1}\left[1-\sum_{i=1}^{n} p_i\, D^{-N_i (R-1)/R}\right],
\]
i.e., $H(P;R) \leq L(R)$, which proves the first part of (2.13). The equality holds iff
\[
D^{-N_i} = \frac{p_i^{R}}{\sum_{j=1}^{n} p_j^{R}}, \qquad i = 1, 2, \ldots, n,
\]
which is equivalent to
\[
N_i = -\log_D \frac{p_i^{R}}{\sum_{j=1}^{n} p_j^{R}}, \qquad i = 1, 2, \ldots, n. \tag{2.15}
\]
Choose all $N_i$ such that
\[
-\log_D \frac{p_i^{R}}{\sum_{j=1}^{n} p_j^{R}} \leq N_i < -\log_D \frac{p_i^{R}}{\sum_{j=1}^{n} p_j^{R}} + 1 .
\]
Using the above relation, it follows that
\[
D^{-N_i} > \frac{p_i^{R}}{\sum_{j=1}^{n} p_j^{R}}\, D^{-1} . \tag{2.16}
\]
We now have two possibilities:

(1) If $R > 1$, (2.16) gives us
\[
\sum_{i=1}^{n} p_i\, D^{-N_i (R-1)/R} > D^{(1-R)/R} \sum_{i=1}^{n} p_i \left(\frac{p_i^{R}}{\sum_{j=1}^{n} p_j^{R}}\right)^{(R-1)/R} = D^{(1-R)/R}\left(\sum_{i=1}^{n} p_i^{R}\right)^{1/R}. \tag{2.17}
\]
Using (2.17) and the fact that $R > 1$, we get the right-hand side of (2.13).

(2) If $0 < R < 1$, the proof follows along the same lines.
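A numerical spot-check (our sketch) of Theorem 2 as reconstructed: codeword lengths are chosen as in the proof, $N_i = \lceil -\log_D (p_i^R/\sum_j p_j^R) \rceil$, and the bounds (2.13) are verified for several values of $R$.

```python
import numpy as np

def h_R(p, R):
    """R-norm entropy (1.1)."""
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

def length_R(p, N, R, D):
    """Measure of length (2.11)."""
    return R / (R - 1.0) * (1.0 - np.sum(p * float(D) ** (-N * (R - 1.0) / R)))

D, p = 2, np.array([0.5, 0.3, 0.2])
for R in (0.5, 2.0, 5.0):
    q_star = p ** R / np.sum(p ** R)
    N = np.ceil(-np.log(q_star) / np.log(D))     # lengths chosen as in the proof of Theorem 2
    assert np.sum(float(D) ** -N) <= 1.0         # condition (2.12) (Kraft, since sum p_i = 1)
    upper = D ** ((1 - R) / R) * h_R(p, R) + R / (R - 1.0) * (1 - D ** ((1 - R) / R))
    assert h_R(p, R) <= length_R(p, N, R, D) < upper
print("Theorem 2 bounds verified")
```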
Particular Case: If $R \to 1$, then (2.13) becomes
\[
H(P)\,\log D \;\leq\; L\,\log D \;<\; \big(H(P) + 1\big)\log D ,
\]
i.e., $H(P) \leq L < H(P) + 1$, which is Shannon's [14] classical noiseless coding theorem.

4 Conclusion

We know that an optimal code is one for which the value of $L(R)$ equals its lower bound. From the result of Theorem 2 it can be seen that the mean codeword length of the optimal code depends on the parameter $R$, while in the case of Shannon's theorem it does not depend on any parameter. The mean codeword length can therefore be reduced significantly by taking suitable values of the parameter $R$.
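The dependence on $R$ noted in the conclusion is easy to see numerically. In this sketch (ours, for one illustrative distribution) the optimal value of $L(R)$, i.e. its lower bound $H(P;R)$, shrinks as $R$ grows.

```python
import numpy as np

def h_R(p, R):
    """R-norm entropy (1.1), the lower bound attained by an optimal code."""
    return R / (R - 1.0) * (1.0 - np.sum(p ** R) ** (1.0 / R))

p = np.array([0.5, 0.3, 0.2])
for R in (0.25, 0.5, 2.0, 4.0, 8.0):
    print(R, round(h_R(p, R), 4))   # decreases monotonically in R for this example
```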
References

[1] S. Arimoto, Information-theoretical considerations on estimation problems, Information and Control, 19 (1971), 181-194.

[2] C. Arndt, Information Measures: Information and its Description in Science and Engineering, Springer, Berlin, 2001.

[3] M.A.K. Baig and Rayees Ahmad Dar, Coding theorems on a generalized information measures, J. KSIAM, 11(2) (2007), 3-8.

[4] D.E. Boekee and J.C.A. Van der Lubbe, The R-norm information measure, Information and Control, 45 (1980), 136-155.

[5] L.L. Campbell, A coding theorem and Rényi's entropy, Information and Control, 8 (1965), 423-429.

[6] Z. Daróczy, Generalized information functions, Information and Control, 16 (1970), 36-51.

[7] A. Feinstein, Foundations of Information Theory, McGraw-Hill, New York, 1958.

[8] J.F. Havrda and F. Charvát, Quantification method of classification processes, the concept of structural α-entropy, Kybernetika, 3 (1967), 30-35.

[9] F. Jelinek, Buffer overflow in variable length coding of fixed rate sources, IEEE Transactions on Information Theory, IT-14 (1968), 490-501.

[10] D.F. Kerridge, Inaccuracy and inference, J. Roy. Statist. Soc. Ser. B, 23 (1961), 184-194.

[11] J.C. Kieffer, Variable length source coding with a cost depending only on the codeword length, Information and Control, 41 (1979), 136-146.

[12] P. Nath and D.P. Mittal, A generalization of Shannon's inequality and its application in coding theory, Information and Control, 23 (1973), 439-445.

[13] A. Rényi, On measures of entropy and information, Proc. 4th Berkeley Symp. Math. Statist. Prob., 1 (1961), 547-561.

[14] C.E. Shannon, A mathematical theory of communication, Bell System Tech. J., 27 (1948), 379-423, 623-656.

[15] O. Shisha, Inequalities, Academic Press, New York, 1967.

[16] R.P. Singh, R. Kumar and R.K. Tuteja, Application of Hölder's inequality in information theory, Information Sciences, 152 (2003), 145-154.

[17] J.C.A. Van der Lubbe, A Generalized Probabilistic Theory of the Measurement of Certainty and Information, Delft University Press, Delft, 1981.

[18] J.C.A. Van der Lubbe, On certain coding theorems for the information of order α and of type β, In: Trans. Eighth Prague Conf. on Inform. Theory, Statist. Decision Functions, Random Processes, Vol. C, Academia, Prague, 1978.

Received: October, 2010