SOME PROPERTIES OF ENTROPY OF ORDER α AND TYPE β

SOME PROPERTIES OF ENTROPY OF ORDER α AND TYPE β

BY J. N. KAPUR, F.A.Sc.
(Indian Institute of Technology, Kanpur)

Received March 10, 1967

ABSTRACT

In a recent paper [3] we defined entropy of order α and type β. For β = 1, it reduces to Rényi's [2] entropy of order α, while for α = 1, β = 1 and for a complete probability distribution, it reduces to Shannon's definition of entropy. In this paper we study some properties of this generalised entropy. We also discuss the variation of Rényi's entropy in the continuous case with co-ordinate systems, and the invariance of transinformation of order α and type β for linear transformations.

1. DEFINITIONS

Let P = (p_1, p_2, ..., p_N) be a generalised probability distribution, so that

    p_i \ge 0  (i = 1, 2, ..., N);   0 < \sum_{i=1}^{N} p_i \le 1.    (1)

(i) Shannon's [1] entropy is defined as

    H(P) = - \sum_{i=1}^{N} p_i \log_2 p_i;   \sum_{i=1}^{N} p_i = 1.    (2)

(ii) Rényi's [2] entropy of order α is defined as

    H_\alpha(P) = \frac{1}{1-\alpha} \log_2 \frac{\sum_{i=1}^{N} p_i^\alpha}{\sum_{i=1}^{N} p_i};   0 < \sum p_i \le 1;   \alpha > 0,  \alpha \ne 1.    (3)
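As a modern illustration (not part of the original paper; the function names are my own), the discrete definitions (2)-(4) can be computed directly. The sketch below implements Shannon's entropy and Rényi's entropy of order α for a generalised (possibly incomplete) distribution, and shows numerically that H_α approaches the order-1 value of eq. (4) as α → 1.

```python
import math

def shannon(p):
    """Shannon's entropy in bits, eq. (2); requires a complete distribution."""
    assert abs(sum(p) - 1.0) < 1e-9
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    """Renyi's entropy of order alpha in bits, eq. (3), for a
    generalised distribution with 0 < sum(p) <= 1."""
    assert alpha > 0 and alpha != 1
    return math.log2(sum(pi ** alpha for pi in p) / sum(p)) / (1 - alpha)

def renyi1(p):
    """The limiting case alpha -> 1, eq. (4)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0) / sum(p)

P = [0.2, 0.3, 0.1]        # incomplete distribution, weight W(P) = 0.6
print(shannon([0.5, 0.5])) # 1.0 bit
print(renyi(P, 2.0))
print(renyi(P, 1.000001))  # approaches ...
print(renyi1(P))           # ... the order-1 value of eq. (4)
```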

(iii) Rényi's [2] entropy of order 1 is defined as the limit of H_\alpha(P) as α → 1, so that

    H_1(P) = - \frac{\sum_{i=1}^{N} p_i \log_2 p_i}{\sum_{i=1}^{N} p_i};   0 < \sum p_i \le 1.    (4)

(iv) Our [3] entropy of order α and type β is given by

    H_\alpha^\beta(P) = \frac{1}{1-\alpha} \log_2 \frac{\sum_{i=1}^{N} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{N} p_i^\beta};   0 < \sum p_i \le 1,  \alpha \ne 1,  \beta > 0,  \alpha + \beta > 1.    (5)

(v) Our [3] entropy of order 1 and type β is obtained as a limiting case of H_\alpha^\beta(P) as α → 1, so that

    H_1^\beta(P) = - \frac{\sum_{i=1}^{N} p_i^\beta \log_2 p_i}{\sum_{i=1}^{N} p_i^\beta}.    (6)

The above entropies are defined for the discrete case. For the continuous case:

(vi) Shannon's [1] entropy is given by

    H(f) = - \int_a^b f(x) \log f(x)\, dx;   \int_a^b f(x)\, dx = 1,    (7)

where f(x) is the density function of the random variate X, which can lie between a and b.

(vii) Rényi's [4] entropy of order α is given by

    H_\alpha(f) = \frac{1}{1-\alpha} \log \frac{\int_a^b (f(x))^\alpha\, dx}{\int_a^b f(x)\, dx};   0 < \int_a^b f(x)\, dx \le 1;   \alpha > 0,  \alpha \ne 1.    (8)
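A hedged sketch of definitions (5) and (6) in Python (illustrative code, not from the paper): for β = 1 the order-α, type-β entropy reduces to Rényi's H_α, and as α → 1 it tends to the type-β limit of eq. (6).

```python
import math

def kapur(p, alpha, beta):
    """Entropy of order alpha and type beta in bits, eq. (5)."""
    assert alpha != 1 and beta > 0 and alpha + beta > 1
    num = sum(pi ** (alpha + beta - 1) for pi in p)
    den = sum(pi ** beta for pi in p)
    return math.log2(num / den) / (1 - alpha)

def kapur1(p, beta):
    """The limiting case alpha -> 1, eq. (6)."""
    den = sum(pi ** beta for pi in p)
    return -sum(pi ** beta * math.log2(pi) for pi in p if pi > 0) / den

def renyi(p, alpha):
    """Renyi's entropy of order alpha, eq. (3)."""
    return math.log2(sum(pi ** alpha for pi in p) / sum(p)) / (1 - alpha)

P = [0.2, 0.3, 0.1]
print(kapur(P, 2.0, 1.0), renyi(P, 2.0))  # beta = 1 recovers Renyi's entropy
print(kapur(P, 1.000001, 1.5))            # approaches ...
print(kapur1(P, 1.5))                     # ... the type-beta limit, eq. (6)
```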

(viii) Rényi's [4] entropy of order 1 is given by

    H_1(f) = - \frac{\int_a^b f(x) \log f(x)\, dx}{\int_a^b f(x)\, dx};   0 < \int_a^b f(x)\, dx \le 1.    (9)

(ix) Our [4] entropy of order α and type β is given by

    H_\alpha^\beta(f) = \frac{1}{1-\alpha} \log \frac{\int_a^b (f(x))^{\alpha+\beta-1}\, dx}{\int_a^b (f(x))^\beta\, dx};   \alpha \ne 1;   \beta > 0;   \alpha + \beta > 1.    (10)

(x) Our [4] entropy of order 1 and type β is given by

    H_1^\beta(f) = - \frac{\int_a^b (f(x))^\beta \log f(x)\, dx}{\int_a^b (f(x))^\beta\, dx};   0 < \int_a^b f(x)\, dx \le 1.    (11)

The properties of Shannon's entropy are well known. It is proposed to give below similar properties of Rényi's and our entropies.

2. PROPERTIES OF RÉNYI'S ENTROPY

(i) H_\alpha(p_1, p_2, ..., p_N) is a symmetric function of its arguments.

(ii) H_\alpha(p_1, p_2, ..., p_N) is a continuous function of its arguments.

(iii) H_\alpha(P*Q) = H_\alpha(P) + H_\alpha(Q),    (12)

where P and Q are independent distributions.

(iv) g(H_\alpha(P \cup Q)) = \frac{W(P)\, g(H_\alpha(P)) + W(Q)\, g(H_\alpha(Q))}{W(P) + W(Q)},    (13)

where

    P \cup Q = (p_1, p_2, ..., p_N; q_1, q_2, ..., q_M),    (14)

    W(P) = \sum_{i=1}^{N} p_i;   W(Q) = \sum_{j=1}^{M} q_j,    (15a)
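Property (iii), the additivity of eq. (12), can be verified numerically: if P*Q denotes the joint distribution of two independent distributions, with probabilities p_i q_j, the order-α entropies add exactly. A small check (illustrative code, not from the paper):

```python
import math

def renyi(p, alpha):
    """Renyi's entropy of order alpha, eq. (3)."""
    return math.log2(sum(pi ** alpha for pi in p) / sum(p)) / (1 - alpha)

P = [0.2, 0.3, 0.1]
Q = [0.5, 0.4]
PQ = [pi * qj for pi in P for qj in Q]   # the direct product P*Q
a = 2.5
print(renyi(PQ, a))
print(renyi(P, a) + renyi(Q, a))         # the same value, eq. (12)
```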

    0 < W(P) \le 1;   0 < W(Q) \le 1;   0 < W(P) + W(Q) \le 1,    (15b)

    g(x) = 2^{(1-\alpha)x}.    (16)

(v) H_\alpha(\{1/2\}) = 1,    (17)

where {1/2} refers to the case when N = 1 and the corresponding probability is 1/2.

These five were used by Rényi as his postulates. We give below some other properties:

(vi) H_\alpha(p, p, ..., p) = - \log_2 p,    (18)

so that in the case of equal probabilities, H_\alpha is independent of α.

(vii) For a generalised probability distribution with weight K, and N probabilities each equal to K/N, so that all the probabilities are equal,

    H_\alpha(K/N, K/N, ..., K/N) = - \log_2 (K/N) = \log_2 (N/K).    (19)

In this case H_\alpha(P) is a monotonic increasing function of N.

(viii) If we denote this function by F_\alpha(N), then

    \lim_{N \to \infty} F_\alpha(N) = \infty,   \lim_{N \to \infty} (F_\alpha(N) - F_\alpha(N-1)) = 0.    (20)

(ix) H_\alpha(P) = 0 \;\Rightarrow\; \frac{1}{1-\alpha} \log_2 \frac{\sum_k p_k^\alpha}{\sum_k p_k} = 0 \;\Rightarrow\; \sum_k p_k^\alpha = \sum_k p_k \;\Rightarrow\; \sum_k p_k (1 - p_k^{\alpha-1}) = 0.    (21)

If α > 1, all terms are \ge 0; if α < 1, all terms are \le 0. In both cases H_\alpha(P) = 0 implies that there is exactly one p_i equal to unity and the rest are all zero.
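The mean-value postulate (iv) with g(x) = 2^{(1-α)x} from eq. (16) can also be checked directly on the union distribution of eq. (14). In the sketch below (illustrative code, not from the paper), the two sides of eq. (13) agree:

```python
import math

def renyi(p, alpha):
    """Renyi's entropy of order alpha, eq. (3)."""
    return math.log2(sum(pi ** alpha for pi in p) / sum(p)) / (1 - alpha)

def g(x, alpha):                      # eq. (16)
    return 2 ** ((1 - alpha) * x)

P = [0.2, 0.1]                        # W(P) = 0.3
Q = [0.3, 0.2]                        # W(Q) = 0.5, so W(P) + W(Q) <= 1
a = 3.0
lhs = g(renyi(P + Q, a), a)           # P u Q is the concatenation, eq. (14)
rhs = ((sum(P) * g(renyi(P, a), a) + sum(Q) * g(renyi(Q, a), a))
       / (sum(P) + sum(Q)))
print(lhs, rhs)                       # the two sides agree, eq. (13)
```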

Thus entropy of order α cannot vanish for an incomplete probability distribution. For a complete probability distribution, it can vanish if and only if one of the probabilities is unity and the rest are zero.

(x) H_\alpha(p_1, p_2, ..., p_N, 0) = H_\alpha(p_1, p_2, ..., p_N),    (22)

so that entropy of order α is not changed by adding an impossible event.

(xi) H_\alpha(P) is a monotonic decreasing function [3] of α. Its maximum value is \log_2 (N/W(P)) and its minimum value is - \log_2 p_M, where p_M is the maximum probability. This shows that H_\alpha(P) \ge 0.

(xii) Let p_i < p_j, let p_i change to (p_i + x) and p_j change to (p_j - x),

    [0 \le p_i + x \le 1,   0 \le p_j - x \le 1,   x > 0],

and let

    f(x) = (p_i + x)^\alpha + (p_j - x)^\alpha - p_i^\alpha - p_j^\alpha.

Then

    f(0) = 0,   f(p_j - p_i) = 0,

    f'(x) = \alpha (p_i + x)^{\alpha-1} - \alpha (p_j - x)^{\alpha-1},

    f'(0) = \alpha (p_i^{\alpha-1} - p_j^{\alpha-1}),   f'(p_j - p_i) = \alpha (p_j^{\alpha-1} - p_i^{\alpha-1}).

If α > 1, then f'(0) < 0 and f'(p_j - p_i) > 0; if α < 1, then f'(0) > 0 and f'(p_j - p_i) < 0.

In the first case f(x) is negative for 0 < x < (p_j - p_i); in the second case f(x) is positive for 0 < x < (p_j - p_i). Now

    \phi(x) = \frac{1}{1-\alpha} \log_2 \frac{(p_i + x)^\alpha + (p_j - x)^\alpha + \sum_{k \ne i,j} p_k^\alpha}{\sum_k p_k} = \frac{1}{1-\alpha} \log_2 \frac{f(x) + \sum_k p_k^\alpha}{\sum_k p_k}.    (23)
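Property (xi) can be observed numerically. In the sketch below (illustrative code, not from the paper), H_α(P) decreases as α grows, staying between the stated extremes \log_2(N/W(P)) and -\log_2 p_M:

```python
import math

def renyi(p, alpha):
    """Renyi's entropy of order alpha, eq. (3)."""
    return math.log2(sum(pi ** alpha for pi in p) / sum(p)) / (1 - alpha)

P = [0.2, 0.3, 0.1]
vals = [renyi(P, a) for a in (0.01, 0.5, 2.0, 50.0)]
upper = math.log2(len(P) / sum(P))    # supremum, reached as alpha -> 0
lower = -math.log2(max(P))            # infimum, reached as alpha -> infinity
print(vals)                           # decreasing in alpha
print(upper, lower)
```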

Then

    \phi(x) - \phi(0) = \frac{1}{1-\alpha} \log_2 \frac{f(x) + \sum_{k=1}^{N} p_k^\alpha}{\sum_{k=1}^{N} p_k^\alpha}.    (24)

If α > 1, then f(x) < 0 and 1 - α < 0, so \phi(x) - \phi(0) > 0. If α < 1, then f(x) > 0 and 1 - α > 0, so \phi(x) - \phi(0) > 0.

Thus in both cases we find that replacing p_i, p_j by p_i + x, p_j - x increases the entropy. Thus the nearer the probabilities come to one another, the greater the entropy becomes; of course, for a fixed W(P).

(xiii) For a given W(P), the entropy is maximum when the probabilities are equal.

(xiv) The maximum entropy is a non-increasing function of W(P).

3. PROPERTIES OF RÉNYI'S ENTROPY OF ORDER UNITY

Most of the above properties are true for all α and they remain true in the limit when α → 1. Thus (i), (iii), (v), (vi), (vii), (viii), (ix), (x), (xiii) and (xiv) are true. (ii) is true when the arguments are > 0. (iv) is true when g(x) = ax + b, a > 0. To see the truth of (xii), we consider

    f(x) = (p_i + x) \log (p_i + x) + (p_j - x) \log (p_j - x) - p_i \log p_i - p_j \log p_j

and proceed as before. In addition the following properties are also true:

(xv) H_1(p_1, p_2, ..., p_{N-1}, q_1, q_2, ..., q_m) = H_1(p_1, p_2, ..., p_N) + \frac{p_N}{W(P)} H_1\!\left(\frac{q_1}{p_N}, \frac{q_2}{p_N}, ..., \frac{q_m}{p_N}\right),    (25)

where

    q_1 + q_2 + ... + q_m = p_N.
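Property (xii) can be illustrated numerically for both α < 1 and α > 1: transferring an amount x from the larger probability to the smaller one raises the entropy (a small demonstration, not from the paper):

```python
import math

def renyi(p, alpha):
    """Renyi's entropy of order alpha, eq. (3)."""
    return math.log2(sum(pi ** alpha for pi in p) / sum(p)) / (1 - alpha)

p_i, p_j, rest = 0.1, 0.5, [0.2]       # p_i < p_j, fixed weight W(P) = 0.8
x = 0.1                                # transfer x from p_j to p_i
for a in (0.5, 3.0):                   # one case below 1, one above
    before = renyi([p_i, p_j] + rest, a)
    after = renyi([p_i + x, p_j - x] + rest, a)
    print(a, after > before)           # entropy increases in both cases
```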

The proof is straightforward:

    H_1(p_1, p_2, ..., p_{N-1}, q_1, q_2, ..., q_m)
        = - \frac{\sum_{i=1}^{N-1} p_i \log_2 p_i + \sum_{j=1}^{m} q_j \log_2 q_j}{\sum_{i=1}^{N-1} p_i + \sum_{j=1}^{m} q_j}
        = - \frac{\sum_{i=1}^{N-1} p_i \log_2 p_i + p_N \log_2 p_N + \sum_{j=1}^{m} q_j \log_2 (q_j / p_N)}{\sum_{i=1}^{N} p_i}
        = H_1(p_1, p_2, ..., p_N) + \frac{p_N}{W(P)} H_1\!\left(\frac{q_1}{p_N}, \frac{q_2}{p_N}, ..., \frac{q_m}{p_N}\right).

(xvi) H_1(q_{11}, q_{12}, ..., q_{1m_1}; q_{21}, q_{22}, ..., q_{2m_2}; ...; q_{N1}, q_{N2}, ..., q_{Nm_N})
        = H_1(p_1, p_2, ..., p_N) + \sum_{i=1}^{N} \frac{p_i}{W(P)} H_1\!\left(\frac{q_{i1}}{p_i}, \frac{q_{i2}}{p_i}, ..., \frac{q_{im_i}}{p_i}\right),    (26)

where

    q_{i1} + q_{i2} + ... + q_{im_i} = p_i.

This is a straight generalisation of the last result.

(xvii) H_1(p_1, p_2, ..., p_N) = H_1(p_1 + p_2 + ... + p_N) + H_1\!\left(\frac{p_1}{W(P)}, \frac{p_2}{W(P)}, ..., \frac{p_N}{W(P)}\right).    (27)

(xviii) H_1(p_1, p_2, ..., p_r; p_{r+1}, ..., p_N)
        = H_1(p_1, p_2, ..., p_r, p_{r+1} + p_{r+2} + ... + p_N)
        + \frac{p_{r+1} + p_{r+2} + ... + p_N}{W(P)} H_1\!\left(\frac{p_{r+1}}{p_{r+1} + ... + p_N}, ..., \frac{p_N}{p_{r+1} + ... + p_N}\right).    (28)
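The branching property (xv) is easy to confirm numerically by splitting the last probability p_N into parts q_j. A minimal sketch (illustrative code, not from the paper):

```python
import math

def H1(p):
    """Renyi's entropy of order 1 in bits, eq. (4)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0) / sum(p)

P = [0.2, 0.3, 0.4]                    # p_N = 0.4, W(P) = 0.9
q = [0.1, 0.3]                         # q_1 + q_2 = p_N
lhs = H1(P[:-1] + q)                   # distribution with p_N split up
rhs = H1(P) + (P[-1] / sum(P)) * H1([qi / P[-1] for qi in q])
print(lhs, rhs)                        # equal, eq. (25)
```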

4. PROPERTIES OF OUR ENTROPY OF ORDER α AND TYPE β

Most of the properties of H_\alpha(P) = H_\alpha^1(P) continue to hold for H_\alpha^\beta(P). Thus (i), (ii) and (iii) are easily seen to be true. In (iv), W(P) and W(Q) have to be replaced respectively by

    W_\beta(P) = p_1^\beta + p_2^\beta + ... + p_N^\beta;   W_\beta(Q) = q_1^\beta + q_2^\beta + ... + q_M^\beta.    (29)

(v) is true. (vi) is true, and H_\alpha^\beta(p, p, ..., p) is independent of both α and β.

(vii) For a generalised probability distribution with given weight K,

    H_\alpha^\beta(K/N, K/N, ..., K/N) = - \log_2 (K/N) = \log_2 (N/K).    (30)

This is independent of α and β and is a monotonic increasing function of N.

(viii), (x), (xiii) and (xv) are true.

(ix) H_\alpha^\beta(P) = 0 \;\Rightarrow\; \sum_i p_i^\beta (p_i^{\alpha-1} - 1) = 0.

This implies that exactly one probability is unity and the rest are all zero.

(xi) H_\alpha^\beta(P) is a monotonic decreasing function [3] of α for a fixed β.

5. LACK OF INVARIANCE OF RÉNYI'S ENTROPY IN THE CONTINUOUS CASE WITH CO-ORDINATE SYSTEMS

Let X be transformed into a new variable Y by a continuous one-to-one transformation. If the density functions of the two variates are f(x) and p(y), then

    p(y) = f(x) \left| \frac{dx}{dy} \right|,    (31)

    H_\alpha(p) = \frac{1}{1-\alpha} \log \frac{\int (p(y))^\alpha\, dy}{\int p(y)\, dy}
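The type-β analogue of the mean-value property, with W replaced by W_β as in eq. (29), can be checked the same way as before. In the sketch below (illustrative code, not from the paper), the two sides agree for the concatenated distribution:

```python
import math

def kapur(p, alpha, beta):
    """Entropy of order alpha and type beta, eq. (5)."""
    return math.log2(sum(pi ** (alpha + beta - 1) for pi in p)
                     / sum(pi ** beta for pi in p)) / (1 - alpha)

def g(x, alpha):                       # eq. (16)
    return 2 ** ((1 - alpha) * x)

def W(p, beta):                        # the replacement weight, eq. (29)
    return sum(pi ** beta for pi in p)

P, Q = [0.2, 0.1], [0.3, 0.2]
a, b = 2.5, 1.5
lhs = g(kapur(P + Q, a, b), a)
rhs = ((W(P, b) * g(kapur(P, a, b), a) + W(Q, b) * g(kapur(Q, a, b), a))
       / (W(P, b) + W(Q, b)))
print(lhs, rhs)                        # the two sides agree
```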

        = \frac{1}{1-\alpha} \log \frac{\int_{-\infty}^{\infty} (f(x))^\alpha \left| \frac{dx}{dy} \right|^{\alpha-1} dx}{\int_{-\infty}^{\infty} f(x)\, dx}.    (32)

The entropy therefore changes. If however we consider a linear transformation

    Y = AX + B,    (33)

we get

    H_\alpha(p) = H_\alpha(f) + \log |A|,    (34)

so that the change is independent of α. Also the difference of two entropies does not change, so that

    H_\alpha(p_1) - H_\alpha(p_2) = H_\alpha(f_1) - H_\alpha(f_2).    (35)

For the case of multivariate distributions,

    H_\alpha(p) = \frac{1}{1-\alpha} \log \frac{\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} (f(x_1, x_2, ..., x_N))^\alpha \left| \frac{\partial(x_1, x_2, ..., x_N)}{\partial(y_1, y_2, ..., y_N)} \right|^{\alpha-1} dx_1\, dx_2 \cdots dx_N}{\int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} f(x_1, x_2, ..., x_N)\, dx_1\, dx_2 \cdots dx_N},    (36)

where p(y_1, y_2, ..., y_N) and f(x_1, x_2, ..., x_N) are the density functions. For an orthogonal transformation, the absolute value of the Jacobian has the value unity, and as such the entropy is invariant for translations and orthogonal transformations.

The transinformation

    I(X, Y) = H(X) + H(Y) - H(XY)    (37)
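Equation (34) can be verified with simple Riemann sums. In the sketch below (illustrative code, not from the paper; logarithms taken to base 2), a triangular density on [0, 2] is pushed through Y = 3X + 1, and the two continuous Rényi entropies differ by log|A|:

```python
import math

def renyi_cont(density, lo, hi, alpha, n=20000):
    """Midpoint Riemann-sum version of eq. (8), using log base 2."""
    dx = (hi - lo) / n
    xs = [lo + (k + 0.5) * dx for k in range(n)]
    num = sum(density(x) ** alpha for x in xs) * dx
    den = sum(density(x) for x in xs) * dx
    return math.log2(num / den) / (1 - alpha)

f = lambda x: x / 2.0                  # triangular density on [0, 2]
A, B = 3.0, 1.0                        # the linear map Y = AX + B
p = lambda y: f((y - B) / A) / A       # density of Y, eq. (31)

a = 2.0
hf = renyi_cont(f, 0.0, 2.0, a)
hp = renyi_cont(p, B, 2 * A + B, a)
print(hp - hf, math.log2(A))           # equal up to the grid error, eq. (34)
```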

is also invariant for linear transformations, for if

    Z = AX + B,   W = CY + D,    (38)

then

    I(Z, W) = H(Z) + H(W) - H(ZW)
            = H(X) + \log |A| + H(Y) + \log |C| - H(XY) - \log |AC|
            = H(X) + H(Y) - H(XY) = I(X, Y).    (39)

6. INVARIANCE OF OUR TRANSINFORMATION FOR LINEAR TRANSFORMATIONS

As above,

    H_\alpha^\beta(p) = \frac{1}{1-\alpha} \log \frac{\int_{-\infty}^{\infty} (p(y))^{\alpha+\beta-1}\, dy}{\int_{-\infty}^{\infty} (p(y))^\beta\, dy}
                      = \frac{1}{1-\alpha} \log \frac{\int_{-\infty}^{\infty} (f(x))^{\alpha+\beta-1} \left| \frac{dx}{dy} \right|^{\alpha+\beta-2} dx}{\int_{-\infty}^{\infty} (f(x))^\beta \left| \frac{dx}{dy} \right|^{\beta-1} dx}
                      = H_\alpha^\beta(f) + \log |A|,    (40)

so that the change is independent of both α and β. Also

    H_\alpha^\beta(p_2) - H_\alpha^\beta(p_1) = H_\alpha^\beta(f_2) - H_\alpha^\beta(f_1).    (41)

For multivariate distributions, the entropy is invariant for translations and linear orthogonal transformations. For the same reason, transinformation remains invariant for linear transformations.

7. ACKNOWLEDGEMENT

The author is grateful to the referee for his useful suggestions.
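The conclusion of eq. (40), that the shift log|A| is the same for every admissible pair (α, β), can also be checked by Riemann sums (illustrative code, not from the paper; logarithms taken to base 2):

```python
import math

def kapur_cont(density, lo, hi, alpha, beta, n=20000):
    """Midpoint Riemann-sum version of eq. (10), using log base 2."""
    dx = (hi - lo) / n
    xs = [lo + (k + 0.5) * dx for k in range(n)]
    num = sum(density(x) ** (alpha + beta - 1) for x in xs) * dx
    den = sum(density(x) ** beta for x in xs) * dx
    return math.log2(num / den) / (1 - alpha)

f = lambda x: x / 2.0                  # triangular density on [0, 2]
A, B = 3.0, 1.0                        # the linear map Y = AX + B
p = lambda y: f((y - B) / A) / A       # density of Y, eq. (31)

for a, b in ((2.0, 1.5), (0.7, 2.0)):  # admissible: a + b > 1, b > 0
    diff = kapur_cont(p, B, 2 * A + B, a, b) - kapur_cont(f, 0.0, 2.0, a, b)
    print(a, b, diff)                  # close to log2(3) for every (a, b)
```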

8. REFERENCES

1. Shannon, C. E.  ..  Bell System Tech. Journal, 1948, 27, 379, 623.
2. Rényi, A.       ..  Proc. 4th Berkeley Symp. Math. Statistics and Prob., 1961, 1, 547.
3. Kapur, J. N.    ..  The Maths. Seminar, 1967, 4, 58.
4. ——              ..  I.I.T./K. Maths. Research Report, 1967, No. 2.