Infinite anti-uniform sources


Infinite anti-uniform sources
Daniela Tarniceriu, Valeriu Munteanu, Gheorghe Zaharia
AEU - International Journal of Electronics and Communications, 2013, 67 (3), pp. 27-222. <.6/j.aeue.22.8.2>
HAL Id: hal-784939, https://hal.archives-ouvertes.fr/hal-784939, submitted on 5 Feb 2013

INFINITE ANTI-UNIFORM SOURCES

Daniela G. Tarniceriu 1, Valeriu B. Munteanu 1, Gheorghe Zaharia 2

1 Gheorghe Asachi Technical University, Faculty of Electronics, Telecommunications and Information Technology, Department of Telecommunications, Bd. Carol I, 756, Iasi, Romania
2 IETR INSA, UMR CNRS 6164, Rennes, France
emails: vmuntean@etti.tuiasi.ro, tarniced@etti.tuiasi.ro, gheorghe.zaharia@insa-rennes.fr

Abstract. In this paper we consider the class of anti-uniform Huffman (AUH) codes for sources with infinite alphabet. Poisson, negative binomial, geometric and exponential distributions lead to infinite anti-uniform sources for some ranges of their parameters. Huffman coding of these sources results in AUH codes. We prove that, as a result of this encoding, we obtain sources with memory. For these sources we attach the graph and derive the transition matrix between states, the state probabilities and the entropy. If $c_0$ and $c_1$ denote the costs for storage or transmission of the symbols 0 and 1, respectively, we compute the average cost of these AUH codes.

Keywords: Huffman coding, average codeword length, code entropy, average cost.

1. Introduction

Consider a discrete source with infinite size alphabet $\xi : (s_1, s_2, \ldots, s_n, \ldots)$ and associated ordered probability distribution $P : (p_1, p_2, \ldots, p_n, \ldots)$, where $p_1 \ge p_2 \ge \ldots \ge p_n \ge \ldots$. It is well known that the Huffman encoding algorithm [1] provides an optimal prefix-free code for this source. A binary Huffman code is usually represented using a binary tree $T$, whose leaves correspond to the source messages. The two edges emanating from each intermediate tree node (father) are labelled either 0 or 1. The path length between the root and a leaf is the length of the binary codeword associated with the corresponding message. Assuming that $v_k$, $k = 1, 2, \ldots$, is the codeword representing the message $s_k$, we denote the length of $v_k$ by $l_k$. The optimality of Huffman coding implies that $l_k \le l_j$ if $p_k > p_j$.

Anti-uniform Huffman (AUH) codes representing an infinite source $\xi$ were first introduced in [2]; they are characterized by the fact that $l_k = k + 1$, for $k = 0, 1, 2, \ldots$. For this, the following condition has to be fulfilled [2]:

$p_k \ge \sum_{i=k+2}^{n} p_i, \quad k \le n - 3$   (1)

The AUH codes have been extensively analysed concerning bounds on average codeword length, entropy and redundancy for different types of probability distribution. In [3] it has been shown that these codes maximize the average length and the entropy, and tight lower and upper bounds on the average codeword length, entropy and redundancy of finite and infinite AUH codes are derived in terms of alphabet size. Related topics are addressed in [4]-[6]. The problem of D-ary Huffman codes is analysed in [7], [8], where it is shown that for AUH codes, by a proper choice of the source probabilities, the average codeword length can be made closer to unity. The problem of bounding the average length of an optimal (Huffman) source code when only limited knowledge of the source symbol probability distribution is available is considered in [9].

AUH sources appear in a wide variety of situations in the real world, because this class of sources has the property of achieving minimum redundancy in different situations and minimal average cost in the highly unbalanced cost regime [10]-[12]. These properties determine a wide range of applications and motivate us to study these sources from an information theoretic perspective. The unequal cost letter problem, modeling situations in which different characters have different transmission times or storage costs, was addressed in [13], [14]. One example is the telegraph channel with the alphabet {., -}, in which dashes are twice as long as dots [15]. Another example is the (a, b) run-length limited codes used in magnetic and optical storage, in which the binary codewords are constrained so that each 1 must be preceded by at least a, and at most b, 0s [16]. In [17] the authors have introduced a class of binary prefix codes having the property that each codeword ends in a one. In [18] an application of 1-ended prefix codes is considered, where it is shown how to construct self-synchronizing codes and how to use them for group testing. A study concerning the bounds on the average codeword length for these codes is performed in [19].

As another example, binary codes whose codewords contain at most a specified number of 1s are used for energy minimization of transmissions in mobile environments [20]. There is a large literature addressing the problem of cost for prefix-free codes with an unequal letter cost encoding alphabet; see [21] and the references therein.

AUH sources can be generated by several probability distributions. It has been shown that Poisson, negative binomial, geometric and exponential distributions lie in the class of AUH sources for some regimes of their parameters [2], [6], [22], [23]. A related topic was addressed in [24], where the authors studied weakly super-increasing (WSI) and partial WSI sources in connection with Fibonacci numbers and the golden mean, which appear extensively in modern science and, in particular, have applications in coding and information theory.

The rest of the paper is organized as follows. In Section 2 we present the Huffman encoding of an anti-uniform source with infinite alphabet and show that the binary Huffman encoding produces a source with memory. For this source we compute the code entropy H(X). The average cost of the code is also derived. Sections 3, 4, 5 and 6 apply these results to infinite sources with Poisson, negative binomial, geometric and exponential distributions, respectively. We conclude the paper in Section 7.

2. The entropy and the average cost of AUH codes for sources with infinite alphabet

Let there be a discrete source with infinite alphabet, characterized by the distribution:

$\xi : \begin{pmatrix} s_0 & s_1 & s_2 & \cdots & s_k & \cdots \\ p_0 & p_1 & p_2 & \cdots & p_k & \cdots \end{pmatrix}$   (2)

We assume that the source is complete, that is, the relation

$\sum_{k=0}^{\infty} p_k = 1$   (3)

is fulfilled.
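To make the anti-uniform structure concrete, the following sketch (Python) truncates the infinite alphabet at $N$ symbols, checks condition (1) from the Introduction, and runs a standard heap-based binary Huffman construction to confirm that the codeword lengths follow $l_k = k + 1$. The truncation level $N$ and the geometric example distribution with $p = 0.5$ (see Section 5) are illustrative assumptions, not values prescribed by the paper.

```python
import heapq

# Illustrative setup: truncate the infinite alphabet at N symbols and renormalize,
# using a geometric distribution p_k = (1 - p) * p**k as in Section 5.
N, p = 40, 0.5
probs = [(1 - p) * p**k for k in range(N)]
probs = [x / sum(probs) for x in probs]                  # completeness, relation (3)

# Condition (1): p_k >= sum of the probabilities p_i for i >= k + 2.
tails = [sum(probs[k + 2:]) for k in range(N)]
assert all(probs[k] >= tails[k] - 1e-12 for k in range(N)), "condition (1) violated"

# Standard binary Huffman construction; each heap entry is (weight, id, {symbol: codeword}).
heap = [(pk, k, {k: ""}) for k, pk in enumerate(probs)]
heapq.heapify(heap)
uid = N
while len(heap) > 1:
    w0, _, c0 = heapq.heappop(heap)                      # merge the two least probable nodes
    w1, _, c1 = heapq.heappop(heap)
    merged = {s: "0" + w for s, w in c0.items()}
    merged.update({s: "1" + w for s, w in c1.items()})
    heapq.heappush(heap, (w0 + w1, uid, merged))
    uid += 1
code = heap[0][2]

# Anti-uniform length profile l_k = k + 1 (away from the truncation boundary).
print([len(code[k]) for k in range(8)])                  # expected: [1, 2, 3, 4, 5, 6, 7, 8]
```

For a value of $p$ above the golden-ratio bound recalled in Section 5, the assertion on condition (1) is expected to fail, which gives a quick way to see where the AUH regime ends.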

After the binary Huffman encoding of this source, the graph in Fig. 1 results, that is, an infinite anti-uniform code. The terminal nodes $s_k$ represent the messages resulting from the binary Huffman encoding and $p_k$ denote their probabilities.

Fig. 1. The graph of Huffman encoding for the source $\xi$ with the distribution in (2)

We denote by $s_k^{(1)}$ and $p_k^{(1)}$ the intermediate nodes in the encoding graph and their probabilities, respectively. Unlike a leaf, an intermediate node does not correspond to a source message, and therefore no probability mass is associated with it. However, with a slight abuse of terminology, we also call the weight of an intermediate node its probability. Considering (3), the probabilities of the intermediate nodes $p_k^{(1)}$ are obtained recursively, as the sum of the weights of their two children. In this way, we get:

$p_k^{(1)} = \sum_{j=k+1}^{\infty} p_j, \quad k = 0, 1, 2, \ldots$   (4)

The structure of the codewords resulting from the binary Huffman encoding is:

$\begin{pmatrix} s_0 & s_1 & s_2 & \cdots & s_k & \cdots \\ v_0 & v_1 & v_2 & \cdots & v_k & \cdots \end{pmatrix}$   (5)

The length $l_k$ of the codeword associated with the message $s_k$ is the number of edges on the path between the root and the node $s_k$ in the Huffman tree:

$l_k = k + 1, \quad k = 0, 1, 2, \ldots$   (6)

The average codeword length is determined by

$\bar{l} = \sum_{k=0}^{\infty} p_k l_k.$   (7)

The branches between successive nodes have probabilities equal to the ratio between the probability of the node in which the branch ends and the probability of the node from which it starts. We note that the probabilities of delivering the symbols $x = 0$ or $x = 1$ depend on the node from which they are generated. In other words, as a result of the Huffman encoding of the source, a new source $X = \{x = 0, x = 1\}$ with memory is obtained. Its states correspond to the terminal or intermediate nodes (excepting the root) in the graph in Fig. 1. When a terminal node is reached, the binary Huffman encoding procedure is resumed from the graph root. The graph attached to the source with memory X can be obtained from the Huffman encoding graph of the source $\xi$, as follows:

a) We link the terminal nodes in the graph of the source $\xi$ with the states $S_0$ and $S_0^{(1)}$ through the graph root;

b) The branch probabilities are the same as in the Huffman encoding graph;

c) Each terminal or intermediate node (excepting the graph root) will represent a state $S_k$ or $S_k^{(1)}$, $k = 0, 1, 2, \ldots$ (as represented in Fig. 2).

Fig. 2. The graph of the source with memory

Let $S = \{S_0, S_1, \ldots, S_k, \ldots, S_0^{(1)}, S_1^{(1)}, \ldots, S_k^{(1)}, \ldots\}$ be the state set of the source with memory. The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_{k-1}^{(1)}$, $k = 1, 2, \ldots$, are equal to the probabilities of transition from the state $S_{k-1}^{(1)}$ to the states $S_k$ and $S_k^{(1)}$, respectively, i.e.

$p(x = 0 \mid S_{k-1}^{(1)}) = \frac{p_k}{p_{k-1}^{(1)}}, \quad k = 1, 2, \ldots$   (8)

$p(x = 1 \mid S_{k-1}^{(1)}) = \frac{p_k^{(1)}}{p_{k-1}^{(1)}}, \quad k = 1, 2, \ldots$   (9)

From any terminal state $S_k$ the encoding is resumed from the graph root, so the probabilities of delivering the symbols $x = 0$ or $x = 1$ are

$p(x = 0 \mid S_k) = p_0, \quad k = 0, 1, 2, \ldots$   (10)

$p(x = 1 \mid S_k) = p_0^{(1)}, \quad k = 0, 1, 2, \ldots$   (11)

The transition matrix between the states, $T = [p(S' \mid S)]$, therefore has the nonzero entries

$p(S_0 \mid S_k) = p_0, \qquad p(S_0^{(1)} \mid S_k) = p_0^{(1)}, \qquad k = 0, 1, 2, \ldots$
$p(S_k \mid S_{k-1}^{(1)}) = \frac{p_k}{p_{k-1}^{(1)}}, \qquad p(S_k^{(1)} \mid S_{k-1}^{(1)}) = \frac{p_k^{(1)}}{p_{k-1}^{(1)}}, \qquad k = 1, 2, \ldots$   (12)

all other entries being zero.

Let $\pi_k$ and $\pi_k^{(1)}$, $k = 0, 1, 2, \ldots$, denote the state probabilities of the source with memory. They can be determined by means of [25]:

$[\pi_0\ \pi_1\ \ldots\ \pi_k\ \ldots\ \pi_0^{(1)}\ \pi_1^{(1)}\ \ldots\ \pi_k^{(1)}\ \ldots] = [\pi_0\ \pi_1\ \ldots\ \pi_k\ \ldots\ \pi_0^{(1)}\ \pi_1^{(1)}\ \ldots\ \pi_k^{(1)}\ \ldots]\, T$   (13)

$\sum_{k=0}^{\infty} \left( \pi_k + \pi_k^{(1)} \right) = 1$   (14)

Considering (7) and (12), from (13) and (14) we get the state probabilities as:

$\pi_k = \frac{p_k}{\bar{l}}, \quad k = 0, 1, 2, \ldots$   (15)

$\pi_k^{(1)} = \frac{p_k^{(1)}}{\bar{l}} = \frac{1}{\bar{l}} \sum_{j=k+1}^{\infty} p_j, \quad k = 0, 1, 2, \ldots$   (16)

In general, the entropy of the source with memory is computed by [26]

$H(X) = -\sum_{i=0}^{\infty} \pi_i \sum_{j=0}^{1} p(x = j \mid S_i) \log p(x = j \mid S_i) - \sum_{i=0}^{\infty} \pi_i^{(1)} \sum_{j=0}^{1} p(x = j \mid S_i^{(1)}) \log p(x = j \mid S_i^{(1)})$   (17)

Let $c_0$ and $c_1$ be the costs associated to the bits 0 and 1, respectively. The average cost of a code is defined by [14]

$C = \sum_{k=0}^{\infty} p_k \left( n_0(k)\, c_0 + n_1(k)\, c_1 \right),$   (18)

where we denote by $n_0(k)$ and $n_1(k)$ the number of 0s and 1s in the codeword corresponding to the source symbol $s_k$.

Considering (5) and (6), the average cost is

$C = \sum_{k=0}^{\infty} p_k \left( c_0 + k\, c_1 \right) = c_0 + c_1 \sum_{k=0}^{\infty} k\, p_k.$   (19)
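As a numerical illustration of (12)-(19), the sketch below (Python/NumPy) assembles a truncated version of the transition matrix, extracts its stationary distribution, and checks it against (15)-(16) before evaluating the entropy (17) and the average cost (19). The truncation level $N$, the geometric example distribution and the bit costs $c_0$, $c_1$ are illustrative choices, not values from the paper.

```python
import numpy as np

# Truncate the infinite state set at N leaf states; geometric example distribution.
N, p = 30, 0.4
pk = np.array([(1 - p) * p**k for k in range(N)])
pk /= pk.sum()                                        # completeness, relation (3)
p1 = np.array([pk[k + 1:].sum() for k in range(N)])   # intermediate node weights, eq. (4)
lbar = float(np.sum(pk * (np.arange(N) + 1)))         # average length, eqs. (6)-(7)

# State ordering: terminal S_0..S_{N-1}, then intermediate S_0^(1)..S_{N-2}^(1).
M = N + (N - 1)
T = np.zeros((M, M))
for k in range(N):                       # from a terminal state the encoding restarts at the root
    T[k, 0] = pk[0]                      # emit 0 -> S_0,           eq. (10)
    T[k, N] = p1[0]                      # emit 1 -> S_0^(1),       eq. (11)
for k in range(N - 1):                   # from an intermediate state S_k^(1)
    T[N + k, k + 1] = pk[k + 1] / p1[k]              # emit 0 -> S_{k+1},      eq. (8)
    if k + 1 < N - 1:
        T[N + k, N + k + 1] = p1[k + 1] / p1[k]      # emit 1 -> S_{k+1}^(1),  eq. (9)
T /= T.sum(axis=1, keepdims=True)        # rows already sum to 1; absorb truncation round-off

# Stationary distribution: left eigenvector of T for eigenvalue 1, eqs. (13)-(14).
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()
print(np.allclose(pi[:N], pk / lbar, atol=1e-3))           # eq. (15)
print(np.allclose(pi[N:], p1[:N - 1] / lbar, atol=1e-3))   # eq. (16)

# Entropy of the source with memory, eq. (17), and average cost, eqs. (18)-(19).
H = -sum(pi[i] * sum(q * np.log2(q) for q in T[i] if q > 0) for i in range(M))
c0, c1 = 1.0, 2.0                        # illustrative bit costs
C = float(np.sum(pk * (c0 + np.arange(N) * c1)))
print(round(float(H), 4), round(C, 4))
```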

3. AUH sources with Poisson distribution

Let there be a discrete source with infinite alphabet, characterized by the Poisson distribution:

$\xi : \begin{pmatrix} s_0 & s_1 & s_2 & \cdots & s_k & \cdots \\ p_0 = e^{-\lambda} & p_1 = e^{-\lambda}\frac{\lambda}{1!} & p_2 = e^{-\lambda}\frac{\lambda^2}{2!} & \cdots & p_k = e^{-\lambda}\frac{\lambda^k}{k!} & \cdots \end{pmatrix}$   (20)

In [22] it is shown that a Poisson distribution with parameter $\lambda$ satisfying condition (1) leads to an AUH code. The source is complete, because

$\sum_{k=0}^{\infty} e^{-\lambda}\frac{\lambda^k}{k!} = 1$   (21)

Theorem 1. The entropy and the average cost of the code resulting from the binary encoding of the AUH source with Poisson distribution are determined by:

$H(X) = \frac{1}{\bar{l}}\left[\lambda \log e - \lambda \log \lambda + e^{-\lambda}\sum_{k=0}^{\infty}\frac{\lambda^k}{k!}\log(k!)\right]$   (22)

$C = c_0 + \lambda\, c_1$   (23)

Proof. Considering (20) and (4), the probabilities of the intermediate nodes are obtained as:

$p_k^{(1)} = \sum_{j=k+1}^{\infty} e^{-\lambda}\frac{\lambda^j}{j!}, \quad k = 0, 1, 2, \ldots$   (24)

The average codeword length is obtained considering (20) and (6) in (7):

$\bar{l} = \lambda + 1.$   (25)

The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_{k-1}^{(1)}$, $k = 1, 2, \ldots$, are

$p(x = 0 \mid S_{k-1}^{(1)}) = \frac{p_k}{p_{k-1}^{(1)}} = \frac{e^{-\lambda}\frac{\lambda^k}{k!}}{\sum_{j=k}^{\infty} e^{-\lambda}\frac{\lambda^j}{j!}}, \quad k = 1, 2, \ldots$   (26)

$p(x = 1 \mid S_{k-1}^{(1)}) = \frac{p_k^{(1)}}{p_{k-1}^{(1)}} = \frac{\sum_{j=k+1}^{\infty} e^{-\lambda}\frac{\lambda^j}{j!}}{\sum_{j=k}^{\infty} e^{-\lambda}\frac{\lambda^j}{j!}}, \quad k = 1, 2, \ldots$   (27)

respectively. The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_k$, $k = 0, 1, 2, \ldots$, are

$p(x = 0 \mid S_k) = e^{-\lambda}, \quad k = 0, 1, 2, \ldots$   (28)

$p(x = 1 \mid S_k) = 1 - e^{-\lambda}, \quad k = 0, 1, 2, \ldots$   (29)

respectively. Considering (20), (24) and (25) in (15) and (16), we get the stationary state probabilities:

$\pi_k = \frac{1}{\bar{l}}\, e^{-\lambda}\frac{\lambda^k}{k!}, \quad k = 0, 1, 2, \ldots$   (30)

$\pi_k^{(1)} = \frac{1}{\bar{l}}\sum_{j=k+1}^{\infty} e^{-\lambda}\frac{\lambda^j}{j!}, \quad k = 0, 1, 2, \ldots$   (31)

Substituting (26)-(31) into (17), we get the entropy of the source with memory in (22). The average cost of the AUH code for the source with Poisson distribution, given in (23), is obtained by substituting (20) into (19).
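A quick numerical check of Theorem 1 can be sketched as follows (Python, with an arbitrary small parameter $\lambda = 0.3$ and series truncated at $K$ terms, both illustrative choices). It evaluates (22) directly and also through the state decomposition (17) with (26)-(31), and compares the cost (19) with (23).

```python
import math

lam, K = 0.3, 60
pk = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(K)]   # eq. (20)
p1 = [sum(pk[k + 1:]) for k in range(K)]                               # eq. (24)
lbar = sum((k + 1) * x for k, x in enumerate(pk))
print(round(lbar, 6))                               # eq. (25): lambda + 1 = 1.3

def h2(a):                                          # binary entropy of {a, 1 - a}
    return 0.0 if a <= 0.0 or a >= 1.0 else -a * math.log2(a) - (1 - a) * math.log2(1 - a)

# Eq. (17): every terminal state emits 0 w.p. e^{-lambda} by (28)-(29); the
# intermediate-state terms use (26)-(27); state probabilities from (30)-(31).
H17 = (sum(pk) / lbar) * h2(math.exp(-lam)) \
    + sum((p1[k] / lbar) * h2(pk[k + 1] / p1[k]) for k in range(K - 1) if p1[k] > 0)
# Closed form, eq. (22):
H22 = (lam * math.log2(math.e) - lam * math.log2(lam)
       + sum(x * math.log2(math.factorial(k)) for k, x in enumerate(pk))) / lbar
print(round(H17, 6), round(H22, 6))

c0, c1 = 1.0, 2.0                                   # illustrative bit costs
print(round(sum(x * (c0 + k * c1) for k, x in enumerate(pk)), 6),   # eq. (19)
      round(c0 + lam * c1, 6))                                      # eq. (23)
```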

4. AUH sources with negative binomial distribution

Let there be a discrete source with infinite alphabet, characterized by the negative binomial distribution:

$\xi : \begin{pmatrix} s_0 & s_1 & s_2 & \cdots & s_k & \cdots \\ p_0 = C_{r-1}^{r-1} p^r & p_1 = C_{r}^{r-1} p^r q & p_2 = C_{r+1}^{r-1} p^r q^2 & \cdots & p_k = C_{r+k-1}^{r-1} p^r q^k & \cdots \end{pmatrix}$   (32)

where $q = 1 - p$. The source is complete, because

$\sum_{k=0}^{\infty} C_{r+k-1}^{r-1}\, p^r q^k = 1$   (33)

Theorem 2. The entropy and the average cost of the code resulting from the binary encoding of the AUH source with negative binomial distribution are determined by:

$H(X) = -\frac{1}{\bar{l}}\left[r \log p + \frac{qr}{p}\log q + p^r \sum_{k=0}^{\infty} C_{r+k-1}^{r-1}\, q^k \log C_{r+k-1}^{r-1}\right]$   (34)

$C = c_0 + \frac{qr}{p}\, c_1$   (35)

Proof. Considering (32) and (4), the probabilities of the intermediate nodes are obtained as:

$p_k^{(1)} = \sum_{j=k+1}^{\infty} C_{r+j-1}^{r-1}\, p^r q^j, \quad k = 0, 1, 2, \ldots$   (36)

The average codeword length is obtained considering (32) and (6) in relation (7):

$\bar{l} = \sum_{k=0}^{\infty} (k+1)\, C_{r+k-1}^{r-1}\, p^r q^k = \frac{rq + p}{p}.$   (37)

The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_{k-1}^{(1)}$, $k = 1, 2, \ldots$, are

$p(x = 0 \mid S_{k-1}^{(1)}) = \frac{C_{r+k-1}^{r-1}\, p^r q^k}{\sum_{j=k}^{\infty} C_{r+j-1}^{r-1}\, p^r q^j}, \quad k = 1, 2, \ldots$   (38)

$p(x = 1 \mid S_{k-1}^{(1)}) = \frac{\sum_{j=k+1}^{\infty} C_{r+j-1}^{r-1}\, p^r q^j}{\sum_{j=k}^{\infty} C_{r+j-1}^{r-1}\, p^r q^j}, \quad k = 1, 2, \ldots$   (39)

respectively. The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_k$, $k = 0, 1, 2, \ldots$, are

$p(x = 0 \mid S_k) = C_{r-1}^{r-1}\, p^r = p^r, \quad k = 0, 1, 2, \ldots$   (40)

$p(x = 1 \mid S_k) = 1 - C_{r-1}^{r-1}\, p^r = 1 - p^r, \quad k = 0, 1, 2, \ldots$   (41)

respectively. Considering (32), (36) and (37) in (15) and (16), we get the stationary state probabilities:

$\pi_k = \frac{1}{\bar{l}}\, C_{r+k-1}^{r-1}\, p^r q^k, \quad k = 0, 1, 2, \ldots$   (42)

$\pi_k^{(1)} = \frac{1}{\bar{l}}\sum_{j=k+1}^{\infty} C_{r+j-1}^{r-1}\, p^r q^j, \quad k = 0, 1, 2, \ldots$   (43)

Substituting (38)-(43) into (17), we get the entropy of the source with memory, given in (34). We obtain the average cost of the AUH code in (35) for the source with negative binomial distribution by substituting (32) in (19).
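Theorem 2 can be checked in the same spirit; in the sketch below, $r = 2$, $p = 0.7$, the truncation level $K$ and the costs are arbitrary illustrative values. The comparison uses the fact that (34) equals the source entropy $-\sum_k p_k \log p_k$ divided by $\bar{l}$, so the two evaluations should agree.

```python
import math

r, p, K = 2, 0.7, 200
q = 1 - p
pk = [math.comb(r + k - 1, r - 1) * p**r * q**k for k in range(K)]     # eq. (32)

lbar = sum((k + 1) * x for k, x in enumerate(pk))
print(round(lbar, 6), round((r * q + p) / p, 6))                        # eq. (37)

H_direct = -sum(x * math.log2(x) for x in pk if x > 0) / lbar           # source entropy / lbar
H34 = -(r * math.log2(p) + (q * r / p) * math.log2(q)
        + p**r * sum(math.comb(r + k - 1, r - 1) * q**k
                     * math.log2(math.comb(r + k - 1, r - 1)) for k in range(K))) / lbar
print(round(H_direct, 6), round(H34, 6))                                # eq. (34)

c0, c1 = 1.0, 2.0                                                       # illustrative bit costs
print(round(sum(x * (c0 + k * c1) for k, x in enumerate(pk)), 6),       # eq. (19)
      round(c0 + (q * r / p) * c1, 6))                                  # eq. (35)
```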

5. AUH sources with geometric distribution

Let there be a discrete source with infinite alphabet, characterized by the geometric distribution:

$\xi : \begin{pmatrix} s_0 & s_1 & s_2 & \cdots & s_k & \cdots \\ p_0 = q & p_1 = pq & p_2 = p^2 q & \cdots & p_k = p^k q & \cdots \end{pmatrix}$   (44)

where $q = 1 - p$. In [23] it is shown that the geometric distribution with parameter $0 < p \le (\sqrt{5}-1)/2$ satisfies condition (1) and leads to an AUH code. The source is complete, because

$\sum_{k=0}^{\infty} p^k q = 1$   (45)

Theorem 3. The entropy and the average cost of the code resulting from the binary encoding of the AUH source with geometric distribution are determined by:

$H(X) = -\frac{1}{\bar{l}}\left[\log(1-p) + \frac{p}{1-p}\log p\right]$   (46)

$C = c_0 + \frac{p}{1-p}\, c_1$   (47)

Proof. Considering (44) and (4), the probabilities of the intermediate nodes are obtained as:

$p_k^{(1)} = p^{k+1}, \quad k = 0, 1, 2, \ldots$   (48)

The average codeword length is obtained considering (44) and (6) in (7):

$\bar{l} = \sum_{k=0}^{\infty} (k+1)\, q\, p^k = \frac{1}{1-p}.$   (49)

The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_{k-1}^{(1)}$, $k = 1, 2, \ldots$, are

$p(x = 0 \mid S_{k-1}^{(1)}) = q = 1 - p, \quad k = 1, 2, \ldots$   (50)

$p(x = 1 \mid S_{k-1}^{(1)}) = p, \quad k = 1, 2, \ldots$   (51)

respectively. The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_k$, $k = 0, 1, 2, \ldots$, are

$p(x = 0 \mid S_k) = 1 - p, \quad k = 0, 1, 2, \ldots$   (52)

$p(x = 1 \mid S_k) = p, \quad k = 0, 1, 2, \ldots$   (53)

respectively. Considering (44), (48) and (49) in (15) and (16), we get the stationary state probabilities:

$\pi_k = \frac{(1-p)\, p^k}{\bar{l}}, \quad k = 0, 1, 2, \ldots$   (54)

$\pi_k^{(1)} = \frac{p^{k+1}}{\bar{l}}, \quad k = 0, 1, 2, \ldots$   (55)

Substituting (50)-(55) in (17), we get the entropy of the source with memory, given in (46). We obtain the average cost of the AUH code in (47) for the source with geometric distribution by substituting (44) in (19).
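Since (50)-(53) state that every state delivers 0 with probability $1 - p$ and 1 with probability $p$, the general formula (17), together with (14), evaluates for this source to the binary entropy of $p$, and the closed form (46) gives the same number. A minimal check ($p = 0.6$ is an arbitrary value inside the range quoted from [23], and the costs are illustrative):

```python
import math

p = 0.6                       # arbitrary value with 0 < p <= (sqrt(5) - 1)/2 ~ 0.618
lbar = 1 / (1 - p)                                                    # eq. (49)

H46 = -(math.log2(1 - p) + (p / (1 - p)) * math.log2(p)) / lbar       # eq. (46)
# By (50)-(53) every state emits 0 w.p. 1-p, so with (14) eq. (17) reduces to h(p):
H17 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
print(round(H46, 6), round(H17, 6))                                   # both ~0.970951

c0, c1 = 1.0, 2.0                                                     # illustrative bit costs
print(round(c0 + (p / (1 - p)) * c1, 6))                              # eq. (47): 4.0
```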

6. AUH sources with exponential distribution

Let there be a discrete source with infinite alphabet, characterized by the exponential distribution:

$\xi : \begin{pmatrix} s_0 & s_1 & s_2 & \cdots & s_k & \cdots \\ p_0 = 1 - e^{-\lambda} & p_1 = e^{-\lambda}(1 - e^{-\lambda}) & p_2 = e^{-2\lambda}(1 - e^{-\lambda}) & \cdots & p_k = e^{-k\lambda}(1 - e^{-\lambda}) & \cdots \end{pmatrix}$   (56)

In [6] it is shown that the discrete form of the exponential distribution is anti-uniform iff $\lambda \ge \ln\!\left((1+\sqrt{5})/2\right) \approx 0.482$. The source is complete, because

$\sum_{k=0}^{\infty} e^{-k\lambda}(1 - e^{-\lambda}) = 1$   (57)

Theorem 4. The entropy and the average cost of the code resulting from the binary encoding of the AUH source with exponential distribution are determined by:

$H(X) = \frac{1}{\bar{l}}\left[-\log(1 - e^{-\lambda}) + \frac{\lambda\, e^{-\lambda} \log e}{1 - e^{-\lambda}}\right]$   (58)

$C = c_0 + \frac{e^{-\lambda}}{1 - e^{-\lambda}}\, c_1$   (59)

Proof. Considering (56) and (4), the probabilities of the intermediate nodes are obtained as:

$p_k^{(1)} = e^{-(k+1)\lambda}, \quad k = 0, 1, 2, \ldots$   (60)

The average codeword length is obtained considering (56) and (6) in relation (7):

$\bar{l} = \sum_{k=0}^{\infty} (k+1)\, e^{-k\lambda}(1 - e^{-\lambda}) = \frac{1}{1 - e^{-\lambda}}.$   (61)

The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_{k-1}^{(1)}$, $k = 1, 2, \ldots$, are

$p(x = 0 \mid S_{k-1}^{(1)}) = 1 - e^{-\lambda}, \quad k = 1, 2, \ldots$   (62)

$p(x = 1 \mid S_{k-1}^{(1)}) = e^{-\lambda}, \quad k = 1, 2, \ldots$   (63)

respectively. The probabilities of delivering the symbols $x = 0$ or $x = 1$ from the state $S_k$, $k = 0, 1, 2, \ldots$, are

$p(x = 0 \mid S_k) = 1 - e^{-\lambda}, \quad k = 0, 1, 2, \ldots$   (64)

$p(x = 1 \mid S_k) = e^{-\lambda}, \quad k = 0, 1, 2, \ldots$   (65)

respectively. Considering (56), (60) and (61) in (15) and (16), we get the stationary state probabilities:

$\pi_k = \frac{e^{-k\lambda}(1 - e^{-\lambda})}{\bar{l}}, \quad k = 0, 1, 2, \ldots$   (66)

$\pi_k^{(1)} = \frac{e^{-(k+1)\lambda}}{\bar{l}}, \quad k = 0, 1, 2, \ldots$   (67)

Substituting (62)-(67) in (17), we get the entropy of the source with memory, given in (58). We obtain the average cost of the AUH code in (59) for the source with exponential distribution by substituting (56) in (19).
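Note that (56) coincides with the geometric distribution (44) for $p = e^{-\lambda}$, so Theorem 4 can be checked in the same way; $\lambda = 0.7$ below is an arbitrary value above the anti-uniformity threshold quoted from [6], and the costs are illustrative.

```python
import math

lam = 0.7                     # arbitrary value with lam >= ln((1 + sqrt(5))/2) ~ 0.4812
e = math.exp(-lam)            # (56) is the geometric case (44) with p = e^{-lambda}
lbar = 1 / (1 - e)                                                    # eq. (61)

H58 = (-math.log2(1 - e) + lam * e * math.log2(math.e) / (1 - e)) / lbar   # eq. (58)
# By (62)-(65) every state emits 0 w.p. 1 - e^{-lambda}, so eq. (17) gives h(e^{-lambda}):
H17 = -e * math.log2(e) - (1 - e) * math.log2(1 - e)
print(round(H58, 6), round(H17, 6))

c0, c1 = 1.0, 2.0                                                     # illustrative bit costs
print(round(c0 + (e / (1 - e)) * c1, 6))                              # eq. (59)
```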

7. Conclusions

In this paper we have considered the class of AUH sources with infinite alphabets. We have shown that performing a binary Huffman encoding of these sources results in sources with memory. For these sources we have built the encoding graph and have specified the rules for drawing the graph of the source with memory $X = \{x = 0, x = 1\}$. The graph of the source with memory is obtained from the encoding graph by linking the terminal nodes with the states $S_0$ and $S_0^{(1)}$ through the graph root. The states of the source with memory correspond to the terminal or intermediate nodes in the encoding graph, excepting the root. We have determined, in the general case, the state probabilities of the source with memory, as well as the transition probabilities between states. The entropy of this source with memory has been computed. We assumed the costs $c_0$ and $c_1$ for the symbols 0 and 1, respectively, and computed the average cost of these Huffman codes. Obviously, the Huffman encoding procedure assures minimum average length, but the average cost is not minimum. It can easily be verified that if the costs of the symbols are equal to unity, the average cost becomes equal to the average length. We applied the results to several AUH sources with Poisson, negative binomial, geometric and exponential distributions.

8. References

[1] Huffman DA. A method for the construction of minimum redundancy codes. Proc. IRE 1952; 40: p. 1098-1101.
[2] Esmaeili M, Kakhbod A. On antiuniform and partially antiuniform sources. Proc. IEEE ICC, 2006. p. 6 5.
[3] Mohajer S, Kakhbod A. Anti-uniform Huffman codes. IET Commun. 2011; 5: p. 23 9.
[4] Mohajer S, Kakhbod A. Tight bounds on the AUH codes. Proceedings of the Conference on Information Sciences and Systems, Princeton, New Jersey, USA, 2008. p. 4.
[5] Mohajer S, Pakzad P, Kakhbod A. Tight bounds on the redundancy of Huffman codes. Proceedings of IEEE ITW, Punta del Este, 2006. p. 3 5.
[6] Esmaeili M, Kakhbod A. On information theory parameters of infinite anti-uniform sources. IET Commun. 2007; 1: p. 39 4.
[7] Zaharia G, Munteanu V, Tarniceriu D. Tight bounds on the codeword length and average codeword length for D-ary Huffman codes - Part 1. Proceedings IEEE ISSCS, Iasi, Romania, 2009. p. 53-4.
[8] Zaharia G, Munteanu V, Tarniceriu D. Tight bounds on the codeword length and average codeword length for D-ary Huffman codes - Part 2. Proceedings IEEE ISSCS, Iasi, Romania, 2009. p. 535-8.
[9] Cicalese F, Vaccaro U. Bounding the average length of optimal source codes via majorization theory. IEEE Trans. Inf. Theory 2004; 50: p. 633-637.
[10] Johnsen O. On the redundancy of binary Huffman codes. IEEE Trans. Inf. Theory 1980; 26: p. 220-222.
[11] Bradford P, Golin M, Larmore LL, Rytter W. Optimal prefix-free codes for unequal letter costs: dynamic programming with the Monge property. J. Algorithms 2002; 42: p. 277-303.
[12] Gilbert EN. Coding with digits of unequal costs. IEEE Trans. Inf. Theory 1995; 41: p. 596-600.
[13] Blachman NM. Minimum cost coding of information. IRE Trans. Inf. Theory 1954; PGIT-3: p. 39-49.

[14] Varn B. Optimal variable length codes (arbitrary symbol cost and equal code word probability). Inf. Contr. 1971; 19: p. 289-301.
[15] Krause RM. Channels which transmit letters of unequal duration. Inf. Contr. 1962; 55: p. 3-24.
[16] Golin M, Rote G. A dynamic programming algorithm for constructing optimal prefix-free codes for unequal letter costs. IEEE Trans. Inf. Theory 1998; 44(5): p. 1770-1781.
[17] Berger T, Yeung R. Optimum 1-ended binary prefix codes. IEEE Trans. Inf. Theory 1990; 36(6): p. 434-44.
[18] Capocelli RM, De Santis A, Gargano L, Vaccaro U. On the construction of statistically synchronizable codes. IEEE Trans. Inf. Theory 1992; 38: p. 407-414.
[19] Capocelli RM, De Santis A, Persiano G. Binary prefix codes ending in a 1. IEEE Trans. Inf. Theory 1994; 40: p. 1296-1302.
[20] Korach E, Dolev S, Yukelson D. The sound of silence: Guessing games for saving energy in mobile environment. Proc. of the Eighteenth Annual Joint Conference of the IEEE Computer and Communications Societies (IEEE INFOCOM '99), 1999. p. 768-75.
[21] Golin M, Li J. More efficient algorithms and analyses for unequal letter cost prefix-free coding. IEEE Trans. Inf. Theory 2008; 54(8): p. 3412-3424.
[22] Humblet P. Optimal source coding for a class of integer alphabets. IEEE Trans. Inf. Theory 1978; 24(1): p. 110-112.
[23] Gallager R, Van Voorhis D. Optimal source coding for geometrically distributed integer alphabets. IEEE Trans. Inf. Theory 1975; 21(2): p. 228-230.
[24] Esmaeili M, Gulliver A, Kakhbod A. The golden mean, Fibonacci matrices and partial weakly super-increasing sources. Chaos, Solitons and Fractals 2009; 42(1): p. 435-440.
[25] Kemeny JG, Snell JL. Finite Markov Chains. Princeton, NJ: Van Nostrand, 1960.
[26] Cover TM, Thomas JA. Elements of Information Theory. New York: John Wiley & Sons, 1991.