An Extended Fano's Inequality for the Finite Blocklength Coding

An Extended Fano's Inequality for the Finite Blocklength Coding
Yunquan Dong, Pingyi Fan {dongyq8@mails,fpy@mail}.tsinghua.edu.cn
Department of Electronic Engineering, Tsinghua University, Beijing, P.R. China.
arXiv:1301.7630v1 [cs.IT] 31 Jan 2013

Abstract—Fano's inequality reveals the relation between the conditional entropy and the probability of error. It has been the key tool in proving the converse of coding theorems in the past sixty years. In this paper, an extended Fano's inequality is proposed, which is tighter and more applicable for codings in the finite blocklength regime. Lower bounds on the mutual information and an upper bound on the codebook size are also given, which are shown to be tighter than the original Fano's inequality. Especially, the extended Fano's inequality is tight for some symmetric channels such as the q-ary symmetric channels (QSC).

Index Terms—Fano's inequality, finite blocklength regime, channel coding, Shannon theory.

I. INTRODUCTION

As is well known, Shannon's information theory deals mainly with the representation and transmission of information. In the development of both source and channel coding theorems, especially of their converses, Fano's inequality serves as the key tool [2].

Theorem 1 (Fano's Inequality): Let X and Y be two random variables with (X,Y) ∼ p(x,y) and X, Y ∈ 𝒳. Define P_e = Pr{X ≠ Y}. Then

H(X|Y) ≤ H(P_e) + P_e log(M−1),  (1)

where M = |𝒳| is the cardinality of 𝒳 and H(P_e) is the binary entropy function H(x) = −[x log x + (1−x) log(1−x)] for 0 ≤ x ≤ 1. Thus Fano's inequality can be further relaxed by using H(P_e) ≤ 1.

Usually, the left-hand side of (1) is referred to as the equivocation, which is quantified by the conditional entropy. In particular, it represents the uncertainty about whether the restored/decoded message Y is the same as the original one, i.e., X. On the other hand, the right-hand side reflects the reliability of the source/channel coding in terms of a function of the error probability P_e. It was shown in [3] that vanishing equivocation implies vanishing error probability. However, vanishing error probability does not necessarily guarantee vanishing equivocation, especially for some X, Y with countably infinite alphabets.

In proving the converse of coding theorems, one wants to find an upper bound on the size of the codebook for an arbitrary code length and error probability. The following theorem is an immediate consequence of Fano's inequality, simple but useful.

Theorem 2 ([4]): Suppose X and Y are two random variables that take values on the same finite set with cardinality M and at least one of them is equiprobable. Then the mutual information between them satisfies

I ≥ (1−P_e) log M − H(P_e),  (2)

where P_e = Pr{X ≠ Y} and H(x) = −[x log x + (1−x) log(1−x)].

As a consequence of this result, the following theorem gives an upper bound on the size of a code as a function of the average error probability.

Theorem 3 ([8]): Every (M, ǫ)-code (average probability of error) for a random transformation P_{Y|X} satisfies

log M ≤ (1/(1−ǫ)) sup_X I + (1/(1−ǫ)) H(ǫ),  (3)

where ǫ = Pr{X ≠ Y} and H(x) = −[x log x + (1−x) log(1−x)].

Although simple, these two theorems are insightful and easy to compute, both in theory and numerically. The classical coding theorems are mainly based on the asymptotic equipartition property (AEP) and typical/joint-typical decoding [1]. However, applications of the AEP require infinitely long codewords, where the error probability goes either to 0 or to 1 as the code length goes to infinity.
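As a quick numerical illustration (ours, not part of the paper), the right-hand sides of (1) and (3) are easy to evaluate; the short Python sketch below does so for illustrative values of M, P_e and a given mutual information, assuming logarithms to base 2.

from math import log2

def binary_entropy(x):
    """H(x) = -[x log x + (1 - x) log(1 - x)] in bits, with H(0) = H(1) = 0."""
    if x in (0.0, 1.0):
        return 0.0
    return -(x * log2(x) + (1 - x) * log2(1 - x))

def fano_rhs(pe, M):
    """Right-hand side of Fano's inequality (1): H(P_e) + P_e log(M - 1)."""
    return binary_entropy(pe) + pe * log2(M - 1)

def codebook_size_bound(sup_I, eps):
    """Upper bound (3) on log M: (sup_X I + H(eps)) / (1 - eps)."""
    return (sup_I + binary_entropy(eps)) / (1 - eps)

# Illustrative numbers only: M = 2**10 messages, P_e = 0.05, sup_X I = 8 bits.
print(fano_rhs(0.05, 2**10))           # upper bound on the equivocation H(X|Y)
print(codebook_size_bound(8.0, 0.05))  # upper bound on log2 of the codebook size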
Although these coding theorems provide fundamental limits for modern communications, research on finite blocklength coding schemes is more important in engineering applications. Given the block length, upper bounds on the achievable error probability and the achievable code size were obtained in [8]. Most importantly, a tight approximation of the maximal achievable rate for a given error probability and code length was presented there. In this paper, we consider the entropy of one random vector conditioned on another, and the corresponding probability of error in guessing one from the other, by proposing an extended Fano's inequality. The extended Fano's inequality achieves better performance by taking advantage of a more careful treatment of the error patterns. It suits coding in the finite blocklength regime better and is useful in bounding the mutual information between random vectors, as well as the codebook size given the block length and an average symbol error probability constraint. In the remainder of this paper, we first present the extended Fano's inequality in Section II. Lower bounds on the mutual information between two random vectors and an upper bound on the codebook size given the block length and error probability are given in Section III.

An application of the obtained results to the q-ary symmetric channel (QSC) is presented in Section IV, which shows that the extended Fano's inequality is tight for such channels. Finally, we conclude the paper in Section V. Throughout the paper, vectors are indicated by bold letters.

II. FANO'S INEQUALITY EXTENSION

Although Fano's inequality has been widely used over the years, it can be improved by treating the error events more carefully. In this section, a refinement of Fano's inequality is presented, which is tighter and more applicable to finite blocklength coding design.

Theorem 4 (Fano's Inequality Extension): Suppose that X = {X_1, X_2, ..., X_n} and Y = {Y_1, Y_2, ..., Y_n} are two n-dimensional random vectors where X_k and Y_k (k = 1, 2, ..., n) take values on the same finite set 𝒳 with cardinality |𝒳| = q. Then the conditional entropy satisfies

H(X|Y) ≤ H(p) + Σ_{k=1}^{n} p_k log(C_n^k (q−1)^k),  (4)

where H(p) = −Σ_{k=0}^{n} p_k log p_k is the discrete entropy function and p = {p_0, p_1, ..., p_n} is the error distribution, whose error probabilities are p_k = Pr(H_d(X,Y) = k) for k = 0, 1, ..., n. Here H_d(X,Y) is the generalized Hamming distance, defined as the number of symbols in X that differ from the corresponding symbols in Y.

Proof: Define the error random variable as E = k if H_d(X,Y) = k, for k = 0, 1, ..., n. According to the chain rule of joint entropy, H(E, X|Y) can be expanded in the following two ways:

H(E, X|Y) = H(X|Y) + H(E|X, Y)  (5a)
          = H(E|Y) + H(X|E, Y).  (5b)

In (5a), it is clear that H(E|X, Y) = 0. Then we have

H(X|Y) = H(E|Y) + H(X|E, Y) ≤(a) H(E) + H(X|E, Y),  (6)

where (a) follows from the fact that conditioning reduces entropy, i.e., H(E|Y) ≤ H(E). In particular, we have H(E) = H(p) = −Σ_{k=0}^{n} p_k log p_k. By definition,

H(X|E, Y) = Σ_{k=0}^{n} p_k H(X|E = k, Y).  (7)

When considering H(X|E = k, Y), we know that there are k disaccord symbol pairs between X and Y. For each fixed Y = y, every symbol of X that belongs to a disaccord pair has q−1 possible choices, excluding the corresponding symbol of y. Thus those symbols of X given (E = k, y) have (q−1)^k possible realizations. Besides, there are C_n^k selections of the positions of the erroneous symbols for each given k. Therefore, the total number of possible codewords X is C_n^k (q−1)^k, which means

H(X|E = k, Y) ≤ log(C_n^k (q−1)^k).  (8)

[Fig. 1. The q-ary symmetric channel.]

Note also that H(X|E = 0, Y) = 0, since there is no uncertainty in determining X from Y if they are the same. Then, (7) can be written as

H(X|E, Y) = Σ_{k=1}^{n} P(E = k) H(X|E = k, Y) ≤ Σ_{k=1}^{n} p_k log(C_n^k (q−1)^k).  (9)

Combining (6) and (9) completes the proof of the theorem.

Remark 1: In fact, the error distribution p is easy to calculate, especially for some special channels. For example, the discrete q-ary symmetric channel is shown in Fig. 1. In this case, p_k = C_n^k ((q−1)ε)^k (1−(q−1)ε)^{n−k}.

Remark 2: It is clear that Theorem 4 is a generalization of Fano's inequality. Specifically, when the block length is 1, i.e., n = 1, we have p_1 = p_e, p_0 = 1−p_e, and C_1^1 = 1. In this case, Theorem 4 reduces to

H(X|Y) ≤ H(p_e) + p_e log(q−1),  (10)

which is exactly Fano's inequality.
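To make Theorem 4 and Remarks 1-2 concrete, the following sketch (our own, with illustrative parameters not taken from the paper) evaluates the right-hand side of (4) using the QSC error distribution of Remark 1, and checks that for n = 1 it reduces to the classical bound (10); logarithms are to base 2.

from math import comb, log2

def qsc_error_distribution(n, q, eps):
    """Remark 1: p_k = C_n^k ((q-1)eps)^k (1-(q-1)eps)^(n-k), k = 0, ..., n."""
    pe = (q - 1) * eps                          # per-symbol error probability
    return [comb(n, k) * pe**k * (1 - pe)**(n - k) for k in range(n + 1)]

def extended_fano_bound(p, q):
    """Right-hand side of (4): H(p) + sum_{k>=1} p_k log(C_n^k (q-1)^k)."""
    n = len(p) - 1
    Hp = -sum(pk * log2(pk) for pk in p if pk > 0)
    return Hp + sum(p[k] * log2(comb(n, k) * (q - 1)**k) for k in range(1, n + 1))

n, q, eps = 30, 7, 0.01                         # illustrative parameters
print(extended_fano_bound(qsc_error_distribution(n, q, eps), q))

# For n = 1 the bound reduces to (10): H(p_e) + p_e log(q - 1).
print(extended_fano_bound(qsc_error_distribution(1, q, eps), q))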
As a variant of Theorem 4, the following theorem expresses the bound on the conditional entropy in terms of a relative entropy.

Theorem 5: Suppose that X = {X_1, X_2, ..., X_n} and Y = {Y_1, Y_2, ..., Y_n} are two n-dimensional random vectors where X_k and Y_k (k = 1, 2, ..., n) take values on the same finite set 𝒳 with cardinality |𝒳| = q. Then the conditional entropy satisfies

H(X|Y) ≤ n log q − D(p‖q),  (11)

where D(p‖q) = Σ_{k=0}^{n} p_k log(p_k/q_k) is the discrete relative entropy function, the error probabilities are p_k = Pr(H_d(X,Y) = k) for k = 0, 1, ..., n with p = {p_0, p_1, ..., p_n}, and q = {q_0, q_1, ..., q_n} is the probability distribution with q_k = C_n^k (q−1)^k / q^n.

Proof: First, we know from the binomial theorem that {C_n^k a^k b^{n−k} / (a+b)^n}_{k=0}^{n} is a probability distribution. Letting a = q−1 and b = 1, we see that q is indeed a probability distribution with q_k = C_n^k (q−1)^k / q^n. According to Theorem 4,

H(X|Y) ≤(a) Σ_{k=0}^{n} p_k log( C_n^k (q−1)^k / p_k ) = n log q − D(p‖q),

where (a) holds because log(C_n^0 (q−1)^0) = 0 for k = 0, so the sum in (4) may be extended to k = 0.

Remark 3: By its definition, p reflects the error performance of the channel and is totally determined by the channel itself. In contrast, q is the distribution obtained when every error pattern is assumed to appear equiprobably; in that case the probability that there are k erroneous symbols in the codeword is q_k = C_n^k ((q−1)/q)^k (1/q)^{n−k}. Thus D(p‖q) is the distance between the actual error pattern distribution and the uniform error pattern distribution. In particular, if the channel is error free, i.e., p_0 = 1 and p_k = 0 for 1 ≤ k ≤ n, we have D(p‖q) = Σ_{k=0}^{n} p_k log(p_k/q_k) = 1 · log(1/q_0) = log q^n = n log q. According to Theorem 5, we then get H(X|Y) = 0, which is consistent with the assumption on the channel. In this sense, Theorem 5 is tight.

Fano's inequality has played an important role in the history of information theory because it builds a close connection between the conditional entropy and the error probability. The extended Fano's inequality given in Theorem 4 is especially applicable to finite blocklength coding. It also presents the relationship between the conditional entropy and the error probabilities, which are defined as follows.

Definition 1: The block error probability P_b is the average error probability of a block (codeword), i.e., P_b = Pr{X ≠ Y}. Then we have P_b = Σ_{k=1}^{n} p_k. Thus, P_b ≥ p_k for any 0 < k ≤ n.

Definition 2: The symbol error probability P_s is the average error probability of a symbol, i.e., P_s = Pr{X_k ≠ Y_k}, which can be expressed as P_s = (1/n) Σ_{k=1}^{n} k p_k.

Remark 4: In many communication systems, especially those using error-correcting channel codes, a block error does not imply a system failure. On the contrary, the error may be corrected, or part of the block can still be used with some performance degradation. In this case, the symbol error probability is more useful than the block error probability. The following corollary of our result addresses this point.

Corollary 1: Suppose that X = {X_1, X_2, ..., X_n} and Y = {Y_1, Y_2, ..., Y_n} are two n-dimensional random vectors where X_k and Y_k take values on the same finite set 𝒳 with cardinality |𝒳| = q. Then the conditional entropy satisfies

H(X|Y) ≤ n − D(p‖w) + n P_s log(q−1),  (12)

where w = {w_0, w_1, ..., w_n} is the probability distribution with w_k = C_n^k / 2^n.

Proof: According to Theorem 4, one has

H(X|Y) ≤ H(p) + Σ_{k=1}^{n} p_k log C_n^k + n P_s log(q−1)
       = −Σ_{k=0}^{n} p_k log p_k + Σ_{k=0}^{n} p_k log w_k + n + n P_s log(q−1)
       = n − D(p‖w) + n P_s log(q−1),

where the second equality uses log C_n^k = log w_k + n (logarithms to base 2) and Σ_{k=1}^{n} k p_k = n P_s.

Remark 5: The distribution w can be expressed as w_k = C_n^k (1/2)^k (1/2)^{n−k}, which is a binomial distribution with a symbol error probability of 0.5. Hence D(p‖w) is a measure of the distance between the actual error probability distribution and the binomial distribution with parameter 0.5.

Remark 6: If one takes n = 1, Corollary 1 reduces to H(X|Y) ≤ 1 + P_s log(q−1), which is a frequently used form of Fano's inequality.

Remark 7: It is seen that the extended Fano's inequality builds a natural connection between the conditional entropy and the symbol error probability and is especially applicable to finite length codings.
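As an illustration of Definitions 1-2 and Corollary 1 (our own sketch, with illustrative parameters), the block and symbol error probabilities and the right-hand side of (12) can be computed directly from the error distribution p; logarithms are to base 2, and the QSC distribution of Remark 1 is reused.

from math import comb, log2

def block_and_symbol_error(p):
    """Definition 1: P_b = sum_{k>=1} p_k.  Definition 2: P_s = (1/n) sum_k k p_k."""
    n = len(p) - 1
    Pb = sum(p[1:])
    Ps = sum(k * p[k] for k in range(1, n + 1)) / n
    return Pb, Ps

def corollary1_bound(p, q):
    """Right-hand side of (12): n - D(p || w) + n P_s log(q-1), with w_k = C_n^k / 2^n."""
    n = len(p) - 1
    w = [comb(n, k) / 2.0**n for k in range(n + 1)]
    D = sum(p[k] * log2(p[k] / w[k]) for k in range(n + 1) if p[k] > 0)
    _, Ps = block_and_symbol_error(p)
    return n - D + n * Ps * log2(q - 1)

# Illustrative QSC parameters; error distribution as in Remark 1.
n, q, eps = 30, 7, 0.01
pe = (q - 1) * eps
p = [comb(n, k) * pe**k * (1 - pe)**(n - k) for k in range(n + 1)]
print(block_and_symbol_error(p))   # P_s equals (q-1)*eps for this channel
print(corollary1_bound(p, q))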
III. CONVERSE RESULTS

A. Lower Bounds on the Mutual Information

Based on the proposed generalized Fano's inequality, the following lower bounds on the mutual information between X and Y can be obtained.

Theorem 6: Suppose that X = {X_1, X_2, ..., X_n} and Y = {Y_1, Y_2, ..., Y_n} are two n-dimensional random vectors that satisfy the following. 1) X_k and Y_k (k = 1, 2, ..., n) take values on the same finite set 𝒳 with cardinality |𝒳| = q. 2) Either X or Y is equiprobable. 3) The error probabilities are p_k = Pr(H_d(X,Y) = k) for k = 0, 1, ..., n; denote the error distribution by p = {p_0, p_1, ..., p_n}. Then the mutual information between X and Y satisfies

I(X;Y) ≥ n log q − H(p) − Σ_{k=1}^{n} p_k log(C_n^k (q−1)^k),  (13)

where H(p) = −Σ_{k=0}^{n} p_k log p_k is the entropy function.

Proof: If X is equiprobable, H(X) = log q^n = n log q. On the other hand, the mutual information is given by I(X;Y) = H(X) − H(X|Y). Together with Theorem 4,

I(X;Y) ≥ n log q − H(p) − Σ_{k=1}^{n} p_k log(C_n^k (q−1)^k).  (14)

Note that X and Y are completely symmetric in (14). Therefore, if Y is assumed to be equiprobable at the beginning of the proof, one gets the same result. Thus Theorem 6 is proved.

By using Theorem 5, the mutual information between X and Y can be bounded as in the following corollary.

Corollary 2: Suppose that X = {X_1, X_2, ..., X_n} and Y = {Y_1, Y_2, ..., Y_n} are two n-dimensional random vectors that satisfy the following. 1) X_k and Y_k (k = 1, 2, ..., n) take values on the same finite set 𝒳 with cardinality |𝒳| = q. 2) Either X or Y is equiprobable. 3) The error probabilities are p_k = Pr(H_d(X,Y) = k) and p = {p_0, p_1, ..., p_n}. Then the mutual information between X and Y satisfies

I(X;Y) ≥ D(p‖q),  (15)

where D(p‖q) = Σ_{k=0}^{n} p_k log(p_k/q_k) is the discrete relative entropy function and q = {q_0, q_1, ..., q_n} is the probability distribution with q_k = C_n^k (q−1)^k / q^n.

The distribution q means that each symbol of Y takes any value of the output alphabet with equal probability, regardless of what is sent in X; it is thus the purely random distribution arising when X and Y are independent of each other. The most desirable coding scheme is one whose error distribution p is farthest from q, which also ensures a larger coding rate.
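As a concrete check of Theorem 6 and Corollary 2 (our own sketch, with illustrative parameters), the explicit bound (13) and the relative-entropy form (15) can both be evaluated; they coincide, as the proof of Theorem 5 suggests.

from math import comb, log2

def mi_bound_explicit(p, q):
    """(13): n log q - H(p) - sum_{k>=1} p_k log(C_n^k (q-1)^k)."""
    n = len(p) - 1
    Hp = -sum(pk * log2(pk) for pk in p if pk > 0)
    tail = sum(p[k] * log2(comb(n, k) * (q - 1)**k) for k in range(1, n + 1))
    return n * log2(q) - Hp - tail

def mi_bound_divergence(p, q):
    """(15): D(p || q_dist) with q_k = C_n^k (q-1)^k / q^n."""
    n = len(p) - 1
    qdist = [comb(n, k) * (q - 1)**k / q**n for k in range(n + 1)]
    return sum(p[k] * log2(p[k] / qdist[k]) for k in range(n + 1) if p[k] > 0)

n, q, eps = 30, 7, 0.01                  # illustrative QSC parameters
pe = (q - 1) * eps
p = [comb(n, k) * pe**k * (1 - pe)**(n - k) for k in range(n + 1)]
print(mi_bound_explicit(p, q))           # the two values agree
print(mi_bound_divergence(p, q))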

B. Upper Bounds on the Codebook Size

Suppose 𝒳 is a finite alphabet with cardinality |𝒳| = q. Consider the input and output alphabets A_n = B_n = 𝒳^n and a channel given by a sequence of conditional probabilities [5] {P_{Y|X}: A_n → B_n}. We denote a codebook with M codewords by (X_1, X_2, ..., X_M) ⊂ A_n. A decoder is a random transformation P_{Z|Y}: B_n → {0, 1, 2, ..., M}, where 0 indicates that the decoder declares an error. If the messages are equiprobable, the average error probability is defined as P_b = 1 − (1/M) Σ_{m=1}^{M} P_{Z|X}(m|X_m). A codebook with M codewords together with a decoder whose average probability of error is smaller than ǫ is called an (n, M, ǫ)-code. An upper bound on the size of a code as a function of the average symbol error probability follows from Corollary 1.

Theorem 7: Every (n, M, ǫ)-code for a random transformation P_{Z|Y} satisfies

log M ≤ sup_X I(X;Y) − D(p‖w) + n(1 + P_s log(q−1)),  (16)

where p = {p_0, p_1, ..., p_n} is the error distribution with p_k = Pr(H_d(X,Y) = k) for k = 0, 1, ..., n, and w = {w_0, w_1, ..., w_n} is the probability distribution with w_k = C_n^k / 2^n.

Proof: Since the messages are equiprobable, we have H(X) = log M. According to Corollary 1,

I(X;Y) = H(X) − H(X|Y) ≥ log M − n + D(p‖w) − n P_s log(q−1).  (17)

Solving (17) for log M, one gets (16), which completes the proof.

IV. APPLICATION TO CHANNEL CODING

Consider information transmission over a memoryless discrete q-ary symmetric channel with crossover probability ε, as shown in Fig. 1, using an (n, M, ǫ) channel code. In this case, the probability of symbol error is p_e = (q−1)ε. Then the error probabilities are

p_k = Pr{H_d(X,Y) = k} = C_n^k p_e^k (1−p_e)^{n−k},  (18)

and the block error probability is

P_b = 1 − p_0 = 1 − (1−p_e)^n.  (19)

Using the extended Fano's inequality in Theorem 4, the resulting upper bound on the conditional entropy is

H_e(X|Y) = H(p) + Σ_{k=1}^{n} p_k log(C_n^k (q−1)^k).  (20)

It is easy to see that the exact conditional entropy is

H(X|Y) = n H(1−p_e, ε, ..., ε) = −n[(1−p_e) log(1−p_e) + (q−1)ε log ε].  (21)

By Corollary 2, the mutual information is lower bounded by

I_e(X;Y) = Σ_{k=0}^{n} p_k log( C_n^k p_e^k (1−p_e)^{n−k} / (C_n^k (q−1)^k / q^n) )
         = n log q + Σ_{k=0}^{n} p_k [k log p_e + (n−k) log(1−p_e) − k log(q−1)],  (22)

while the capacity of the memoryless QSC is given by

I = log q − H(1−p_e, ε, ..., ε) = log q + (1−p_e) log(1−p_e) + (q−1)ε log ε.  (23)

The relative entropy D(p‖w) can be derived as

D(p‖w) = Σ_{k=0}^{n} p_k log( C_n^k p_e^k (1−p_e)^{n−k} / (C_n^k / 2^n) )
       = n + Σ_{k=0}^{n} p_k [k log p_e + (n−k) log(1−p_e)].  (24)

For a given average symbol error probability constraint P_s = ǫ, the upper bound on the maximum codebook size given by Theorem 7 is

log M_e ≤ I(X;Y) − D(p‖w) + n(1 + P_s log(q−1))
        = n log q − n H(1−p_e, ε, ..., ε) − Σ_{k=0}^{n} p_k [k log p_e + (n−k) log(1−p_e)] + nǫ log(q−1).  (25)

On the other hand, by Fano's inequality the corresponding upper bound is

H_f(X|Y) = H(P_b) + P_b log(q^n − 1),  (26)

with P_b given by (19). Then the lower bound on the mutual information is

I_f(X;Y) = (1−P_b) log q^n − H(P_b).  (27)

Finally, the upper bound on the codebook size is

log M_f ≤ (nI + H(ǫ)) / (1−ǫ).  (28)

Suppose the QSC parameters are ε = 0.01 and q = 7. We calculate the bounds on the conditional entropy, the mutual information and the codebook size using both our proposed results and Fano's inequality.
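The conditional-entropy comparison shown in Fig. 2 below can be reproduced numerically; here is a short sketch of ours (treating ε = 0.01 and q = 7 as assumed parameters) that evaluates H_e from (20), H_f from (26), and the exact H(X|Y) from (21) over a range of block lengths, with logarithms to base 2.

from math import comb, log2

q, eps = 7, 0.01                   # assumed QSC parameters
pe = (q - 1) * eps                 # per-symbol error probability

def H(probs):
    """Discrete entropy in bits, ignoring zero-probability terms."""
    return -sum(p * log2(p) for p in probs if p > 0)

for n in range(10, 36, 5):
    p = [comb(n, k) * pe**k * (1 - pe)**(n - k) for k in range(n + 1)]    # (18)
    Pb = 1 - (1 - pe)**n                                                  # (19)
    He = H(p) + sum(p[k] * log2(comb(n, k) * (q - 1)**k)
                    for k in range(1, n + 1))                             # (20), Theorem 4
    Hf = H([Pb, 1 - Pb]) + Pb * log2(q**n - 1)                            # (26), classical Fano
    Hexact = n * H([1 - pe] + [eps] * (q - 1))                            # (21), exact value
    print(n, round(He, 3), round(Hf, 3), round(Hexact, 3))                # He matches Hexact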
First, the upper bound on the conditional entropy is presented in Fig. 2. Specifically, H_e(X|Y) is obtained from the extended Fano's inequality (20), H_f(X|Y) is calculated according to Fano's inequality (26), and H(X|Y) is the exact conditional entropy (21). It is clear that Theorem 4 is tighter than Fano's inequality. In particular, we have H_e(X|Y) = H(X|Y) for the QSC. This is because the error distribution is the same for every Y = y, so H(E) = H(E|Y) holds. Moreover, for a given k the error pattern is uniformly distributed, regardless of y, so H(X|E = k, Y) = H(X|E = k) = log(C_n^k (q−1)^k) holds. Therefore, the upper bound is tight. For Fano's inequality, in contrast, there are relaxations both from H(E|Y) to H(E) and from H(X|E, Y) to P_b log(M−1).
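The mutual-information and codebook-size comparisons discussed next (Figs. 3 and 4) can be evaluated in the same way; the sketch below (ours, same assumed parameters) computes I_e from (22), I_f from (27), and the codebook-size bounds log M_e from (25) and log M_f from (28), using ǫ = P_s/2 and ǫ = P_b/2 respectively, as in the comparison described below.

from math import comb, log2

q, eps = 7, 0.01                                  # assumed QSC parameters
pe = (q - 1) * eps

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

I_sym = log2(q) - H([1 - pe] + [eps] * (q - 1))              # (23), capacity per channel use

for n in range(10, 36, 5):
    p = [comb(n, k) * pe**k * (1 - pe)**(n - k) for k in range(n + 1)]
    Pb = 1 - (1 - pe)**n
    Ps = sum(k * p[k] for k in range(1, n + 1)) / n          # equals pe for the QSC
    Ie = n * log2(q) + sum(p[k] * (k * log2(eps) + (n - k) * log2(1 - pe))
                           for k in range(n + 1))            # (22), coincides with n * I_sym
    If = (1 - Pb) * n * log2(q) - H([Pb, 1 - Pb])            # (27)
    Dw = n + sum(p[k] * (k * log2(pe) + (n - k) * log2(1 - pe))
                 for k in range(n + 1))                      # (24)
    logMe = n * I_sym - Dw + n * (1 + (Ps / 2) * log2(q - 1))       # (25) with eps = P_s/2
    logMf = (n * I_sym + H([Pb / 2, 1 - Pb / 2])) / (1 - Pb / 2)    # (28) with eps = P_b/2
    print(n, round(Ie, 2), round(If, 2), round(n * I_sym, 2),
          round(logMe, 2), round(logMf, 2))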

[Fig. 2. The upper bound on the conditional entropy, q = 7, ε = 0.01.]

[Fig. 3. Bounds on the mutual information and the codebook size vs. the block length, q = 7, ε = 0.01.]

Similarly, the lower bound on the mutual information I_e(X;Y) given by (22) coincides with the exact I(X;Y) given by (23) and is better than the bound given by Fano's inequality (27). When using the upper bound on the codebook size in Theorem 7, it should be noted that it is presented as a function of the symbol error probability P_s, whereas P_b is always larger than P_s. Therefore, we use the same fraction of each in the calculation of the bounds to make the comparison meaningful, i.e., ǫ = P_s/2 for (25) and ǫ = P_b/2 for (28). It is also seen from Fig. 3 that our newly developed result is tighter.

The performance of Theorem 6 and Theorem 7 versus the QSC parameter ε is shown in Fig. 4, where the block length is chosen as n = 30. As shown, the mutual information bound is tight and our results are much better than Fano's inequality. In the calculation of the upper bounds on the codebook size, the error probability constraints are likewise chosen as ǫ_e = P_s(ε)/2 and ǫ_f = P_b(ε)/2 so that they are comparable.

[Fig. 4. Bounds on the mutual information and the codebook size vs. the QSC parameter ε, q = 7, n = 30.]

V. CONCLUSION

In this paper, we revisited Fano's inequality and extended it to a more general form. In particular, the relation between the conditional entropy and the error probability of two random vectors was considered, rather than that between two random variables. This makes the developed results more suitable for source/channel coding in the finite blocklength regime. By investigating the block error pattern in more detail, the conditional entropy of the original random vector given the received one is upper bounded more tightly by the extended Fano's inequality. Furthermore, the extended Fano's inequality is completely tight for some symmetric channels such as the q-ary symmetric channels. Converse results are also presented in terms of lower bounds on the mutual information and an upper bound on the codebook size under block length and symbol error constraints, which likewise perform better.

ACKNOWLEDGEMENT

This work was partially supported by an INC research grant of the Chinese University of Hong Kong and the China Major State Basic Research Development Program (973 Program) No. 2012CB316100(2).

REFERENCES

[1] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, pp. 623-656, Oct. 1948.
[2] R. M. Fano, Class Notes for Transmission of Information, Course 6.574, Cambridge, MA: MIT, 1952.
[3] R. W. Yeung, Information Theory and Network Coding, Berlin/New York: Springer, 2008.
[4] T. S. Han and S. Verdú, "Generalizing the Fano inequality," IEEE Trans. Inform. Theory, vol. 40, no. 4, pp. 1247-1251, Jul. 1994.
[5] S. W. Ho and S. Verdú, "On the interplay between conditional entropy and error probability," IEEE Trans. Inform. Theory, vol. 56, no. 12, pp. 5930-5942, Dec. 2010.
[6] D. L. Tebbe and S. J. Dwyer, III, "Uncertainty and probability of error," IEEE Trans. Inform. Theory, vol. 14, no. 3, pp. 516-518, May 1968.
[7] M. Feder and N. Merhav, "Relations between entropy and error probability," IEEE Trans. Inform. Theory, vol. 40, no. 1, pp. 259-266, Jan. 1994.
[8] Y. Polyanskiy, H. V. Poor, and S. Verdú, "Channel coding rate in the finite blocklength regime," IEEE Trans. Inform. Theory, vol. 56, no. 5, pp. 2307-2359, May 2010.