IOSR Journal of Mathematics (IOSR-JM), e-ISSN: 2278-5728, p-ISSN: 2319-765X, Volume 9, Issue 6 (Jan. 2014), PP 119-123

Some Coding Theorems on Fuzzy Entropy Function Depending Upon Parameter R and Ѵ

M.A.K. Baig and Mohd Javid Dar
P.G. Department of Statistics, University of Kashmir, Hazratbal, Srinagar-190006 (INDIA)

Abstract: The literature of information theory contains several types of coding theorems involving fuzzy entropy functions. In this paper, some new fuzzy coding theorems involving utilities are obtained. The fuzzy coding theorems obtained here are not only new but also generalize some well-known results available in the literature.

Key Words: Fuzzy set, fuzzy entropy function, useful fuzzy entropy function, fuzzy codeword length, average useful fuzzy codeword length.

AMS Subject Classification: 60E15, 62N05, 90B25, 94A17, 94A24.

I. Introduction

The notion of fuzzy sets was proposed by Zadeh [1965] with a view to tackling problems in which indefiniteness arising from a sort of intrinsic ambiguity plays a significant role. Fuzziness, a feature of uncertainty, results from the lack of a sharp boundary of a set: an individual is neither definitely a member of the set nor definitely not a member of it. The first attempt to quantify fuzziness was made by Zadeh [1968], who, within a probabilistic framework, introduced the entropy of a fuzzy event as a weighted Shannon entropy combining probability and membership function.

Definition 1: Let $X = \{x_1, x_2, \ldots, x_n\}$ be a discrete universe of discourse. A fuzzy set $A$ on $X$ is defined by a characteristic (membership) function $\mu_A : X \to [0,1]$. The value $\mu_A(x_i)$ stands for the degree of membership of $x_i$ in $A$. If $\mu_A(x_i)$ is either 0 or 1 for every element of the set, there is no uncertainty about it, and the set is said to be a crisp set.

Definition 2: A fuzzy set $A^*$ is called a sharpened version of the fuzzy set $A$ if the following conditions are satisfied:
$\mu_{A^*}(x_i) \le \mu_A(x_i)$ whenever $\mu_A(x_i) \le 0.5$, and
$\mu_{A^*}(x_i) \ge \mu_A(x_i)$ whenever $\mu_A(x_i) \ge 0.5$.
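Definitions 1 and 2 can be illustrated with a small check. The helper names `is_crisp` and `is_sharpened` are illustrative, not from the paper:

```python
def is_crisp(mu):
    """A fuzzy set is crisp when every membership value is 0 or 1 (Definition 1)."""
    return all(m in (0.0, 1.0) for m in mu)

def is_sharpened(mu_star, mu):
    """Definition 2: A* is a sharpened version of A when
    mu*(x) <= mu(x) wherever mu(x) <= 0.5, and
    mu*(x) >= mu(x) wherever mu(x) >= 0.5."""
    return all(
        (m > 0.5 or ms <= m) and (m < 0.5 or ms >= m)
        for ms, m in zip(mu_star, mu)
    )

print(is_crisp([0.0, 1.0, 1.0]))             # True
print(is_sharpened([0.1, 0.9], [0.3, 0.6]))  # True: each value pushed toward 0 or 1
print(is_sharpened([0.4, 0.5], [0.3, 0.6]))  # False: 0.4 > 0.3 although 0.3 <= 0.5
```

A sharpened version is thus "less fuzzy" element by element, which is exactly what the Resolution property of the next section requires the entropy to respect.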
De Luca and Termini [1972] formulated a set of four properties, which are widely accepted as the criteria for defining any fuzzy entropy. In fuzzy set theory, entropy is a measure of fuzziness which expresses the amount of average ambiguity in deciding whether an element belongs to a set or not. A measure of average fuzziness $H(A)$ in a fuzzy set should therefore have the following properties to be a valid fuzzy entropy:

1. (Sharpness) $H(A)$ is minimum if and only if $A$ is a crisp set, i.e., $\mu_A(x_i) = 0$ or $1$ for all $i$.
2. (Maximality) $H(A)$ is maximum if and only if $A$ is the most fuzzy set, i.e., $\mu_A(x_i) = 0.5$ for all $i$.
3. (Resolution) $H(A^*) \le H(A)$, where $A^*$ is a sharpened version of $A$.
4. (Symmetry) $H(A) = H(\bar{A})$, where $\bar{A}$ is the complement of $A$, i.e., $\mu_{\bar{A}}(x_i) = 1 - \mu_A(x_i)$.

II. Basic Concepts

Let $X$ be a discrete random variable taking on a finite number of possible values $x_1, x_2, \ldots, x_n$ with respective membership values $\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n)$, where $\mu_A(x_i)$ gives the degree of belongingness of the element $x_i$ to the set $A$. The function $\mu_A$ associates with each $x_i \in X$ a grade of membership in the set $A$ and is known as the membership function. Denote

$$\begin{pmatrix} x_1 & x_2 & \cdots & x_n \\ \mu_A(x_1) & \mu_A(x_2) & \cdots & \mu_A(x_n) \end{pmatrix}. \qquad (2.1)$$

We call the scheme (2.1) a finite fuzzy information scheme. Every finite scheme describes a state of uncertainty. De Luca and Termini [1972] introduced a quantity which, in a reasonable way, measures the amount of uncertainty (fuzzy entropy) associated with a given finite scheme. This measure is given by

$$H(A) = -\sum_{i=1}^{n}\left[\mu_A(x_i)\log\mu_A(x_i) + (1-\mu_A(x_i))\log(1-\mu_A(x_i))\right]. \qquad (2.2)$$

The measure (2.2) serves as a very suitable measure of fuzzy entropy of the finite fuzzy information scheme.
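The De Luca–Termini measure can be sketched numerically as below; the helper name `fuzzy_entropy` and the choice of base-2 logarithm are illustrative assumptions:

```python
import math

def fuzzy_entropy(mu, base=2):
    """De Luca-Termini fuzzy entropy of a membership vector mu.

    Each term mu*log(mu) + (1-mu)*log(1-mu) is the binary Shannon entropy
    of the membership/non-membership pair; terms with mu in {0, 1}
    contribute nothing (crisp elements carry no fuzziness).
    """
    h = 0.0
    for m in mu:
        for p in (m, 1.0 - m):
            if p > 0.0:
                h -= p * math.log(p, base)
    return h

# Sharpness: a crisp set has zero entropy.
print(fuzzy_entropy([0.0, 1.0, 1.0]))                            # 0.0
# Maximality: mu = 0.5 everywhere maximizes the measure (1 bit per element).
print(fuzzy_entropy([0.5, 0.5, 0.5]))                            # 3.0
# Symmetry: H(A) equals H of the complement of A.
print(math.isclose(fuzzy_entropy([0.2, 0.7]),
                   fuzzy_entropy([0.8, 0.3])))                   # True
```

Resolution can be checked the same way: sharpening [0.3, 0.6] to [0.1, 0.9] strictly decreases the value returned.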

Let a finite source of $n$ source symbols be encoded using an alphabet of $D$ symbols. Feinstein [1958] showed that there is a uniquely decipherable/instantaneous code with lengths $l_1, l_2, \ldots, l_n$ if and only if the Kraft [1949] inequality

$$\sum_{i=1}^{n} D^{-l_i} \le 1 \qquad (2.3)$$

is satisfied. Belis and Guiasu [1968] observed that a source is not completely specified by the probability distribution $P$ over the source alphabet in the absence of its qualitative character. So it can be assumed (Belis and Guiasu [1968]) that the source alphabet letters are assigned weights according to their importance or utility in the view of the experimenter. Let $U = (u_1, u_2, \ldots, u_n)$ be a set of positive real numbers, where $u_i$ is the utility or importance of the source letter $x_i$. The utility $u_i$ is, in general, independent of the probability $p_i$ of encoding of the source symbol $x_i$. The information source is thus given by

$$\begin{pmatrix} x_1 & x_2 & \cdots & x_n \\ p_1 & p_2 & \cdots & p_n \\ u_1 & u_2 & \cdots & u_n \end{pmatrix}. \qquad (2.4)$$

Belis and Guiasu [1968] introduced the following quantitative-qualitative measure of information,

$$H(P;U) = -\sum_{i=1}^{n} u_i\, p_i \log p_i, \qquad (2.5)$$

which is a measure of the average quantity of valuable or useful information provided by the information source. Guiasu and Picard [1971] considered the problem of encoding the letters output by the source by means of a single-letter prefix code whose codewords have lengths $l_1, l_2, \ldots, l_n$ and satisfy Kraft's inequality (2.3); they introduced the following useful mean length of the code,

$$L(U) = \frac{\sum_{i=1}^{n} u_i\, p_i\, l_i}{\sum_{i=1}^{n} u_i\, p_i}, \qquad (2.6)$$

and further derived a lower bound for $L(U)$. Now, corresponding to $H(P;U)$ and $L(U)$, we have the following fuzzy measures,

$$H(A;U) = -\frac{\sum_{i=1}^{n} u_i\left[\mu_A(x_i)\log\mu_A(x_i) + (1-\mu_A(x_i))\log(1-\mu_A(x_i))\right]}{\sum_{i=1}^{n} u_i} \qquad (2.7)$$

and

$$L(A;U) = \frac{\sum_{i=1}^{n} u_i\, l_i}{\sum_{i=1}^{n} u_i}, \qquad (2.8)$$

respectively. In the next section, fuzzy coding theorems are obtained by considering a new parametric fuzzy entropy function involving utilities together with a generalized useful fuzzy codeword mean length. The results obtained here are not only new but also generalize some well-known results available in the literature of information theory.

III. Fuzzy Coding Theorems on a New Fuzzy Parametric Entropy Function

Consider the function

$$H_Ѵ^R(A;U) = \frac{R}{R-Ѵ}\left[1-\left(\frac{\sum_{i=1}^{n} u_i\left\{\mu_A(x_i)^{R/Ѵ}+(1-\mu_A(x_i))^{R/Ѵ}\right\}}{\sum_{i=1}^{n} u_i}\right)^{Ѵ/R}\right], \qquad (3.1)$$

where $R > 0$, $Ѵ \ge 1$ and $R \ne Ѵ$.

Remark 3.1:
1) When $Ѵ = 1$, $H_Ѵ^R(A;U)$ reduces to the useful R-norm fuzzy information measure corresponding to Singh, Kumar and Tuteja [2003].
2) When $Ѵ = 1$ and $u_i = 1$ for each $i$, $H_Ѵ^R(A;U)$ reduces to the R-norm fuzzy information measure corresponding to Boekee and Van der Lubbe [1980].
3) When $Ѵ = 1$, $u_i = 1$ for each $i$ and $R \to 1$, $H_Ѵ^R(A;U)$ reduces to the De Luca and Termini [1972] measure of fuzzy entropy, corresponding to the Shannon [1948] measure of entropy.

Further, consider the generalized useful codeword mean length

$$L_Ѵ^R(U) = \frac{R}{R-Ѵ}\left[1-\frac{\sum_{i=1}^{n} u_i\left\{\mu_A(x_i)^{R/Ѵ}+(1-\mu_A(x_i))^{R/Ѵ}\right\}^{Ѵ/R} D^{-l_i(R-Ѵ)/R}}{\sum_{i=1}^{n} u_i}\right], \qquad (3.2)$$

where $R > 0$, $Ѵ \ge 1$, $R \ne Ѵ$, and $D \ge 2$ is the size of the code alphabet.
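The measures (3.1) and (3.2) can be sketched numerically as below. The printed formulas are garbled in the source scan, so the exact forms used here follow an R-norm-style reading of order $R$ and type $Ѵ$ and should be taken as an assumption of this rewrite, as should the helper names:

```python
import math

def entropy_RV(mu, u, R, V):
    """Reconstructed (3.1):
    H = R/(R-V) * [1 - (sum u_i {mu_i^(R/V) + (1-mu_i)^(R/V)} / sum u_i)^(V/R)]."""
    s = sum(ui * (m ** (R / V) + (1 - m) ** (R / V)) for m, ui in zip(mu, u))
    return R / (R - V) * (1 - (s / sum(u)) ** (V / R))

def mean_length_RV(mu, u, lengths, R, V, D=2):
    """Reconstructed (3.2): each D^(-l_i (R-V)/R) term is weighted by the
    utility u_i and the {.}^(V/R) fuzzy norm of the i-th element."""
    s = sum(
        ui * (m ** (R / V) + (1 - m) ** (R / V)) ** (V / R) * D ** (-li * (R - V) / R)
        for m, ui, li in zip(mu, u, lengths)
    )
    return R / (R - V) * (1 - s / sum(u))

# Remark 3.1(3): with u_i = 1 and V = 1, letting R -> 1 recovers a
# De Luca-Termini-type entropy (natural log, averaged over the n elements).
mu = [0.2, 0.6, 0.9]
dlt = -sum(m * math.log(m) + (1 - m) * math.log(1 - m) for m in mu) / len(mu)
print(abs(entropy_RV(mu, [1.0] * 3, 1.001, 1.0) - dlt) < 0.01)  # True
```

The limit check is the useful sanity test here: it ties the two-parameter form back to the classical measure of Section II without needing the exact printed constants.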

Remark 3.2:
1) When $Ѵ = 1$, (3.2) reduces to the useful fuzzy codeword mean length corresponding to Singh, Kumar and Tuteja [2003].
2) When $Ѵ = 1$ and $u_i = 1$ for each $i$, (3.2) reduces to the fuzzy codeword mean length corresponding to Boekee and Van der Lubbe [1980].
3) When $Ѵ = 1$, $u_i = 1$ for each $i$ and $R \to 1$, (3.2) reduces to the fuzzy optimal codeword mean length corresponding to Shannon [1948].

We now establish a result that, in a sense, gives a characterization of $H_Ѵ^R(A;U)$ under the condition

$$\sum_{i=1}^{n} u_i\, D^{-l_i} \le \sum_{i=1}^{n} u_i. \qquad (3.3)$$

Remark 3.3: When $u_i = 1$ for each $i$, (3.3) reduces to $\sum_{i=1}^{n} D^{-l_i} \le n$; thus (3.3) is a generalization of (2.3), which is Kraft's [1949] inequality.

Theorem 3.1: For every code whose lengths $l_1, l_2, \ldots, l_n$ satisfy (3.3), the average useful fuzzy codeword length satisfies

$$L_Ѵ^R(U) \ge H_Ѵ^R(A;U). \qquad (3.4)$$

Equality holds if and only if

$$l_i = -\log_D\left(\frac{\mu_A(x_i)^{R/Ѵ}+(1-\mu_A(x_i))^{R/Ѵ}}{\sum_{j=1}^{n} u_j\left\{\mu_A(x_j)^{R/Ѵ}+(1-\mu_A(x_j))^{R/Ѵ}\right\}\Big/\sum_{j=1}^{n} u_j}\right). \qquad (3.5)$$

Proof: By Hölder's inequality [1967],

$$\sum_{i=1}^{n} x_i\, y_i \le \left(\sum_{i=1}^{n} x_i^{p}\right)^{1/p}\left(\sum_{i=1}^{n} y_i^{q}\right)^{1/q} \qquad (3.6)$$

for all $x_i \ge 0$, $y_i \ge 0$ and $\frac{1}{p}+\frac{1}{q} = 1$ with $p > 1$; for $p < 1$ ($p \ne 0$) the inequality is reversed. Equality holds if and only if there exists a positive constant $c$ such that $x_i^{p} = c\, y_i^{q}$ for all $i$. Let $R > Ѵ$. Setting

$$p = \frac{R}{Ѵ}, \quad q = \frac{R}{R-Ѵ}, \quad x_i = \left(\frac{u_i}{\sum_j u_j}\right)^{Ѵ/R}\left\{\mu_A(x_i)^{R/Ѵ}+(1-\mu_A(x_i))^{R/Ѵ}\right\}^{Ѵ/R}, \quad y_i = \left(\frac{u_i}{\sum_j u_j}\right)^{(R-Ѵ)/R} D^{-l_i(R-Ѵ)/R} \qquad (3.7)$$

in (3.6) and using (3.3), we get

$$\frac{\sum_{i} u_i\left\{\mu_A(x_i)^{R/Ѵ}+(1-\mu_A(x_i))^{R/Ѵ}\right\}^{Ѵ/R} D^{-l_i(R-Ѵ)/R}}{\sum_{i} u_i} \le \left(\frac{\sum_{i} u_i\left\{\mu_A(x_i)^{R/Ѵ}+(1-\mu_A(x_i))^{R/Ѵ}\right\}}{\sum_{i} u_i}\right)^{Ѵ/R}. \qquad (3.8)$$

Subtracting each side of (3.8) from 1 and multiplying by $\frac{R}{R-Ѵ} > 0$, we obtain (3.4). For $R < Ѵ$, the inequality (3.6) is reversed and the factor $\frac{R}{R-Ѵ}$ is negative, so (3.4) can be obtained in a similar fashion.

Theorem 3.2: For every finite fuzzy information scheme, codeword lengths $l_1, l_2, \ldots, l_n$ satisfying (3.3) can be chosen so that $L_Ѵ^R(U)$ satisfies the inequality

$$H_Ѵ^R(A;U) \le L_Ѵ^R(U) < D^{-(R-Ѵ)/R}\, H_Ѵ^R(A;U) + \frac{R}{R-Ѵ}\left(1 - D^{-(R-Ѵ)/R}\right). \qquad (3.9)$$

Proof: Let $l_i$ be the unique integer satisfying the inequality

$$-\log_D\left(\frac{M_i}{\bar{M}}\right) \le l_i < -\log_D\left(\frac{M_i}{\bar{M}}\right) + 1, \qquad (3.12)$$

where, for brevity, $M_i = \mu_A(x_i)^{R/Ѵ}+(1-\mu_A(x_i))^{R/Ѵ}$ and $\bar{M} = \sum_j u_j M_j \big/ \sum_j u_j$. Consider the intervals

$$\delta_i = \left[-\log_D\left(\frac{M_i}{\bar{M}}\right),\; -\log_D\left(\frac{M_i}{\bar{M}}\right) + 1\right)$$

of length 1. In every $\delta_i$ there lies exactly one integer $l_i$, namely the one satisfying (3.12). We first show that the sequence $\{l_i\}$ thus defined satisfies (3.3). The first inequality in (3.12) gives

$$D^{-l_i} \le \frac{M_i}{\bar{M}}.$$

Multiplying both sides by $u_i$ and summing over $i$, we get $\sum_i u_i D^{-l_i} \le \sum_i u_i M_i/\bar{M} = \sum_i u_i$, which is (3.3). The last inequality in (3.12) gives

$$l_i < -\log_D\left(\frac{M_i}{\bar{M}}\right) + 1,$$

or

$$D^{-l_i} > \frac{1}{D}\cdot\frac{M_i}{\bar{M}}.$$

Let $R > Ѵ$. Raising both sides to the power $\frac{R-Ѵ}{R}$, multiplying by $u_i M_i^{Ѵ/R}\big/\sum_j u_j$, summing over $i$ and simplifying, we obtain the right-hand inequality of (3.9); the left-hand inequality is Theorem 3.1. For $R < Ѵ$, the proof follows along similar lines.
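Both theorems can be sanity-checked numerically. As before, the forms of (3.1)-(3.3) used below are an R-norm-style reconstruction assumed in this rewrite, not the authors' verbatim formulas, and all function names are illustrative:

```python
import math

def H_RV(mu, u, R, V):
    # Reconstructed entropy (3.1).
    s = sum(ui * (m ** (R / V) + (1 - m) ** (R / V)) for m, ui in zip(mu, u))
    return R / (R - V) * (1 - (s / sum(u)) ** (V / R))

def L_RV(mu, u, lengths, R, V, D=2):
    # Reconstructed mean length (3.2).
    s = sum(ui * (m ** (R / V) + (1 - m) ** (R / V)) ** (V / R)
            * D ** (-li * (R - V) / R)
            for m, ui, li in zip(mu, u, lengths))
    return R / (R - V) * (1 - s / sum(u))

def code_lengths(mu, u, R, V, D=2):
    # Theorem 3.2 construction: the unique integer in each interval (3.12),
    # i.e. l_i = ceil(-log_D(M_i / Mbar)).
    M = [m ** (R / V) + (1 - m) ** (R / V) for m in mu]
    Mbar = sum(ui * Mi for ui, Mi in zip(u, M)) / sum(u)
    return [math.ceil(-math.log(Mi / Mbar, D)) for Mi in M]

mu, u, D = [0.2, 0.6, 0.9], [2.0, 1.0, 3.0], 2
for R, V in [(2.0, 1.0), (0.5, 1.0), (3.0, 1.5)]:
    ls = code_lengths(mu, u, R, V, D)
    # The constructed lengths satisfy the generalized Kraft condition (3.3) ...
    assert sum(ui * D ** -li for ui, li in zip(u, ls)) <= sum(u)
    # ... and both bounds of (3.9): H <= L < t*H + R/(R-V)*(1-t).
    H, L = H_RV(mu, u, R, V), L_RV(mu, u, ls, R, V, D)
    t = D ** (-(R - V) / R)
    assert H <= L < t * H + R / (R - V) * (1 - t)
print("all checks passed")
```

Note that, under this reading of (3.3), the constructed $l_i$ can be zero or negative for elements whose $M_i$ exceeds the weighted mean $\bar{M}$; the generalized condition (3.3) is strictly weaker than Kraft's inequality (2.3), so such lengths remain admissible for the bound even though they do not correspond to a prefix code.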

References:
[1]. Belis, M. and Guiasu, S. [1968]: A quantitative-qualitative measure of information in cybernetic systems, IEEE Transactions on Information Theory, Vol. IT-14, pp. 593-594.
[2]. Boekee, D.E. and Van der Lubbe, J.C.A. [1980]: The R-norm information measure, Information and Control, Vol. 45, pp. 136-155.
[3]. De Luca, A. and Termini, S. [1972]: A definition of a non-probabilistic entropy in the setting of fuzzy sets theory, Information and Control, Vol. 20, pp. 301-312.
[4]. Guiasu, S. and Picard, C.F. [1971]: Borne inférieure de la longueur de certains codes, C.R. Académie des Sciences, Paris, Vol. 273, pp. 248-251.
[5]. Hooda, D.S. and Ram, A. [1998]: Characterization of a non-additive useful information measure, Recent Advances in Information Theory, Statistics and Computer Applications, CCS Haryana Agricultural University, Hisar, pp. 64-77.
[6]. Kraft, L.J. [1949]: A device for quantizing, grouping and coding amplitude modulated pulses, M.S. Thesis, Department of Electrical Engineering, MIT, Cambridge.
[7]. Longo, G. [1976]: A noiseless coding theorem for sources having utilities, SIAM Journal of Applied Mathematics, Vol. 30, pp. 739-748.
[8]. Shannon, C.E. [1948]: A mathematical theory of communication, Bell System Technical Journal, Vol. 27, pp. 379-423, 623-656.
[9]. Shisha, O. [1967]: Inequalities, Academic Press, New York.
[10]. Singh, R.P., Kumar, R. and Tuteja, R.K. [2003]: Application of Hölder's inequality in information theory, Information Sciences, Vol. 152, pp. 145-154.
[11]. Zadeh, L.A. [1965]: Fuzzy sets, Information and Control, Vol. 8, pp. 338-353.
[12]. Zadeh, L.A. [1968]: Probability measures of fuzzy events, Journal of Mathematical Analysis and Applications, Vol. 23, pp. 421-427.